Problem description
I'm looking for a way to set Python's hash() salt for individual calls to the function. In the docs, I've only found PYTHONHASHSEED, which sets the salt for all calls to hash(). However, I need hash to always give me the same result when called by specific objects, but I don't want to force the entire application to use the same (predictable) salt.

Context: In Python 2, I'm using hash to sort key-value object pairs into indexed buckets. The buckets are stored persistently, and the same computation is reversed to fetch a value. Basically, for every pair I do
class PDict(object):
    def __init__(self, bucket_count, bucket_store_path):
        self._path, self.bucket_count = \
            self._fetch_or_store_metadata(bucket_store_path, bucket_count)

    def __setitem__(self, key, value):
        # Mask to 32 bits so the index fits the same range on all builds
        bucket_index = (hash(key) & 0xffffffff) % self.bucket_count
        self.buckets[bucket_index][key] = value
        self._store_bucket(bucket_index)

    def __getitem__(self, key):
        bucket_index = (hash(key) & 0xffffffff) % self.bucket_count
        return self._fetch_bucket(bucket_index)[key]
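The bucket-index computation above can be sketched in isolation. Note that for small int keys CPython's hash() is deterministic (hash(n) == n), so this example is reproducible, but for str keys the result is randomized per process; `bucket_count` and `key` here are made-up values:

```python
# Minimal sketch of the bucket-index computation used by PDict.
bucket_count = 8      # hypothetical bucket count
key = 12345           # int keys hash deterministically in CPython
bucket_index = (hash(key) & 0xffffffff) % bucket_count
```

This is exactly why the class breaks for string keys across interpreter runs: the same key can land in a different bucket once hash randomization kicks in.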
This requires hash to always give me the same result per instance, across interpreter invocations.
import hashlib

def getHash(name):
    # hashlib digests are fully deterministic, unlike the built-in hash()
    m = hashlib.md5()
    m.update(name)  # in Python 3, pass bytes: name.encode('utf-8')
    return m.hexdigest()
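Building on the hashlib idea, a per-call salt can be emulated by mixing an instance-specific salt into the digest before the key. A minimal sketch (the names `stable_hash` and `b"store-1"` are made up for illustration, not part of the original code):

```python
import hashlib

def stable_hash(key, salt=b""):
    # Deterministic 32-bit hash built on MD5: immune to hash
    # randomization and identical across interpreter invocations.
    m = hashlib.md5()
    m.update(salt)
    m.update(key.encode("utf-8"))
    return int(m.hexdigest()[:8], 16)

# A per-instance salt keeps each store's bucket layout independent
# without forcing one predictable salt on the whole application.
bucket_count = 16
idx = stable_hash("some-key", salt=b"store-1") % bucket_count
```

Each PDict instance could persist its own random salt alongside the bucket metadata, giving stable per-instance hashing without a global PYTHONHASHSEED.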