A JavaScript helper function for you
Got a function that fetches some values asynchronously from somewhere? Consider wrapping it in this to make it a super-function.
Here is what it does for every function you wrap with it. In my experience these features are very helpful, and they also give you a single place to hook into and extend later (for example, handling batching transparently):
- **Memoizes async getters**: Call the function with the same arguments and it returns the cached result instead of recomputing.
- **Handles in-flight deduping**: If multiple parts of your app call the same getter while it is still working, only one request is sent; the rest wait on the same promise.
- **Throttles concurrency**: You can limit how many calls to your getter run in parallel. Useful for APIs, disk I/O, or anything rate-sensitive.
- **Supports custom caching backends**: Pass any object with get, set, delete, and has methods. Works with a Map, an LRU, or your own cache logic.
- **Optional LRU eviction**: If you pass a plain Map, it is upgraded to an LRU with a maximum size; the least recently used items are evicted when the cache is full.
- **Handles callbacks and promises**: Wraps traditional callback-style async functions but gives you a modern promise-based interface.
- **Smart-ish keying**: Builds a cache key by stringifying the non-function arguments. Works well for most everyday use cases.
- **Supports manual eviction**: Call getter.forget(...args) to remove specific entries, or getter.force(...args) to bypass the cache for a single call.
- **Allows custom preparation logic**: You can pass a prepare() function to clone or process cached results before they are used.
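Before the full implementation below, it may help to see the first two features in isolation. This is a minimal, self-contained sketch of memoization plus in-flight dedup with the throttle, LRU, and callback machinery stripped away; `memoAsync` is a name invented for this sketch, and it assumes `fn` returns a promise:

```javascript
// Sketch only: memoize a promise-returning function and dedupe
// concurrent calls that share the same arguments.
function memoAsync(fn) {
  const cache = new Map();    // settled results, keyed by stringified args
  const inFlight = new Map(); // pending promises, for dedup
  return (...args) => {
    const key = JSON.stringify(args);
    if (cache.has(key)) return Promise.resolve(cache.get(key));
    if (inFlight.has(key)) return inFlight.get(key); // join the pending call
    const p = Promise.resolve(fn(...args))
      .then((value) => { cache.set(key, value); return value; })
      .finally(() => inFlight.delete(key)); // errors are not cached
    inFlight.set(key, p);
    return p;
  };
}
```

Note that rejected calls are deliberately not cached here, so a failed fetch can be retried; the full wrapper below makes the same choice.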
```javascript
function createGetter(fn, {
  cache = new Map(),
  cacheSize = 100, // used only if cache is a plain Map
  throttleSize = Infinity,
  prepare,
  callbackIndex,
  resolveWithFirstArgument = false
} = {}) {
  const inFlight = new Map();
  let activeCount = 0;
  const queue = [];

  // Wrap a plain Map in a simple LRU if needed
  if (cache instanceof Map) {
    const rawMap = cache;
    const lru = new Map();
    cache = {
      get(key) {
        if (!rawMap.has(key)) return undefined;
        const value = rawMap.get(key);
        lru.delete(key);
        lru.set(key, true); // mark as most recently used
        return value;
      },
      set(key, value) {
        rawMap.set(key, value);
        lru.delete(key); // re-insert so overwrites also refresh recency
        lru.set(key, true);
        if (rawMap.size > cacheSize) {
          const oldest = lru.keys().next().value;
          rawMap.delete(oldest);
          lru.delete(oldest);
        }
      },
      delete(key) {
        rawMap.delete(key);
        lru.delete(key);
      },
      has(key) {
        return rawMap.has(key);
      }
    };
  }

  function makeKey(args) {
    return JSON.stringify(args.map(arg => (typeof arg === 'function' ? 'ƒ' : arg)));
  }

  function execute(context, args, key, resolve, reject) {
    const callback = (err, result) => {
      // Capture the callback's arguments explicitly: `arguments` inside an
      // arrow function would refer to execute()'s arguments instead.
      const cbArgs = [err, result];
      if (err) {
        reject(err);
      } else {
        cache.set(key, [context, cbArgs]);
        if (prepare) prepare.call(null, context, cbArgs);
        resolve(resolveWithFirstArgument && context !== undefined ? context : result);
      }
      processNext(); // free the throttle slot on success and on error
    };
    if (callbackIndex != null) args.splice(callbackIndex, 0, callback);
    else args.push(callback);
    if (fn.apply(context, args) === false) {
      cache.delete(key); // returning false opts out of caching
    }
  }

  function processNext() {
    activeCount--;
    if (queue.length && activeCount < throttleSize) {
      activeCount++;
      execute(...queue.shift());
    }
  }

  const getter = function (...args) {
    return new Promise((resolve, reject) => {
      const context = this;
      const key = makeKey(args);
      if (cache.has(key)) {
        const [cachedContext, cachedArgs] = cache.get(key);
        if (prepare) prepare.call(null, cachedContext, cachedArgs);
        // cachedArgs is the stored [err, result] pair, so [1] is the result
        return resolve(resolveWithFirstArgument && cachedContext !== undefined ? cachedContext : cachedArgs[1]);
      }
      if (inFlight.has(key)) {
        return inFlight.get(key).then(resolve, reject);
      }
      const promise = new Promise((res, rej) => {
        if (activeCount < throttleSize) {
          activeCount++;
          execute(context, args.slice(), key, res, rej);
        } else {
          queue.push([context, args.slice(), key, res, rej]);
        }
      });
      inFlight.set(key, promise);
      // Clean up inFlight once settled; chaining then() after finally()
      // keeps the finally() branch from causing an unhandled rejection.
      promise.finally(() => inFlight.delete(key)).then(resolve, reject);
    });
  };

  getter.forget = (...args) => {
    const key = makeKey(args);
    inFlight.delete(key);
    return cache.delete(key);
  };

  getter.force = function (...args) {
    getter.forget(...args);
    return getter.apply(this, args);
  };

  return getter;
}
```
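The throttling logic above is easy to lose amid the caching code. The same counter-and-queue pattern looks roughly like this in isolation; `limitConcurrency` is a hypothetical name for this sketch, and it assumes `fn` returns a promise:

```javascript
// Sketch only: cap how many calls to fn run at once, queueing the rest.
function limitConcurrency(fn, max) {
  let active = 0;
  const queue = [];
  const finishAndDequeue = () => {
    active--;
    if (queue.length && active < max) {
      active++;
      queue.shift()(); // start the next queued call
    }
  };
  return (...args) =>
    new Promise((resolve, reject) => {
      const run = () =>
        fn(...args).then(
          (value) => { finishAndDequeue(); resolve(value); },
          (err) => { finishAndDequeue(); reject(err); }
        );
      if (active < max) {
        active++;
        run();
      } else {
        queue.push(run);
      }
    });
}
```

The full wrapper interleaves this with the cache and dedup maps, which is why `processNext()` there has to run on the error path too; otherwise a failing call would permanently consume a slot.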