Initial import with skill sheet working

commit 9050c80ab4
2024-12-04 00:11:23 +01:00
4488 changed files with 671048 additions and 0 deletions

node_modules/@seald-io/binary-search-tree/CHANGELOG.md generated vendored Normal file

@ -0,0 +1,64 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres
to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [1.0.3] - 2023-01-19
### Changed
- Correctly log object keys [#1](https://github.com/seald/node-binary-search-tree/pull/1)
## [1.0.2] - 2021-05-19
### Changed
- Specify files to be included in the published version; v1.0.1 contained a
cache, which included a now-revoked npm authentication token.
## [1.0.1] - 2021-05-19
### Added
- Added a changelog.
### Changed
- Removed `underscore` dependency which was used only in the tests.
## [1.0.0] - 2021-05-17
This version should be a drop-in replacement for `binary-search-tree@0.2.6`
provided you use modern browsers / versions of Node.js since ES6 features are
now used (such as `class` and `const` / `let`).
### Changed
- Update `homepage` & `repository` fields in the `package.json`
- New maintainer [seald](https://github.com/seald/) and new package
name [@seald-io/binary-search-tree](https://www.npmjs.com/package/@seald-io/binary-search-tree);
- Added a `package-lock.json` with `lockfileVersion: 2`;
- Modernized some of the code with ES6 features (`class`, `const` & `let`);
- Uses [`standard`](https://standardjs.com/) to lint the code (which removes all
unnecessary semicolons);
- Updated dependencies;
### Removed
- Compatibility with old browsers and old versions of Node.js that don't
support ES6 features.
### Security
- This version no longer uses
[a vulnerable version of `underscore`](https://github.com/advisories/GHSA-cf4h-3jhx-xvhq).
## [0.2.6] - 2016-02-28
See [original repo](https://github.com/louischatriot/node-binary-search-tree)

node_modules/@seald-io/binary-search-tree/LICENSE.md generated vendored Normal file

@ -0,0 +1,19 @@
Copyright (c) 2013 Louis Chatriot <louis.chatriot@gmail.com>
Copyright (c) 2021 Seald [contact@seald.io](mailto:contact@seald.io);
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including without
limitation the rights to use, copy, modify, merge, publish, distribute,
sublicense, and/or sell copies of the Software, and to permit persons to whom
the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

node_modules/@seald-io/binary-search-tree/README.md generated vendored Normal file

@ -0,0 +1,130 @@
# Binary search trees for Node.js
This module is a fork
of [node-binary-search-tree](https://github.com/louischatriot/node-binary-search-tree)
written by Louis Chatriot for storing indexes
in [nedb](https://github.com/louischatriot/nedb).
Since the original maintainer doesn't support these packages anymore, we forked
them (here is [nedb](https://github.com/seald/nedb)) and maintain them for the
needs of [Seald](https://www.seald.io).
Two implementations of binary search
tree: [basic](http://en.wikipedia.org/wiki/Binary_search_tree)
and [AVL](http://en.wikipedia.org/wiki/AVL_tree) (a kind of self-balancing
binary search tree).
## Installation and tests
Package name is `@seald-io/binary-search-tree`.
```bash
npm install @seald-io/binary-search-tree
```
If you want to run the tests, you'll have to clone the repository:
```bash
git clone https://github.com/seald/node-binary-search-tree
cd node-binary-search-tree
npm install
npm test
```
## Usage
The API mainly provides 3 functions: `insert`, `search` and `delete`. If you do
not create a unique-type binary search tree, you can store multiple pieces of
data for the same key. Doing so with a unique-type BST will result in an error
being thrown. Data is always returned as an array, and you can delete all data
relating to a given key, or just one piece of data.
Values inserted can be anything except `undefined`.
```javascript
const BinarySearchTree = require('@seald-io/binary-search-tree').BinarySearchTree
const AVLTree = require('@seald-io/binary-search-tree').AVLTree // Same API as BinarySearchTree
// Creating a binary search tree
const bst = new BinarySearchTree()
// Inserting some data
bst.insert(15, 'some data for key 15')
bst.insert(12, 'something else')
bst.insert(18, 'hello')
// You can insert multiple pieces of data for the same key
// if your tree doesn't enforce a unique constraint
bst.insert(18, 'world')
// Retrieving data (always returned as an array of all data stored for this key)
bst.search(15) // Equal to ['some data for key 15']
bst.search(18) // Equal to ['hello', 'world']
bst.search(1) // Equal to []
// Search between bounds with a MongoDB-like query
// Data is returned in key order
// Note the difference between $lt (less than) and $gte (greater than OR EQUAL)
bst.betweenBounds({ $lt: 18, $gte: 12 }) // Equal to ['something else', 'some data for key 15']
// Deleting all the data relating to a key
bst.delete(15) // bst.search(15) will now give []
bst.delete(18, 'world') // bst.search(18) will now give ['hello']
```
There are three optional parameters you can pass to the BST constructor,
allowing you to enforce a key-uniqueness constraint, use a custom function to
compare keys and use a custom function to check whether values are equal. These
parameters are all passed in an object.
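As a quick illustration, here is a minimal sketch (not part of the original README) passing all three options at once; the option names are those read by the constructor in `lib/bst.js`:
```javascript
const bst = new BinarySearchTree({
  unique: true, // reject duplicate keys with a 'uniqueViolated' error
  compareKeys: (a, b) => a - b, // numeric keys
  checkValueEquality: (a, b) => a === b // strict equality, same as the default
})
```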
### Uniqueness
```javascript
const bst = new BinarySearchTree({ unique: true })
bst.insert(10, 'hello')
bst.insert(10, 'world') // Will throw an error
```
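The error thrown on a uniqueness violation carries metadata: the implementation in `lib/bst.js` sets `errorType` to `'uniqueViolated'` and attaches the offending `key`. A small sketch of catching it:
```javascript
try {
  bst.insert(10, 'world')
} catch (err) {
  console.log(err.errorType) // 'uniqueViolated'
  console.log(err.key) // 10
}
```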
### Custom key comparison
```javascript
// Custom key comparison function
// It needs to return a negative number if a is less than b,
// a positive number if a is greater than b
// and 0 if they are equal
// If none is provided, the default one can compare numbers, dates and strings
// which are the most common use cases
const compareKeys = (a, b) => {
if (a.age < b.age) return -1
if (a.age > b.age) return 1
return 0
}
// Now we can use objects with an 'age' property as keys
const bst = new BinarySearchTree({ compareKeys })
bst.insert({ age: 23 }, 'Mark')
bst.insert({ age: 47 }, 'Franck')
```
### Custom value checking
```javascript
// Custom value equality checking function used when we try to just delete one piece of data
// Returns true if a and b are considered the same, false otherwise
// The default function is able to compare numbers and strings
const checkValueEquality = (a, b) => a.length === b.length
const bst = new BinarySearchTree({ checkValueEquality })
bst.insert(10, 'hello')
bst.insert(10, 'world')
bst.insert(10, 'howdoyoudo')
bst.delete(10, 'abcde')
bst.search(10) // Returns ['howdoyoudo']
```
## License
The package is released under the MIT License, like the original package.
See LICENSE.md.

node_modules/@seald-io/binary-search-tree/index.js generated vendored Normal file

@ -0,0 +1,2 @@
module.exports.BinarySearchTree = require('./lib/bst')
module.exports.AVLTree = require('./lib/avltree')

node_modules/@seald-io/binary-search-tree/lib/avltree.js generated vendored Normal file

@ -0,0 +1,412 @@
/**
* Self-balancing binary search tree using the AVL implementation
*/
const BinarySearchTree = require('./bst')
const customUtils = require('./customUtils')
class AVLTree {
/**
* Constructor
* We can't use a direct pointer to the root node (as in the simple binary search tree)
* as the root will change during tree rotations
* @param {Boolean} options.unique Whether to enforce a 'unique' constraint on the key or not
* @param {Function} options.compareKeys Initialize this BST's compareKeys
*/
constructor (options) {
this.tree = new _AVLTree(options)
}
checkIsAVLT () { this.tree.checkIsAVLT() }
// Insert in the internal tree, update the pointer to the root if needed
insert (key, value) {
const newTree = this.tree.insert(key, value)
// If newTree is undefined, that means its structure was not modified
if (newTree) { this.tree = newTree }
}
// Delete a value
delete (key, value) {
const newTree = this.tree.delete(key, value)
// If newTree is undefined, that means its structure was not modified
if (newTree) { this.tree = newTree }
}
}
class _AVLTree extends BinarySearchTree {
/**
* Constructor of the internal AVLTree
* @param {Object} options Optional
* @param {Boolean} options.unique Whether to enforce a 'unique' constraint on the key or not
* @param {Key} options.key Initialize this BST's key with key
* @param {Value} options.value Initialize this BST's data with [value]
* @param {Function} options.compareKeys Initialize this BST's compareKeys
*/
constructor (options) {
super()
options = options || {}
this.left = null
this.right = null
this.parent = options.parent !== undefined ? options.parent : null
if (Object.prototype.hasOwnProperty.call(options, 'key')) this.key = options.key
this.data = Object.prototype.hasOwnProperty.call(options, 'value') ? [options.value] : []
this.unique = options.unique || false
this.compareKeys = options.compareKeys || customUtils.defaultCompareKeysFunction
this.checkValueEquality = options.checkValueEquality || customUtils.defaultCheckValueEquality
}
/**
* Check the recorded height is correct for every node
* Throws if one height doesn't match
*/
checkHeightCorrect () {
if (!Object.prototype.hasOwnProperty.call(this, 'key')) { return } // Empty tree
if (this.left && this.left.height === undefined) { throw new Error('Undefined height for node ' + this.left.key) }
if (this.right && this.right.height === undefined) { throw new Error('Undefined height for node ' + this.right.key) }
if (this.height === undefined) { throw new Error('Undefined height for node ' + this.key) }
const leftH = this.left ? this.left.height : 0
const rightH = this.right ? this.right.height : 0
if (this.height !== 1 + Math.max(leftH, rightH)) { throw new Error('Height constraint failed for node ' + this.key) }
if (this.left) { this.left.checkHeightCorrect() }
if (this.right) { this.right.checkHeightCorrect() }
}
/**
* Return the balance factor
*/
balanceFactor () {
const leftH = this.left ? this.left.height : 0
const rightH = this.right ? this.right.height : 0
return leftH - rightH
}
/**
* Check that the balance factors are all between -1 and 1
*/
checkBalanceFactors () {
if (Math.abs(this.balanceFactor()) > 1) { throw new Error('Tree is unbalanced at node ' + this.key) }
if (this.left) { this.left.checkBalanceFactors() }
if (this.right) { this.right.checkBalanceFactors() }
}
/**
* When checking if the BST conditions are met, also check that the heights are correct
* and the tree is balanced
*/
checkIsAVLT () {
super.checkIsBST()
this.checkHeightCorrect()
this.checkBalanceFactors()
}
/**
* Perform a right rotation of the tree if possible
* and return the root of the resulting tree
* The resulting tree's nodes' heights are also updated
*/
rightRotation () {
const q = this
const p = this.left
if (!p) return q // No change
const b = p.right
// Alter tree structure
if (q.parent) {
p.parent = q.parent
if (q.parent.left === q) q.parent.left = p
else q.parent.right = p
} else {
p.parent = null
}
p.right = q
q.parent = p
q.left = b
if (b) { b.parent = q }
// Update heights
const ah = p.left ? p.left.height : 0
const bh = b ? b.height : 0
const ch = q.right ? q.right.height : 0
q.height = Math.max(bh, ch) + 1
p.height = Math.max(ah, q.height) + 1
return p
}
/**
* Perform a left rotation of the tree if possible
* and return the root of the resulting tree
* The resulting tree's nodes' heights are also updated
*/
leftRotation () {
const p = this
const q = this.right
if (!q) { return this } // No change
const b = q.left
// Alter tree structure
if (p.parent) {
q.parent = p.parent
if (p.parent.left === p) p.parent.left = q
else p.parent.right = q
} else {
q.parent = null
}
q.left = p
p.parent = q
p.right = b
if (b) { b.parent = p }
// Update heights
const ah = p.left ? p.left.height : 0
const bh = b ? b.height : 0
const ch = q.right ? q.right.height : 0
p.height = Math.max(ah, bh) + 1
q.height = Math.max(ch, p.height) + 1
return q
}
/**
* Modify the tree if its right subtree is too small compared to the left
* Return the new root if any
*/
rightTooSmall () {
if (this.balanceFactor() <= 1) return this // Right is not too small, don't change
if (this.left.balanceFactor() < 0) this.left.leftRotation()
return this.rightRotation()
}
/**
* Modify the tree if its left subtree is too small compared to the right
* Return the new root if any
*/
leftTooSmall () {
if (this.balanceFactor() >= -1) { return this } // Left is not too small, don't change
if (this.right.balanceFactor() > 0) this.right.rightRotation()
return this.leftRotation()
}
/**
* Rebalance the tree along the given path. The path is given reversed (as it was
* computed in the insert and delete functions).
* Returns the new root of the tree
* Of course, the first element of the path must be the root of the tree
*/
rebalanceAlongPath (path) {
let newRoot = this
let rotated
let i
if (!Object.prototype.hasOwnProperty.call(this, 'key')) {
delete this.height
return this
} // Empty tree
// Rebalance the tree and update all heights
for (i = path.length - 1; i >= 0; i -= 1) {
path[i].height = 1 + Math.max(path[i].left ? path[i].left.height : 0, path[i].right ? path[i].right.height : 0)
if (path[i].balanceFactor() > 1) {
rotated = path[i].rightTooSmall()
if (i === 0) newRoot = rotated
}
if (path[i].balanceFactor() < -1) {
rotated = path[i].leftTooSmall()
if (i === 0) newRoot = rotated
}
}
return newRoot
}
/**
* Insert a key, value pair in the tree while maintaining the AVL tree height constraint
* Return a pointer to the root node, which may have changed
*/
insert (key, value) {
const insertPath = []
let currentNode = this
// Empty tree, insert as root
if (!Object.prototype.hasOwnProperty.call(this, 'key')) {
this.key = key
this.data.push(value)
this.height = 1
return this
}
// Insert new leaf at the right place
while (true) {
// Same key: no change in the tree structure
if (currentNode.compareKeys(currentNode.key, key) === 0) {
if (currentNode.unique) {
const err = new Error(`Can't insert key ${JSON.stringify(key)}, it violates the unique constraint`)
err.key = key
err.errorType = 'uniqueViolated'
throw err
} else currentNode.data.push(value)
return this
}
insertPath.push(currentNode)
if (currentNode.compareKeys(key, currentNode.key) < 0) {
if (!currentNode.left) {
insertPath.push(currentNode.createLeftChild({ key: key, value: value }))
break
} else currentNode = currentNode.left
} else {
if (!currentNode.right) {
insertPath.push(currentNode.createRightChild({ key: key, value: value }))
break
} else currentNode = currentNode.right
}
}
return this.rebalanceAlongPath(insertPath)
}
/**
* Delete a key or just a value and return the new root of the tree
* @param {Key} key
* @param {Value} value Optional. If not set, the whole key is deleted. If set, only this value is deleted
*/
delete (key, value) {
const newData = []
let replaceWith
let currentNode = this
const deletePath = []
if (!Object.prototype.hasOwnProperty.call(this, 'key')) return this // Empty tree
// Either no match is found and the function will return from within the loop
// Or a match is found and deletePath will contain the path from the root to the node to delete after the loop
while (true) {
if (currentNode.compareKeys(key, currentNode.key) === 0) { break }
deletePath.push(currentNode)
if (currentNode.compareKeys(key, currentNode.key) < 0) {
if (currentNode.left) {
currentNode = currentNode.left
} else return this // Key not found, no modification
} else {
// currentNode.compareKeys(key, currentNode.key) is > 0
if (currentNode.right) {
currentNode = currentNode.right
} else return this // Key not found, no modification
}
}
// Delete only a value (no tree modification)
if (currentNode.data.length > 1 && value !== undefined) {
currentNode.data.forEach(function (d) {
if (!currentNode.checkValueEquality(d, value)) newData.push(d)
})
currentNode.data = newData
return this
}
// Delete a whole node
// Leaf
if (!currentNode.left && !currentNode.right) {
if (currentNode === this) { // This leaf is also the root
delete currentNode.key
currentNode.data = []
delete currentNode.height
return this
} else {
if (currentNode.parent.left === currentNode) currentNode.parent.left = null
else currentNode.parent.right = null
return this.rebalanceAlongPath(deletePath)
}
}
// Node with only one child
if (!currentNode.left || !currentNode.right) {
replaceWith = currentNode.left ? currentNode.left : currentNode.right
if (currentNode === this) { // This node is also the root
replaceWith.parent = null
return replaceWith // height of replaceWith is necessarily 1 because the tree was balanced before deletion
} else {
if (currentNode.parent.left === currentNode) {
currentNode.parent.left = replaceWith
replaceWith.parent = currentNode.parent
} else {
currentNode.parent.right = replaceWith
replaceWith.parent = currentNode.parent
}
return this.rebalanceAlongPath(deletePath)
}
}
// Node with two children
// Use the in-order predecessor (no need to randomize since we actively rebalance)
deletePath.push(currentNode)
replaceWith = currentNode.left
// Special case: the in-order predecessor is right below the node to delete
if (!replaceWith.right) {
currentNode.key = replaceWith.key
currentNode.data = replaceWith.data
currentNode.left = replaceWith.left
if (replaceWith.left) { replaceWith.left.parent = currentNode }
return this.rebalanceAlongPath(deletePath)
}
// After this loop, replaceWith is the right-most leaf in the left subtree
// and deletePath the path from the root (inclusive) to replaceWith (exclusive)
while (true) {
if (replaceWith.right) {
deletePath.push(replaceWith)
replaceWith = replaceWith.right
} else break
}
currentNode.key = replaceWith.key
currentNode.data = replaceWith.data
replaceWith.parent.right = replaceWith.left
if (replaceWith.left) replaceWith.left.parent = replaceWith.parent
return this.rebalanceAlongPath(deletePath)
}
}
/**
* Keep a pointer to the internal tree constructor for testing purposes
*/
AVLTree._AVLTree = _AVLTree;
/**
* Other functions we want to use on an AVLTree as if it were the internal _AVLTree
*/
['getNumberOfKeys', 'search', 'betweenBounds', 'prettyPrint', 'executeOnEveryNode'].forEach(function (fn) {
AVLTree.prototype[fn] = function () {
return this.tree[fn].apply(this.tree, arguments)
}
})
// Interface
module.exports = AVLTree

node_modules/@seald-io/binary-search-tree/lib/bst.js generated vendored Normal file

@ -0,0 +1,452 @@
/**
* Simple binary search tree
*/
const customUtils = require('./customUtils')
class BinarySearchTree {
/**
* Constructor
* @param {Object} options Optional
* @param {Boolean} options.unique Whether to enforce a 'unique' constraint on the key or not
* @param {Key} options.key Initialize this BST's key with key
* @param {Value} options.value Initialize this BST's data with [value]
* @param {Function} options.compareKeys Initialize this BST's compareKeys
*/
constructor (options) {
options = options || {}
this.left = null
this.right = null
this.parent = options.parent !== undefined ? options.parent : null
if (Object.prototype.hasOwnProperty.call(options, 'key')) { this.key = options.key }
this.data = Object.prototype.hasOwnProperty.call(options, 'value') ? [options.value] : []
this.unique = options.unique || false
this.compareKeys = options.compareKeys || customUtils.defaultCompareKeysFunction
this.checkValueEquality = options.checkValueEquality || customUtils.defaultCheckValueEquality
}
/**
* Get the descendant with max key
*/
getMaxKeyDescendant () {
if (this.right) return this.right.getMaxKeyDescendant()
else return this
}
/**
* Get the maximum key
*/
getMaxKey () {
return this.getMaxKeyDescendant().key
}
/**
* Get the descendant with min key
*/
getMinKeyDescendant () {
if (this.left) return this.left.getMinKeyDescendant()
else return this
}
/**
* Get the minimum key
*/
getMinKey () {
return this.getMinKeyDescendant().key
}
/**
* Check that all nodes (incl. leaves) fulfil the condition given by test
* test is a function passed every (key, data) and which throws if the condition is not met
*/
checkAllNodesFullfillCondition (test) {
if (!Object.prototype.hasOwnProperty.call(this, 'key')) return
test(this.key, this.data)
if (this.left) this.left.checkAllNodesFullfillCondition(test)
if (this.right) this.right.checkAllNodesFullfillCondition(test)
}
/**
* Check that the core BST properties on node ordering are verified
* Throw if they aren't
*/
checkNodeOrdering () {
if (!Object.prototype.hasOwnProperty.call(this, 'key')) return
if (this.left) {
this.left.checkAllNodesFullfillCondition(k => {
if (this.compareKeys(k, this.key) >= 0) throw new Error(`Tree with root ${this.key} is not a binary search tree`)
})
this.left.checkNodeOrdering()
}
if (this.right) {
this.right.checkAllNodesFullfillCondition(k => {
if (this.compareKeys(k, this.key) <= 0) throw new Error(`Tree with root ${this.key} is not a binary search tree`)
})
this.right.checkNodeOrdering()
}
}
/**
* Check that all pointers are coherent in this tree
*/
checkInternalPointers () {
if (this.left) {
if (this.left.parent !== this) throw new Error(`Parent pointer broken for key ${this.key}`)
this.left.checkInternalPointers()
}
if (this.right) {
if (this.right.parent !== this) throw new Error(`Parent pointer broken for key ${this.key}`)
this.right.checkInternalPointers()
}
}
/**
* Check that a tree is a BST as defined here (node ordering and pointer references)
*/
checkIsBST () {
this.checkNodeOrdering()
this.checkInternalPointers()
if (this.parent) throw new Error("The root shouldn't have a parent")
}
/**
* Get number of keys inserted
*/
getNumberOfKeys () {
let res
if (!Object.prototype.hasOwnProperty.call(this, 'key')) return 0
res = 1
if (this.left) res += this.left.getNumberOfKeys()
if (this.right) res += this.right.getNumberOfKeys()
return res
}
/**
* Create a BST similar (i.e. same options except for key and value) to the current one
* Use the same constructor (i.e. BinarySearchTree, AVLTree etc)
* @param {Object} options see constructor
*/
createSimilar (options) {
options = options || {}
options.unique = this.unique
options.compareKeys = this.compareKeys
options.checkValueEquality = this.checkValueEquality
return new this.constructor(options)
}
/**
* Create the left child of this BST and return it
*/
createLeftChild (options) {
const leftChild = this.createSimilar(options)
leftChild.parent = this
this.left = leftChild
return leftChild
}
/**
* Create the right child of this BST and return it
*/
createRightChild (options) {
const rightChild = this.createSimilar(options)
rightChild.parent = this
this.right = rightChild
return rightChild
}
/**
* Insert a new element
*/
insert (key, value) {
// Empty tree, insert as root
if (!Object.prototype.hasOwnProperty.call(this, 'key')) {
this.key = key
this.data.push(value)
return
}
// Same key as root
if (this.compareKeys(this.key, key) === 0) {
if (this.unique) {
const err = new Error(`Can't insert key ${JSON.stringify(key)}, it violates the unique constraint`)
err.key = key
err.errorType = 'uniqueViolated'
throw err
} else this.data.push(value)
return
}
if (this.compareKeys(key, this.key) < 0) {
// Insert in left subtree
if (this.left) this.left.insert(key, value)
else this.createLeftChild({ key: key, value: value })
} else {
// Insert in right subtree
if (this.right) this.right.insert(key, value)
else this.createRightChild({ key: key, value: value })
}
}
/**
* Search for all data corresponding to a key
*/
search (key) {
if (!Object.prototype.hasOwnProperty.call(this, 'key')) return []
if (this.compareKeys(this.key, key) === 0) return this.data
if (this.compareKeys(key, this.key) < 0) {
if (this.left) return this.left.search(key)
else return []
} else {
if (this.right) return this.right.search(key)
else return []
}
}
/**
* Return a function that tells whether a given key matches a lower bound
*/
getLowerBoundMatcher (query) {
// No lower bound
if (!Object.prototype.hasOwnProperty.call(query, '$gt') && !Object.prototype.hasOwnProperty.call(query, '$gte')) return () => true
if (Object.prototype.hasOwnProperty.call(query, '$gt') && Object.prototype.hasOwnProperty.call(query, '$gte')) {
if (this.compareKeys(query.$gte, query.$gt) === 0) return key => this.compareKeys(key, query.$gt) > 0
if (this.compareKeys(query.$gte, query.$gt) > 0) return key => this.compareKeys(key, query.$gte) >= 0
else return key => this.compareKeys(key, query.$gt) > 0
}
if (Object.prototype.hasOwnProperty.call(query, '$gt')) return key => this.compareKeys(key, query.$gt) > 0
else return key => this.compareKeys(key, query.$gte) >= 0
}
/**
* Return a function that tells whether a given key matches an upper bound
*/
getUpperBoundMatcher (query) {
// No upper bound
if (!Object.prototype.hasOwnProperty.call(query, '$lt') && !Object.prototype.hasOwnProperty.call(query, '$lte')) return () => true
if (Object.prototype.hasOwnProperty.call(query, '$lt') && Object.prototype.hasOwnProperty.call(query, '$lte')) {
if (this.compareKeys(query.$lte, query.$lt) === 0) return key => this.compareKeys(key, query.$lt) < 0
if (this.compareKeys(query.$lte, query.$lt) < 0) return key => this.compareKeys(key, query.$lte) <= 0
else return key => this.compareKeys(key, query.$lt) < 0
}
if (Object.prototype.hasOwnProperty.call(query, '$lt')) return key => this.compareKeys(key, query.$lt) < 0
else return key => this.compareKeys(key, query.$lte) <= 0
}
/**
* Get all data for a key between bounds
* Return it in key order
* @param {Object} query Mongo-style query where keys are $lt, $lte, $gt or $gte (other keys are not considered)
* @param {Functions} lbm/ubm matching functions calculated at the first recursive step
*/
betweenBounds (query, lbm, ubm) {
const res = []
if (!Object.prototype.hasOwnProperty.call(this, 'key')) return [] // Empty tree
lbm = lbm || this.getLowerBoundMatcher(query)
ubm = ubm || this.getUpperBoundMatcher(query)
if (lbm(this.key) && this.left) append(res, this.left.betweenBounds(query, lbm, ubm))
if (lbm(this.key) && ubm(this.key)) append(res, this.data)
if (ubm(this.key) && this.right) append(res, this.right.betweenBounds(query, lbm, ubm))
return res
}
/**
* Delete the current node if it is a leaf
* Return true if it was deleted
*/
deleteIfLeaf () {
if (this.left || this.right) return false
// The leaf is itself a root
if (!this.parent) {
delete this.key
this.data = []
return true
}
if (this.parent.left === this) this.parent.left = null
else this.parent.right = null
return true
}
/**
* Delete the current node if it has only one child
* Return true if it was deleted
*/
deleteIfOnlyOneChild () {
let child
if (this.left && !this.right) child = this.left
if (!this.left && this.right) child = this.right
if (!child) return false
// Root
if (!this.parent) {
this.key = child.key
this.data = child.data
this.left = null
if (child.left) {
this.left = child.left
child.left.parent = this
}
this.right = null
if (child.right) {
this.right = child.right
child.right.parent = this
}
return true
}
if (this.parent.left === this) {
this.parent.left = child
child.parent = this.parent
} else {
this.parent.right = child
child.parent = this.parent
}
return true
}
/**
* Delete a key or just a value
* @param {Key} key
* @param {Value} value Optional. If not set, the whole key is deleted. If set, only this value is deleted
*/
delete (key, value) {
const newData = []
let replaceWith
if (!Object.prototype.hasOwnProperty.call(this, 'key')) return
if (this.compareKeys(key, this.key) < 0) {
if (this.left) this.left.delete(key, value)
return
}
if (this.compareKeys(key, this.key) > 0) {
if (this.right) this.right.delete(key, value)
return
}
if (this.compareKeys(key, this.key) !== 0) return
// Delete only a value
if (this.data.length > 1 && value !== undefined) {
this.data.forEach(d => {
if (!this.checkValueEquality(d, value)) newData.push(d)
})
this.data = newData
return
}
// Delete the whole node
if (this.deleteIfLeaf()) return
if (this.deleteIfOnlyOneChild()) return
// We are in the case where the node to delete has two children
if (Math.random() >= 0.5) { // Randomize replacement to avoid unbalancing the tree too much
// Use the in-order predecessor
replaceWith = this.left.getMaxKeyDescendant()
this.key = replaceWith.key
this.data = replaceWith.data
if (this === replaceWith.parent) { // Special case
this.left = replaceWith.left
if (replaceWith.left) replaceWith.left.parent = replaceWith.parent
} else {
replaceWith.parent.right = replaceWith.left
if (replaceWith.left) replaceWith.left.parent = replaceWith.parent
}
} else {
// Use the in-order successor
replaceWith = this.right.getMinKeyDescendant()
this.key = replaceWith.key
this.data = replaceWith.data
if (this === replaceWith.parent) { // Special case
this.right = replaceWith.right
if (replaceWith.right) replaceWith.right.parent = replaceWith.parent
} else {
replaceWith.parent.left = replaceWith.right
if (replaceWith.right) replaceWith.right.parent = replaceWith.parent
}
}
}
/**
* Execute a function on every node of the tree, in key order
* @param {Function} fn Signature: node. Most useful will probably be node.key and node.data
*/
executeOnEveryNode (fn) {
if (this.left) this.left.executeOnEveryNode(fn)
fn(this)
if (this.right) this.right.executeOnEveryNode(fn)
}
/**
* Pretty print a tree
* @param {Boolean} printData To print the nodes' data along with the key
*/
prettyPrint (printData, spacing) {
spacing = spacing || ''
console.log(`${spacing}* ${this.key}`)
if (printData) console.log(`${spacing}* ${this.data}`)
if (!this.left && !this.right) return
if (this.left) this.left.prettyPrint(printData, `${spacing} `)
else console.log(`${spacing} *`)
if (this.right) this.right.prettyPrint(printData, `${spacing} `)
else console.log(`${spacing} *`)
}
}
// Append all elements in toAppend to array
function append (array, toAppend) {
for (let i = 0; i < toAppend.length; i += 1) {
array.push(toAppend[i])
}
}
// Interface
module.exports = BinarySearchTree

node_modules/@seald-io/binary-search-tree/lib/customUtils.js generated vendored Normal file

@ -0,0 +1,38 @@
/**
* Return an array with the numbers from 0 to n-1, in a random order
*/
const getRandomArray = n => {
if (n === 0) return []
if (n === 1) return [0]
const res = getRandomArray(n - 1)
const next = Math.floor(Math.random() * n)
res.splice(next, 0, n - 1) // Add n-1 at a random position in the array
return res
}
module.exports.getRandomArray = getRandomArray
/*
* Default compareKeys function will work for numbers, strings and dates
*/
const defaultCompareKeysFunction = (a, b) => {
if (a < b) return -1
if (a > b) return 1
if (a === b) return 0
const err = new Error("Couldn't compare elements")
err.a = a
err.b = b
throw err
}
module.exports.defaultCompareKeysFunction = defaultCompareKeysFunction
/**
* Check whether two values are equal (used in non-unique deletion)
*/
const defaultCheckValueEquality = (a, b) => a === b
module.exports.defaultCheckValueEquality = defaultCheckValueEquality

node_modules/@seald-io/binary-search-tree/package.json generated vendored Normal file

@ -0,0 +1,51 @@
{
"name": "@seald-io/binary-search-tree",
"version": "1.0.3",
"author": {
"name": "Timothée Rebours",
"email": "tim@seald.io",
"url": "https://www.seald.io/"
},
"files": [
"lib/**/*.js",
"index.js"
],
"contributors": [
{
"name": "Louis Chatriot",
"email": "louis.chatriot@gmail.com"
},
{
"name": "Timothée Rebours",
"email": "tim@seald.io",
"url": "https://www.seald.io/"
}
],
"description": "Different binary search tree implementations, including a self-balancing one (AVL)",
"keywords": [
"AVL tree",
"binary search tree",
"self-balancing",
"AVL tree"
],
"homepage": "https://github.com/seald/node-binary-search-tree",
"repository": {
"type": "git",
"url": "git@github.com:seald/node-binary-search-tree.git"
},
"devDependencies": {
"chai": "^4.3.4",
"mocha": "^8.4.0",
"semver": "^7.3.5",
"standard": "^16.0.3"
},
"scripts": {
"test": "mocha --reporter spec --timeout 10000",
"lint": "standard"
},
"main": "index.js",
"licence": "MIT",
"publishConfig": {
"access": "public"
}
}

node_modules/@seald-io/nedb/LICENSE.md generated vendored Normal file

@ -0,0 +1,19 @@
Copyright (c) 2013 Louis Chatriot <louis.chatriot@gmail.com>
Copyright (c) 2021 Seald [contact@seald.io](mailto:contact@seald.io);
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including without
limitation the rights to use, copy, modify, merge, publish, distribute,
sublicense, and/or sell copies of the Software, and to permit persons to whom
the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

node_modules/@seald-io/nedb/README.md generated vendored Executable file

@ -0,0 +1,905 @@
<img src="http://i.imgur.com/9O1xHFb.png" style="width: 25%; height: 25%; float: left;">
## The JavaScript Database
This module is a fork of [nedb](https://github.com/louischatriot/nedb)
written by Louis Chatriot.
Since the original maintainer doesn't support this package anymore, we forked it
and maintain it for the needs of [Seald](https://www.seald.io).
**Embedded persistent or in memory database for Node.js, Electron and browsers,
100% JavaScript, no binary dependency**. API is a subset of MongoDB's and it's
[plenty fast](#speed).
## Installation
Module name on npm is [`@seald-io/nedb`](https://www.npmjs.com/package/@seald-io/nedb).
```bash
npm install @seald-io/nedb
```
Then to import, you just have to:
```js
const Datastore = require('@seald-io/nedb')
```
## Documentation
The API is a subset of MongoDB's API (the most used operations).
Since version [3.0.0](./CHANGELOG.md#300---2022-03-16), NeDB provides a Promise-based equivalent for each function
which is suffixed with `Async`, for example `loadDatabaseAsync`.
The original callback-based interface is still available and fully retro-compatible
(as far as the test suites can tell); it is a shim to this Promise-based
version.
Don't hesitate to open an issue if it breaks something in your project.
The rest of the readme will only show the Promise-based API, the full
documentation is available in the [`API.md`](./API.md) file at the root of the
repository. It is generated by running `npm run generateDocs:markdown`.
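As a quick illustration (a sketch, not taken from the official docs), the same lookup in both styles:
```js
// Historical callback-based interface
db.findOne({ _id: 'id1' }, (err, doc) => {
  if (err) return console.error(err)
  console.log(doc)
})

// Promise-based equivalent, suffixed with Async
const doc = await db.findOneAsync({ _id: 'id1' })
```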
* [Creating/loading a database](#creatingloading-a-database)
* [Dropping a database](#dropping-a-database)
* [Persistence](#persistence)
* [Inserting documents](#inserting-documents)
* [Finding documents](#finding-documents)
* [Basic Querying](#basic-querying)
* [Operators ($lt, $lte, $gt, $gte, $in, $nin, $ne, $exists, $regex)](#operators-lt-lte-gt-gte-in-nin-ne-exists-regex)
* [Array fields](#array-fields)
* [Logical operators $or, $and, $not, $where](#logical-operators-or-and-not-where)
* [Sorting and paginating](#sorting-and-paginating)
* [Projections](#projections)
* [Counting documents](#counting-documents)
* [Updating documents](#updating-documents)
* [Removing documents](#removing-documents)
* [Indexing](#indexing)
* [Other environments](#other-environments)
### Creating/loading a database
You can use NeDB as an in-memory only datastore or as a persistent datastore.
One datastore is the equivalent of a MongoDB collection. The constructor is used
as follows [`new Datastore(options)` where `options` is an object](./API.md#new_Datastore_new).
If the Datastore is persistent (if you give it [`options.filename`](./API.md#Datastore+filename)),
you'll need to load the database using [`Datastore#loadDatabaseAsync`](./API.md#Datastore+loadDatabaseAsync),
or using [`options.autoload`](./API.md#Datastore+autoload).
```javascript
// Type 1: In-memory only datastore (no need to load the database)
const Datastore = require('@seald-io/nedb')
const db = new Datastore()
// Type 2: Persistent datastore with manual loading
const Datastore = require('@seald-io/nedb')
const db = new Datastore({ filename: 'path/to/datafile' })
try {
await db.loadDatabaseAsync()
} catch (error) {
// loading has failed
}
// loading has succeeded
// Type 3: Persistent datastore with automatic loading
const Datastore = require('@seald-io/nedb')
const db = new Datastore({ filename: 'path/to/datafile', autoload: true }) // You can await db.autoloadPromise to catch a potential error when autoloading.
// You can issue commands right away
// Of course you can create multiple datastores if you need several
// collections. In this case it's usually a good idea to use autoload for all collections.
db = {}
db.users = new Datastore('path/to/users.db')
db.robots = new Datastore('path/to/robots.db')
// You need to load each database
await db.users.loadDatabaseAsync()
await db.robots.loadDatabaseAsync()
```
### Dropping a database
Since v3.0.0, you can drop the database by using [`Datastore#dropDatabaseAsync`](./API.md#Datastore+dropDatabaseAsync):
```js
const Datastore = require('@seald-io/nedb')
const db = new Datastore()
await db.insertAsync({ hello: 'world' })
await db.dropDatabaseAsync()
assert.equal(db.getAllData().length, 0)
```
It is not recommended to keep using an instance of Datastore when its database
has been dropped as it may have some unintended side effects.
### Persistence
Under the hood, NeDB's [persistence](./API.md#Persistence) uses an append-only
format, meaning that all updates and deletes actually result in lines added at
the end of the datafile, for performance reasons. The database is automatically
compacted (i.e. put back in the one-line-per-document format) every time you
load each database within your application.
**Breaking change**: [since v3.0.0](./CHANGELOG.md#300---unreleased), calling methods of `yourDatabase.persistence`
is deprecated. The same functions exist directly on the `Datastore`.
You can manually call the compaction function
with [`yourDatabase#compactDatafileAsync`](./API.md#Datastore+compactDatafileAsync).
You can also set automatic compaction at regular intervals
with [`yourDatabase#setAutocompactionInterval`](./API.md#Datastore+setAutocompactionInterval),
and stop automatic compaction with [`yourDatabase#stopAutocompaction`](./API.md#Datastore+stopAutocompaction).
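A short sketch of these three calls together (assuming the interval is given in milliseconds):
```js
await db.compactDatafileAsync() // compact once, immediately
db.setAutocompactionInterval(5 * 60 * 1000) // then compact every 5 minutes
// ... later, when periodic compaction is no longer wanted
db.stopAutocompaction()
```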
### Inserting documents
The native types are `String`, `Number`, `Boolean`, `Date` and `null`. You can
also use arrays and subdocuments (objects). If a field is `undefined`, it will
not be saved (this is different from MongoDB, which transforms `undefined`
into `null`, something I find counter-intuitive).
If the document does not contain an `_id` field, NeDB will automatically
generate one for you (a 16-characters alphanumerical string). The `_id` of a
document, once set, cannot be modified.
Field names cannot start with '$' or contain the characters '.' and ','.
```javascript
const doc = {
hello: 'world',
n: 5,
today: new Date(),
nedbIsAwesome: true,
notthere: null,
notToBeSaved: undefined, // Will not be saved
fruits: ['apple', 'orange', 'pear'],
infos: { name: '@seald-io/nedb' }
}
try {
const newDoc = await db.insertAsync(doc)
// newDoc is the newly inserted document, including its _id
// newDoc has no key called notToBeSaved since its value was undefined
} catch (error) {
// if an error happens
}
```
You can also bulk-insert an array of documents. This operation is atomic,
meaning that if one insert fails due to a unique constraint being violated, all
changes are rolled back.
```javascript
const newDocs = await db.insertAsync([{ a: 5 }, { a: 42 }])
// Two documents were inserted in the database
// newDocs is an array with these documents, augmented with their _id
// If there is a unique constraint on field 'a', this will fail
try {
await db.insertAsync([{ a: 5 }, { a: 42 }, { a: 5 }])
} catch (error) {
// err is a 'uniqueViolated' error
// The database was not modified
}
```
### Finding documents
Use `findAsync` to look for multiple documents matching your query, or `findOneAsync` to
look for one specific document. You can select documents based on field equality
or use comparison operators (`$lt`, `$lte`, `$gt`, `$gte`, `$in`, `$nin`, `$ne`).
You can also use logical operators `$or`, `$and`, `$not` and `$where`. See
below for the syntax.
You can use regular expressions in two ways: in basic querying in place of a
string, or with the `$regex` operator.
You can sort and paginate results using the cursor API (see below).
You can use standard projections to restrict the fields to appear in the
results (see below).
#### Basic querying
Basic querying means you are looking for documents whose fields match the ones
you specify. You can use regular expressions to match strings. You can use the dot
notation to navigate inside nested documents, arrays, arrays of subdocuments and
to match a specific element of an array.
```javascript
// Let's say our datastore contains the following collection
// { _id: 'id1', planet: 'Mars', system: 'solar', inhabited: false, satellites: ['Phobos', 'Deimos'] }
// { _id: 'id2', planet: 'Earth', system: 'solar', inhabited: true, humans: { genders: 2, eyes: true } }
// { _id: 'id3', planet: 'Jupiter', system: 'solar', inhabited: false }
// { _id: 'id4', planet: 'Omicron Persei 8', system: 'futurama', inhabited: true, humans: { genders: 7 } }
// { _id: 'id5', completeData: { planets: [ { name: 'Earth', number: 3 }, { name: 'Mars', number: 2 }, { name: 'Pluton', number: 9 } ] } }
// Finding all planets in the solar system
const docs = await db.findAsync({ system: 'solar' })
// docs is an array containing documents Mars, Earth, Jupiter
// If no document is found, docs is equal to []
// Finding all planets whose name contain the substring 'ar' using a regular expression
const docs = await db.findAsync({ planet: /ar/ })
// docs contains Mars and Earth
// Finding all inhabited planets in the solar system
const docs = await db.findAsync({ system: 'solar', inhabited: true })
// docs is an array containing document Earth only
// Use the dot-notation to match fields in subdocuments
const docs = await db.findAsync({ 'humans.genders': 2 })
// docs contains Earth
// Use the dot-notation to navigate arrays of subdocuments
const docs = await db.findAsync({ 'completeData.planets.name': 'Mars' })
// docs contains document 5
const docs = await db.findAsync({ 'completeData.planets.name': 'Jupiter' })
// docs is empty
const docs = await db.findAsync({ 'completeData.planets.0.name': 'Earth' })
// docs contains document 5
// If we had tested against 'Mars' docs would be empty because we are matching against a specific array element
// You can also deep-compare objects. Don't confuse this with dot-notation!
const docs = await db.findAsync({ humans: { genders: 2 } })
// docs is empty, because { genders: 2 } is not equal to { genders: 2, eyes: true }
// Find all documents in the collection
const docs = await db.findAsync({})
// The same rules apply when you want to only find one document
const doc = await db.findOneAsync({ _id: 'id1' })
// doc is the document Mars
// If no document is found, doc is null
```
#### Operators ($lt, $lte, $gt, $gte, $in, $nin, $ne, $exists, $regex)
The syntax is `{ field: { $op: value } }` where `$op` is any comparison
operator:
* `$lt`, `$lte`: less than, less than or equal
* `$gt`, `$gte`: greater than, greater than or equal
* `$in`: member of. `value` must be an array of values
* `$ne`, `$nin`: not equal, not a member of
* `$exists`: checks whether the document possesses the property `field`. `value`
should be true or false
* `$regex`: checks whether a string is matched by the regular expression.
Contrary to MongoDB, the use of `$options` with `$regex` is not supported,
because it doesn't give you more power than regex flags. Basic queries are
more readable so only use the `$regex` operator when you need to use another
operator with it (see example below)
```javascript
// $lt, $lte, $gt and $gte work on numbers and strings
const docs = await db.findAsync({ 'humans.genders': { $gt: 5 } })
// docs contains Omicron Persei 8, whose humans have more than 5 genders (7).
// When used with strings, lexicographical order is used
const docs = await db.findAsync({ planet: { $gt: 'Mercury' } })
// docs contains Omicron Persei 8
// Using $in. $nin is used in the same way
const docs = await db.findAsync({ planet: { $in: ['Earth', 'Jupiter'] } })
// docs contains Earth and Jupiter
// Using $exists
const docs = await db.findAsync({ satellites: { $exists: true } })
// docs contains only Mars
// Using $regex with another operator
const docs = await db.findAsync({
planet: {
$regex: /ar/,
$nin: ['Jupiter', 'Earth']
}
})
// docs only contains Mars because Earth was excluded from the match by $nin
```
#### Array fields
When a field in a document is an array, NeDB first tries to see if the query
value is an array to perform an exact match, then whether there is an
array-specific comparison function (for now there is only `$size`
and `$elemMatch`) being used. If not, the query is treated as a query on every
element and there is a match if at least one element matches.
* `$size`: match on the size of the array
* `$elemMatch`: matches if at least one array element matches the query entirely
```javascript
// Exact match
const docs = await db.findAsync({ satellites: ['Phobos', 'Deimos'] })
// docs contains Mars
const docs = await db.findAsync({ satellites: ['Deimos', 'Phobos'] })
// docs is empty
// Using an array-specific comparison function
// $elemMatch operator will provide match for a document, if an element from the array field satisfies all the conditions specified with the `$elemMatch` operator
const docs = await db.findAsync({
completeData: {
planets: {
$elemMatch: {
name: 'Earth',
number: 3
}
}
}
})
// docs contains documents with id 5 (completeData)
const docs = await db.findAsync({
completeData: {
planets: {
$elemMatch: {
name: 'Earth',
number: 5
}
}
}
})
// docs is empty
// You can use any known document query operator inside a $elemMatch query
const docs = await db.findAsync({
completeData: {
planets: {
$elemMatch: {
name: 'Earth',
number: { $gt: 2 }
}
}
}
})
// docs contains documents with id 5 (completeData)
// Note: you can't use nested comparison functions, e.g. { $size: { $lt: 5 } } will throw an error
const docs = await db.findAsync({ satellites: { $size: 2 } })
// docs contains Mars
const docs = await db.findAsync({ satellites: { $size: 1 } })
// docs is empty
// If a document's field is an array, matching it means matching any element of the array
const docs = await db.findAsync({ satellites: 'Phobos' })
// docs contains Mars. Result would have been the same if query had been { satellites: 'Deimos' }
// This also works for queries that use comparison operators
const docs = await db.findAsync({ satellites: { $lt: 'Amos' } })
// docs is empty since Phobos and Deimos are after Amos in lexicographical order
// This also works with the $in and $nin operator
const docs = await db.findAsync({ satellites: { $in: ['Moon', 'Deimos'] } })
// docs contains Mars (the Earth document is not complete!)
```
#### Logical operators $or, $and, $not, $where
You can combine queries using logical operators:
* For `$or` and `$and`, the syntax is `{ $op: [query1, query2, ...] }`.
* For `$not`, the syntax is `{ $not: query }`
* For `$where`, the syntax
is `{ $where: function () { /* object is 'this', return a boolean */ } }`
```javascript
const docs = await db.findAsync({ $or: [{ planet: 'Earth' }, { planet: 'Mars' }] })
// docs contains Earth and Mars
const docs = await db.findAsync({ $not: { planet: 'Earth' } })
// docs contains Mars, Jupiter, Omicron Persei 8
const docs = await db.findAsync({ $where: function () { return Object.keys(this).length > 6 } })
// docs contains the documents with more than 6 properties
// You can mix normal queries, comparison queries and logical operators
const docs = await db.findAsync({
$or: [{ planet: 'Earth' }, { planet: 'Mars' }],
inhabited: true
})
// docs contains Earth
```
#### Sorting and paginating
[`Datastore#findAsync`](./API.md#Datastore+findAsync),
[`Datastore#findOneAsync`](./API.md#Datastore+findOneAsync) and
[`Datastore#countAsync`](./API.md#Datastore+countAsync) don't
actually return a `Promise`, but a [`Cursor`](./API.md#Cursor) which is a
[`Thenable`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/await#thenable_objects)
which calls [`Cursor#execAsync`](./API.md#Cursor+execAsync) when awaited.
This pattern allows to chain [`Cursor#sort`](./API.md#Cursor+sort),
[`Cursor#skip`](./API.md#Cursor+skip),
[`Cursor#limit`](./API.md#Cursor+limit) and
[`Cursor#projection`](./API.md#Cursor+projection) and await the result.
```javascript
// Let's say the database contains these 4 documents
// doc1 = { _id: 'id1', planet: 'Mars', system: 'solar', inhabited: false, satellites: ['Phobos', 'Deimos'] }
// doc2 = { _id: 'id2', planet: 'Earth', system: 'solar', inhabited: true, humans: { genders: 2, eyes: true } }
// doc3 = { _id: 'id3', planet: 'Jupiter', system: 'solar', inhabited: false }
// doc4 = { _id: 'id4', planet: 'Omicron Persei 8', system: 'futurama', inhabited: true, humans: { genders: 7 } }
// No query used means all results are returned (before the Cursor modifiers)
const docs = await db.findAsync({}).sort({ planet: 1 }).skip(1).limit(2)
// docs is [doc3, doc1]
// You can sort in reverse order like this
const docs = await db.findAsync({ system: 'solar' }).sort({ planet: -1 })
// docs is [doc1, doc3, doc2]
// You can sort on one field, then another, and so on like this:
const docs = await db.findAsync({}).sort({ firstField: 1, secondField: -1 })
// ... You understand how this works!
```
#### Projections
You can give `findAsync` and `findOneAsync` an optional second argument, `projections`.
The syntax is the same as MongoDB: `{ a: 1, b: 1 }` to return only the `a`
and `b` fields, `{ a: 0, b: 0 }` to omit these two fields. You cannot use both
modes at the time, except for `_id` which is by default always returned and
which you can choose to omit. You can project on nested documents.
```javascript
// Same database as above
// Keeping only the given fields
const docs = await db.findAsync({ planet: 'Mars' }, { planet: 1, system: 1 })
// docs is [{ planet: 'Mars', system: 'solar', _id: 'id1' }]
// Keeping only the given fields but removing _id
const docs = await db.findAsync({ planet: 'Mars' }, {
planet: 1,
system: 1,
_id: 0
})
// docs is [{ planet: 'Mars', system: 'solar' }]
// Omitting only the given fields and removing _id
const docs = await db.findAsync({ planet: 'Mars' }, {
planet: 0,
system: 0,
_id: 0
})
// docs is [{ inhabited: false, satellites: ['Phobos', 'Deimos'] }]
// Failure: using both modes at the same time
const docs = await db.findAsync({ planet: 'Mars' }, { planet: 0, system: 1 })
// rejects with an error, since both projection modes cannot be used at the same time
// You can also use it in a Cursor way but this syntax is not compatible with MongoDB
const docs = await db.findAsync({ planet: 'Mars' }).projection({
planet: 1,
system: 1
})
// docs is [{ planet: 'Mars', system: 'solar', _id: 'id1' }]
// Project on a nested document
const doc = await db.findOneAsync({ planet: 'Earth' }).projection({
planet: 1,
'humans.genders': 1
})
// doc is { planet: 'Earth', _id: 'id2', humans: { genders: 2 } }
```
### Counting documents
You can use `countAsync` to count documents. It has the same syntax as `findAsync`.
For example:
```javascript
// Count all planets in the solar system
const count = await db.countAsync({ system: 'solar' })
// count equals to 3
// Count all documents in the datastore
const count = await db.countAsync({})
// count equals to 4
```
### Updating documents
[`db.updateAsync(query, update, options)`](./API.md#Datastore+updateAsync)
will update all documents matching `query` according to the `update` rules.
`update` specifies how the documents should be modified. It is either a new
document or a set of modifiers (you cannot use both together):
* A new document will replace the matched docs;
* Modifiers create the fields they need to modify if they don't exist,
and you can apply them to subdocs (see [the API reference]((./API.md#Datastore+updateAsync)))
`options` is an object with three possible parameters:
* `multi` which allows the modification of several documents if set to true.
* `upsert`, if set to `true`, will insert a new document when no document
matches the query (the inserted document is either the `update` itself when it
is a simple object with no modifiers, or the `query` modified by the modifiers
in the `update`).
* `returnUpdatedDocs` will return the array of documents matched by the find
query and updated (updated documents will be returned even if the update did not
actually modify them) if set to `true`.
It resolves into an Object with the following properties:
- `numAffected`: how many documents were affected by the update;
- `upsert`: if a document was actually upserted (not always the same as `options.upsert`);
- `affectedDocuments`:
- if `upsert` is `true`, the document upserted;
- if `options.returnUpdatedDocs` is `true`, either the affected document or, if `options.multi` is `true`, an Array of the affected documents, else `null`;
**Note**: you can't change a document's _id.
```javascript
// Let's use the same example collection as in the 'finding document' part
// { _id: 'id1', planet: 'Mars', system: 'solar', inhabited: false }
// { _id: 'id2', planet: 'Earth', system: 'solar', inhabited: true }
// { _id: 'id3', planet: 'Jupiter', system: 'solar', inhabited: false }
// { _id: 'id4', planet: 'Omicron Persei 8', system: 'futurama', inhabited: true }
// Replace a document by another
const { numAffected } = await db.updateAsync({ planet: 'Jupiter' }, { planet: 'Pluton' }, {})
// numAffected = 1
// The doc #3 has been replaced by { _id: 'id3', planet: 'Pluton' }
// Note that the _id is kept unchanged, and the document has been replaced
// (the 'system' and inhabited fields are not here anymore)
// Set an existing field's value
const { numAffected } = await db.updateAsync({ system: 'solar' }, { $set: { system: 'solar system' } }, { multi: true })
// numAffected = 3
// Field 'system' on Mars, Earth, Jupiter now has value 'solar system'
// Setting the value of a non-existing field in a subdocument by using the dot-notation
await db.updateAsync({ planet: 'Mars' }, {
$set: {
'data.satellites': 2,
'data.red': true
}
}, {})
// Mars document now is { _id: 'id1', system: 'solar', inhabited: false
// , data: { satellites: 2, red: true }
// }
// Note that to set fields in subdocuments, you HAVE to use dot-notation
// Using object-notation will just replace the top-level field
await db.updateAsync({ planet: 'Mars' }, { $set: { data: { satellites: 3 } } }, {})
// Mars document now is { _id: 'id1', system: 'solar', inhabited: false
// , data: { satellites: 3 }
// }
// You lost the 'data.red' field which is probably not the intended behavior
// Deleting a field
await db.updateAsync({ planet: 'Mars' }, { $unset: { planet: true } }, {})
// Now the document for Mars doesn't contain the planet field
// You can unset nested fields with the dot notation of course
// Upserting a document
const { numAffected, affectedDocuments, upsert } = await db.updateAsync({ planet: 'Pluton' }, {
planet: 'Pluton',
inhabited: false
}, { upsert: true })
// numAffected = 1, affectedDocuments = { _id: 'id5', planet: 'Pluton', inhabited: false }, upsert = true
// A new document { _id: 'id5', planet: 'Pluton', inhabited: false } has been added to the collection
// If you upsert with a modifier, the upserted doc is the query modified by the modifier
// This is simpler than it sounds :)
await db.updateAsync({ planet: 'Pluton' }, { $inc: { distance: 38 } }, { upsert: true })
// A new document { _id: 'id5', planet: 'Pluton', distance: 38 } has been added to the collection
// If we insert a new document { _id: 'id6', fruits: ['apple', 'orange', 'pear'] } in the collection,
// let's see how we can modify the array field atomically
// (each example below starts over from this initial fruits array)
// $push inserts new elements at the end of the array
await db.updateAsync({ _id: 'id6' }, { $push: { fruits: 'banana' } }, {})
// Now the fruits array is ['apple', 'orange', 'pear', 'banana']
// $pop removes an element from the end (if used with 1) or the front (if used with -1) of the array
await db.updateAsync({ _id: 'id6' }, { $pop: { fruits: 1 } }, {})
// Now the fruits array is ['apple', 'orange']
// With { $pop: { fruits: -1 } }, it would have been ['orange', 'pear']
// $addToSet adds an element to an array only if it isn't already in it
// Equality is deep-checked (i.e. $addToSet will not insert an object in an array already containing the same object)
// Note that it doesn't check whether the array contained duplicates before or not
await db.updateAsync({ _id: 'id6' }, { $addToSet: { fruits: 'apple' } }, {})
// The fruits array didn't change
// If we had used a fruit not in the array, e.g. 'banana', it would have been added to the array
// $pull removes all values matching a value or even any NeDB query from the array
await db.updateAsync({ _id: 'id6' }, { $pull: { fruits: 'apple' } }, {})
// Now the fruits array is ['orange', 'pear']
await db.updateAsync({ _id: 'id6' }, { $pull: { fruits: { $in: ['apple', 'pear'] } } }, {})
// Now the fruits array is ['orange']
// $each can be used to $push or $addToSet multiple values at once
// This example works the same way with $addToSet
await db.updateAsync({ _id: 'id6' }, { $push: { fruits: { $each: ['banana', 'orange'] } } }, {})
// Now the fruits array is ['apple', 'orange', 'pear', 'banana', 'orange']
// $slice can be used in conjunction with $push and $each to limit the size of the resulting array.
// A value of 0 will update the array to an empty array. A positive value n will keep only the first n elements
// A negative value -n will keep only the last n elements.
// If $slice is specified but not $each, $each is set to []
await db.updateAsync({ _id: 'id6' }, {
$push: {
fruits: {
$each: ['banana'],
$slice: 2
}
}
})
// Now the fruits array is ['apple', 'orange']
// $min/$max to update only if provided value is less/greater than current value
// Let's say the database contains this document
// doc = { _id: 'id1', name: 'Name', value: 5 }
await db.updateAsync({ _id: 'id1' }, { $min: { value: 2 } }, {})
// The document will be updated to { _id: 'id1', name: 'Name', value: 2 }
await db.updateAsync({ _id: 'id1' }, { $min: { value: 8 } }, {})
// The document will not be modified
```
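The `returnUpdatedDocs` option is not demonstrated above; here is a minimal sketch of how it combines with `multi`, using the same example collection:

```javascript
// Ask for the post-update documents to be returned
const { affectedDocuments } = await db.updateAsync(
  { system: 'solar' },
  { $set: { inhabited: false } },
  { multi: true, returnUpdatedDocs: true }
)
// affectedDocuments is an Array containing the three updated solar-system documents
// (with multi: false it would be a single document, or null if nothing matched)
```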
### Removing documents
[`db.removeAsync(query, options)`](./API.md#Datastore+removeAsync)
will remove documents matching `query`. It can remove multiple documents if
`options.multi` is set to `true`. It returns a `Promise` that resolves to the number of removed documents.
```javascript
// Let's use the same example collection as in the "finding document" part
// { _id: 'id1', planet: 'Mars', system: 'solar', inhabited: false }
// { _id: 'id2', planet: 'Earth', system: 'solar', inhabited: true }
// { _id: 'id3', planet: 'Jupiter', system: 'solar', inhabited: false }
// { _id: 'id4', planet: 'Omicron Persei 8', system: 'futurama', inhabited: true }
// Remove one document from the collection
// options set to {} since the default for multi is false
const numRemoved = await db.removeAsync({ _id: 'id2' }, {})
// numRemoved = 1
// Remove multiple documents
const numRemovedMulti = await db.removeAsync({ system: 'solar' }, { multi: true })
// numRemovedMulti = 3
// All planets from the solar system were removed
// Removing all documents with the 'match-all' query
const numRemovedAll = await db.removeAsync({}, { multi: true })
```
### Indexing
NeDB supports indexing. It gives a very nice speed boost and can be used to
enforce a unique constraint on a field. You can index any field, including
fields in nested documents using the dot notation. For now, indexes are only
used to speed up basic queries and queries using `$in`, `$lt`, `$lte`, `$gt`
and `$gte`. The indexed values cannot be of type array or object.
**Breaking change**: [since v4.0.0](./CHANGELOG.md#400---2023-01-20), commas (`,`) can no longer be used in indexed field names.
The following is illegal:
```javascript
db.ensureIndexAsync({ fieldName: 'some,field' })
db.ensureIndexAsync({ fieldName: ['some,field', 'other,field'] })
```
This is a side effect of the compound index implementation: internally, a compound index stores its field names joined with commas.
To create an index, use [`datastore#ensureIndexAsync(options)`](./API.md#Datastore+ensureIndexAsync).
It resolves when the index is persisted on disk (if the database is persistent)
and may throw an Error (usually because a unique constraint was violated). It can
be called whenever you want, even after some data was inserted, though it's best to
call it at application startup. The options are:
* **fieldName** (required): name of the field to index. Use the dot notation to
index a field in a nested document. For a compound index, use an array of field names.
* **unique** (optional, defaults to `false`): enforce field uniqueness.
* **sparse** (optional, defaults to `false`): don't index documents for which
the field is not defined.
* **expireAfterSeconds** (number of seconds, optional): if set, the created
index is a TTL (time to live) index, that will automatically remove documents
whose indexed field value is older than `expireAfterSeconds` seconds.
Note: the `_id` is automatically indexed with a unique constraint.
You can remove a previously created index with
[`datastore#removeIndexAsync(fieldName)`](./API.md#Datastore+removeIndexAsync).
```javascript
try {
await db.ensureIndexAsync({ fieldName: 'somefield' })
} catch (error) {
// If there was an error, error is not null
}
// Using a unique constraint with the index
await db.ensureIndexAsync({ fieldName: 'somefield', unique: true })
// Using a sparse unique index
await db.ensureIndexAsync({
fieldName: 'somefield',
unique: true,
sparse: true
})
// Using a compound index
await db.ensureIndexAsync({ fieldName: ['field1', 'field2'] })
try {
// Format of the error message when the unique constraint is not met
await db.insertAsync({ somefield: '@seald-io/nedb' })
// works
await db.insertAsync({ somefield: '@seald-io/nedb' })
// rejects
} catch (error) {
// error is { errorType: 'uniqueViolated',
// key: 'somefield',
// message: 'Unique constraint violated for key somefield' }
}
// Remove index on field somefield
await db.removeIndexAsync('somefield')
// Example of using expireAfterSeconds to remove documents 1 hour
// after their creation (db's timestampData option is true here)
await db.ensureIndexAsync({
fieldName: 'createdAt',
expireAfterSeconds: 3600
})
// You can also use the option to set an expiration date like so
await db.ensureIndexAsync({
fieldName: 'expirationDate',
expireAfterSeconds: 0
})
// Now all documents will expire when system time reaches the date in their
// expirationDate field
```
## Other environments
NeDB runs on Node.js (it is tested on Node 12, 14 and 16), the browser (it is
tested on the latest version of Chrome) and React-Native using
[@react-native-async-storage/async-storage](https://github.com/react-native-async-storage/async-storage).
### Browser bundle
The npm package contains a bundle and its minified counterpart for the browser.
They are located in the `browser-version/out` directory. You only need to include `nedb.js`
or `nedb.min.js` in your HTML file, and the global object `Nedb` can be used
right away, with the same API as the server version:
```html
<script src="nedb.min.js"></script>
<script>
var db = new Nedb(); // Create an in-memory only datastore
db.insert({ planet: 'Earth' }, function (err) {
db.find({}, function (err, docs) {
// docs contains the planet Earth
});
});
</script>
```
If you specify a `filename`, the database will be persistent, and automatically
select the best storage method available using [localforage](https://github.com/localForage/localForage)
(IndexedDB, WebSQL or localStorage) depending on the browser. In most cases that
means a lot of data can be stored, typically hundreds of MB.
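For instance, here is a minimal sketch of a persistent browser datastore (the filename `'mydb'` is arbitrary; the constructor options are the same as in the server version):

```javascript
var db = new Nedb({ filename: 'mydb', autoload: true }); // Persisted via localforage
db.insert({ planet: 'Venus' }, function (err) {
  // The document is now stored in IndexedDB, WebSQL or localStorage
});
```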
**WARNING**: the storage system changed between
v1.3 and v1.4 and is NOT backwards-compatible! Your application needs to resync
client-side when you upgrade NeDB.
NeDB uses modern Javascript features such as `async`, `Promise`, `class`, `const`,
`let`, `Set`, `Map`, ... The bundle does not polyfill these features. If you
need to target another environment, you will need to make your own bundle.
### Using the `browser` and `react-native` fields
NeDB uses the `browser` and `react-native` fields to replace some modules with an
environment-specific shim.
This works by relying on the bundler that packages NeDB to honor these
fields. This is [done by default by Webpack](https://webpack.js.org/configuration/resolve/#resolvealiasfields)
for the `browser` field, and [done by default by Metro](https://github.com/facebook/metro/blob/c21daba415ea26511e157f794689caab9abe8236/packages/metro/src/ModuleGraph/node-haste/Package.js#L108)
for the `react-native` field.
This is done for:
- the [storage module](./lib/storage.js) which uses Node.js `fs`. It is
[replaced in the browser](./browser-version/lib/storage.browser.js) by one that uses
[localforage](https://github.com/localForage/localForage), and
[in `react-native`](./browser-version/lib/storage.react-native.js) by one that uses
[@react-native-async-storage/async-storage](https://github.com/react-native-async-storage/async-storage)
- the [customUtils module](./browser-version/lib/customUtils.js) which uses the Node.js
`crypto` module. It is replaced by a good-enough shim that generates ids using `Math.random()`.
- the [byline module](./browser-version/lib/byline.js) which uses Node.js `stream`
(a fork of [`node-byline`](https://github.com/jahewson/node-byline) included in
the repo because it is unmaintained). It isn't used in the browser or
react-native versions, therefore it is shimmed with an empty object.
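As a sketch, this replacement corresponds to mappings of roughly the following shape in the `package.json` (paths taken from the links above; this is an illustration, not necessarily the exact manifest):

```json
{
  "browser": {
    "./lib/storage.js": "./browser-version/lib/storage.browser.js",
    "./lib/customUtils.js": "./browser-version/lib/customUtils.js",
    "./lib/byline.js": "./browser-version/lib/byline.js"
  },
  "react-native": {
    "./lib/storage.js": "./browser-version/lib/storage.react-native.js"
  }
}
```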
However, the `browser` and `react-native` versions rely on Node.js native modules and therefore must be polyfilled:
- `util` with https://github.com/browserify/node-util.
- `events` with https://github.com/browserify/events.
## Performance
### Speed
NeDB is not intended to be a replacement for large-scale databases such as
MongoDB, and as such was not designed for speed. That said, it is still pretty
fast on the expected datasets, especially if you use indexing. On a typical,
not-so-fast dev machine, for a collection containing 10,000 documents, with
indexing:
* Insert: **10,680 ops/s**
* Find: **43,290 ops/s**
* Update: **8,000 ops/s**
* Remove: **11,750 ops/s**
You can run these simple benchmarks by executing the scripts in the `benchmarks`
folder. Run them with the `--help` flag to see how they work.
### Memory footprint
A copy of the whole database is kept in memory. This is not much on the expected
kind of datasets (20MB for 10,000 2KB documents).
## Use in other services
* An ODM for NeDB: [follicle](https://github.com/seald/follicle)
* A layer to add a promise-only interface: [nedb-promises](https://www.npmjs.com/package/nedb-promises)
## Modernization
This fork of NeDB will be incrementally updated to:
* [ ] clean up the benchmarks and update the performance statistics;
* [ ] remove deprecated features;
* [ ] add a way to change the `Storage` module by dependency injection, which will
pave the way to cleaner browser and react-native versions (cf https://github.com/seald/nedb/pull/19);
* [x] use `async` functions and `Promise`s instead of callbacks (removing the `async@0.2.6` dependency);
* [x] expose a `Promise`-based interface;
* [x] remove the `underscore` dependency.
## Pull requests guidelines
If you submit a pull request, thanks! There are a couple of rules to follow though
to make it manageable:
* The pull request should be atomic, i.e. contain only one feature. If it
contains more, please submit multiple pull requests. Reviewing massive, 1,000+
LOC pull requests is extremely hard.
* Likewise, if for one unique feature the pull request grows too large (more
than 200 LOC, tests not included), please get in touch first.
* Please stick to the current coding style. It's important that the code uses a
coherent style for readability (this package uses [`standard`](https://standardjs.com/)).
* Do not include stylistic improvements ('housekeeping'). If you think one part
deserves lots of housekeeping, use a separate pull request so as not to
pollute the code.
* Don't forget tests for your new feature. Also don't forget to run the whole
test suite before submitting to make sure you didn't introduce regressions.
* Update the JSDoc and regenerate [the markdown docs](./API.md).
* Update the readme accordingly.
## License
See [License](LICENSE.md)

View File

@ -0,0 +1 @@
module.exports = {}

View File

@ -0,0 +1,76 @@
/**
* Utility functions that need to be reimplemented for each environment.
* This is the version for the browser & React-Native
* @module customUtilsBrowser
* @private
*/
/**
* Taken from the crypto-browserify module
* https://github.com/dominictarr/crypto-browserify
* NOTE: Math.random() does not guarantee "cryptographic quality" but we actually don't need it
* @param {number} size in bytes
* @return {Array<number>}
*/
const randomBytes = size => {
const bytes = new Array(size)
for (let i = 0, r; i < size; i++) {
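// Draw 32 fresh random bits once every 4 bytes, then extract one byte per iteration by shifting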
if ((i & 0x03) === 0) r = Math.random() * 0x100000000
bytes[i] = r >>> ((i & 0x03) << 3) & 0xff
}
return bytes
}
/**
* Taken from the base64-js module
* https://github.com/beatgammit/base64-js/
* @param {Array} uint8
* @return {string}
*/
const byteArrayToBase64 = uint8 => {
const lookup = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/'
const extraBytes = uint8.length % 3 // if we have 1 byte left, pad 2 bytes
let output = ''
let temp
const tripletToBase64 = num => lookup[num >> 18 & 0x3F] + lookup[num >> 12 & 0x3F] + lookup[num >> 6 & 0x3F] + lookup[num & 0x3F]
// go through the array every three bytes, we'll deal with trailing stuff later
for (let i = 0, length = uint8.length - extraBytes; i < length; i += 3) {
temp = (uint8[i] << 16) + (uint8[i + 1] << 8) + (uint8[i + 2])
output += tripletToBase64(temp)
}
// pad the end with zeros, but make sure to not forget the extra bytes
if (extraBytes === 1) {
temp = uint8[uint8.length - 1]
output += lookup[temp >> 2]
output += lookup[(temp << 4) & 0x3F]
output += '=='
} else if (extraBytes === 2) {
temp = (uint8[uint8.length - 2] << 8) + (uint8[uint8.length - 1])
output += lookup[temp >> 10]
output += lookup[(temp >> 4) & 0x3F]
output += lookup[(temp << 2) & 0x3F]
output += '='
}
return output
}
/**
* Return a random alphanumerical string of length len
* There is a very small probability (less than 1/1,000,000) for the length to be less than len
* (if the base64 conversion yields too many pluses and slashes) but
* that's not an issue here
* The probability of a collision is extremely small (need 3*10^12 documents to have one chance in a million of a collision)
* See http://en.wikipedia.org/wiki/Birthday_problem
* @param {number} len
* @return {string}
* @alias module:customUtilsBrowser.uid
*/
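// Generate at least twice as many random bytes as needed (minimum 8) so that, after base64 encoding and stripping '+' and '/', at least len characters remain in almost all cases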
const uid = len => byteArrayToBase64(randomBytes(Math.ceil(Math.max(8, len * 2)))).replace(/[+/]/g, '').slice(0, len)
module.exports.uid = uid

View File

@ -0,0 +1,191 @@
/**
* Way data is stored for this database
*
* This version is the browser version and uses [localforage]{@link https://github.com/localForage/localForage} which chooses the best option depending on user browser (IndexedDB then WebSQL then localStorage).
* @module storageBrowser
* @see module:storage
* @see module:storageReactNative
* @private
*/
const localforage = require('localforage')
// Configure localforage to display NeDB name for now. Would be a good idea to let the user set their own app name
const store = localforage.createInstance({
name: 'NeDB',
storeName: 'nedbdata'
})
/**
* Returns Promise<true> if file exists.
*
* @param {string} file
* @return {Promise<boolean>}
* @async
* @alias module:storageBrowser.existsAsync
*/
const existsAsync = async file => {
try {
const value = await store.getItem(file)
if (value !== null) return true // Even if value is undefined, localforage returns null
return false
} catch (error) {
return false
}
}
/**
* Moves the item from one path to another.
* @param {string} oldPath
* @param {string} newPath
* @return {Promise<void>}
* @alias module:storageBrowser.renameAsync
* @async
*/
const renameAsync = async (oldPath, newPath) => {
try {
const value = await store.getItem(oldPath)
if (value === null) await store.removeItem(newPath)
else {
await store.setItem(newPath, value)
await store.removeItem(oldPath)
}
} catch (err) {
console.warn('An error happened while renaming, skip')
}
}
/**
* Saves the item at given path.
* @param {string} file
* @param {string} data
* @param {object} [options]
* @return {Promise<void>}
* @alias module:storageBrowser.writeFileAsync
* @async
*/
const writeFileAsync = async (file, data, options) => {
// Options do not matter in browser setup
try {
await store.setItem(file, data)
} catch (error) {
console.warn('An error happened while writing, skip')
}
}
/**
* Append to the item at given path.
* @function
* @param {string} filename
* @param {string} toAppend
* @param {object} [options]
* @return {Promise<void>}
* @alias module:storageBrowser.appendFileAsync
* @async
*/
const appendFileAsync = async (filename, toAppend, options) => {
// Options do not matter in browser setup
try {
const contents = (await store.getItem(filename)) || ''
await store.setItem(filename, contents + toAppend)
} catch (error) {
console.warn('An error happened while appending to the file, skip')
}
}
/**
* Read data at given path.
* @function
* @param {string} filename
* @param {object} [options]
* @return {Promise<string>}
* @alias module:storageBrowser.readFileAsync
* @async
*/
const readFileAsync = async (filename, options) => {
try {
return (await store.getItem(filename)) || ''
} catch (error) {
console.warn('An error happened while reading, skip')
return ''
}
}
/**
* Async version of {@link module:storageBrowser.unlink}.
* @function
* @param {string} filename
* @return {Promise<void>}
* @async
* @alias module:storageBrowser.unlink
*/
const unlinkAsync = async filename => {
try {
await store.removeItem(filename)
} catch (error) {
console.warn('An error happened while unlinking, skip')
}
}
/**
* Shim for {@link module:storage.mkdirAsync}, nothing to do, no directories will be used on the browser.
* @function
* @param {string} path
* @param {object} [options]
* @return {Promise<void|string>}
* @alias module:storageBrowser.mkdirAsync
* @async
*/
const mkdirAsync = (path, options) => Promise.resolve()
/**
* Shim for {@link module:storage.ensureParentDirectoryExistsAsync}, nothing to do, no directories will be used on the browser.
* @function
* @param {string} file
* @param {number} [mode]
* @return {Promise<void|string>}
* @alias module:storageBrowser.ensureParentDirectoryExistsAsync
* @async
*/
const ensureParentDirectoryExistsAsync = async (file, mode) => Promise.resolve()
/**
* Shim for {@link module:storage.ensureDatafileIntegrityAsync}, nothing to do, no data corruption possible in the browser.
* @param {string} filename
* @return {Promise<void>}
* @alias module:storageBrowser.ensureDatafileIntegrityAsync
*/
const ensureDatafileIntegrityAsync = (filename) => Promise.resolve()
/**
* Fully write or rewrite the datafile, immune to crashes during the write operation (data will not be lost)
* @param {string} filename
* @param {string[]} lines
* @return {Promise<void>}
* @alias module:storageBrowser.crashSafeWriteFileLinesAsync
*/
const crashSafeWriteFileLinesAsync = async (filename, lines) => {
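// No temporary-file dance is needed here: the whole payload is handed to localforage in a single setItem call (via writeFileAsync)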
lines.push('') // Add final new line
await writeFileAsync(filename, lines.join('\n'))
}
// Interface
module.exports.existsAsync = existsAsync
module.exports.renameAsync = renameAsync
module.exports.writeFileAsync = writeFileAsync
module.exports.crashSafeWriteFileLinesAsync = crashSafeWriteFileLinesAsync
module.exports.appendFileAsync = appendFileAsync
module.exports.readFileAsync = readFileAsync
module.exports.unlinkAsync = unlinkAsync
module.exports.mkdirAsync = mkdirAsync
module.exports.ensureDatafileIntegrityAsync = ensureDatafileIntegrityAsync
module.exports.ensureParentDirectoryExistsAsync = ensureParentDirectoryExistsAsync

View File

@ -0,0 +1,295 @@
/**
* Way data is stored for this database
*
* This version is the React-Native version and uses [@react-native-async-storage/async-storage]{@link https://github.com/react-native-async-storage/async-storage}.
* @module storageReactNative
* @see module:storageBrowser
* @see module:storage
* @private
*/
const AsyncStorage = require('@react-native-async-storage/async-storage').default
const { callbackify } = require('util')
/**
* Async version of {@link module:storageReactNative.exists}.
* @param {string} file
* @return {Promise<boolean>}
* @async
* @alias module:storageReactNative.existsAsync
* @see module:storageReactNative.exists
*/
const existsAsync = async file => {
try {
const value = await AsyncStorage.getItem(file)
if (value !== null) return true // Even if value is undefined, AsyncStorage returns null
return false
} catch (error) {
return false
}
}
/**
* @callback module:storageReactNative~existsCallback
* @param {boolean} exists
*/
/**
* Callback returns true if file exists
* @function
* @param {string} file
* @param {module:storageReactNative~existsCallback} cb
* @alias module:storageReactNative.exists
*/
const exists = callbackify(existsAsync)
/**
* Async version of {@link module:storageReactNative.rename}.
* @param {string} oldPath
* @param {string} newPath
* @return {Promise<void>}
* @alias module:storageReactNative.renameAsync
* @async
* @see module:storageReactNative.rename
*/
const renameAsync = async (oldPath, newPath) => {
try {
const value = await AsyncStorage.getItem(oldPath)
if (value === null) await AsyncStorage.removeItem(newPath)
else {
await AsyncStorage.setItem(newPath, value)
await AsyncStorage.removeItem(oldPath)
}
} catch (err) {
console.warn('An error happened while renaming, skip')
}
}
/**
* Moves the item from one path to another
* @function
* @param {string} oldPath
* @param {string} newPath
* @param {NoParamCallback} callback
* @return {void}
* @alias module:storageReactNative.rename
*/
const rename = callbackify(renameAsync)
/**
* Async version of {@link module:storageReactNative.writeFile}.
* @param {string} file
* @param {string} data
* @param {object} [options]
* @return {Promise<void>}
* @alias module:storageReactNative.writeFileAsync
* @async
* @see module:storageReactNative.writeFile
*/
const writeFileAsync = async (file, data, options) => {
// Options do not matter in react-native setup
try {
await AsyncStorage.setItem(file, data)
} catch (error) {
console.warn('An error happened while writing, skip')
}
}
/**
* Saves the item at given path
* @function
* @param {string} path
* @param {string} data
* @param {object} options
* @param {function} callback
* @alias module:storageReactNative.writeFile
*/
const writeFile = callbackify(writeFileAsync)
/**
* Async version of {@link module:storageReactNative.appendFile}.
* @function
* @param {string} filename
* @param {string} toAppend
* @param {object} [options]
* @return {Promise<void>}
* @alias module:storageReactNative.appendFileAsync
* @async
* @see module:storageReactNative.appendFile
*/
const appendFileAsync = async (filename, toAppend, options) => {
// Options do not matter in react-native setup
try {
const contents = (await AsyncStorage.getItem(filename)) || ''
await AsyncStorage.setItem(filename, contents + toAppend)
} catch (error) {
console.warn('An error happened while appending to the file, skip')
}
}
/**
* Append to the item at given path
* @function
* @param {string} filename
* @param {string} toAppend
* @param {object} [options]
* @param {function} callback
* @alias module:storageReactNative.appendFile
*/
const appendFile = callbackify(appendFileAsync)
/**
* Async version of {@link module:storageReactNative.readFile}.
* @function
* @param {string} filename
* @param {object} [options]
* @return {Promise<string>}
* @alias module:storageReactNative.readFileAsync
* @async
* @see module:storageReactNative.readFile
*/
const readFileAsync = async (filename, options) => {
try {
return (await AsyncStorage.getItem(filename)) || ''
} catch (error) {
console.warn('An error happened while reading, skip')
return ''
}
}
/**
* Read data at given path
* @function
* @param {string} filename
* @param {object} options
* @param {function} callback
* @alias module:storageReactNative.readFile
*/
const readFile = callbackify(readFileAsync)
/**
* Async version of {@link module:storageReactNative.unlink}.
* @function
* @param {string} filename
* @return {Promise<void>}
* @async
* @alias module:storageReactNative.unlinkAsync
* @see module:storageReactNative.unlink
*/
const unlinkAsync = async filename => {
try {
await AsyncStorage.removeItem(filename)
} catch (error) {
console.warn('An error happened while unlinking, skip')
}
}
/**
* Remove the data at given path
* @function
* @param {string} path
* @param {function} callback
* @alias module:storageReactNative.unlink
*/
const unlink = callbackify(unlinkAsync)
/**
* Shim for {@link module:storage.mkdirAsync}, nothing to do, no directories will be used on the react-native.
* @function
* @param {string} dir
* @param {object} [options]
* @return {Promise<void|string>}
* @alias module:storageReactNative.mkdirAsync
* @async
*/
const mkdirAsync = (dir, options) => Promise.resolve()
/**
* Shim for {@link module:storage.mkdir}, nothing to do, no directories will be used on the react-native.
* @function
* @param {string} path
* @param {object} options
* @param {function} callback
* @alias module:storageReactNative.mkdir
*/
const mkdir = callbackify(mkdirAsync)
/**
* Shim for {@link module:storage.ensureDatafileIntegrityAsync}, nothing to do, no data corruption possible in the react-native.
* @param {string} filename
* @return {Promise<void>}
* @alias module:storageReactNative.ensureDatafileIntegrityAsync
*/
const ensureDatafileIntegrityAsync = (filename) => Promise.resolve()
/**
* Shim for {@link module:storage.ensureDatafileIntegrity}, nothing to do, no data corruption possible in the react-native.
* @function
* @param {string} filename
* @param {NoParamCallback} callback signature: err
* @alias module:storageReactNative.ensureDatafileIntegrity
*/
const ensureDatafileIntegrity = callbackify(ensureDatafileIntegrityAsync)
/**
* Async version of {@link module:storageReactNative.crashSafeWriteFileLines}.
* @param {string} filename
* @param {string[]} lines
* @return {Promise<void>}
* @alias module:storageReactNative.crashSafeWriteFileLinesAsync
* @see module:storageReactNative.crashSafeWriteFileLines
*/
const crashSafeWriteFileLinesAsync = async (filename, lines) => {
lines.push('') // Add final new line
await writeFileAsync(filename, lines.join('\n'))
}
/**
* Fully write or rewrite the datafile, immune to crashes during the write operation (data will not be lost)
* @function
* @param {string} filename
* @param {string[]} lines
* @param {NoParamCallback} [callback] Optional callback, signature: err
* @alias module:storageReactNative.crashSafeWriteFileLines
*/
const crashSafeWriteFileLines = callbackify(crashSafeWriteFileLinesAsync)
/**
* Shim for {@link module:storage.ensureParentDirectoryExistsAsync}, nothing to do, no directories will be used in react-native.
* @function
* @param {string} file
* @param {number} [mode]
* @return {Promise<void|string>}
* @alias module:storageReactNative.ensureParentDirectoryExistsAsync
* @async
*/
const ensureParentDirectoryExistsAsync = async (file, mode) => Promise.resolve()
// Interface
module.exports.exists = exists
module.exports.existsAsync = existsAsync
module.exports.rename = rename
module.exports.renameAsync = renameAsync
module.exports.writeFile = writeFile
module.exports.writeFileAsync = writeFileAsync
module.exports.crashSafeWriteFileLines = crashSafeWriteFileLines
module.exports.crashSafeWriteFileLinesAsync = crashSafeWriteFileLinesAsync
module.exports.appendFile = appendFile
module.exports.appendFileAsync = appendFileAsync
module.exports.readFile = readFile
module.exports.readFileAsync = readFileAsync
module.exports.unlink = unlink
module.exports.unlinkAsync = unlinkAsync
module.exports.mkdir = mkdir
module.exports.mkdirAsync = mkdirAsync
module.exports.ensureDatafileIntegrity = ensureDatafileIntegrity
module.exports.ensureDatafileIntegrityAsync = ensureDatafileIntegrityAsync
module.exports.ensureParentDirectoryExistsAsync = ensureParentDirectoryExistsAsync

10155
node_modules/@seald-io/nedb/browser-version/out/nedb.js generated vendored Normal file

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

206
node_modules/@seald-io/nedb/index.d.ts generated vendored Normal file
View File

@ -0,0 +1,206 @@
// Type definitions for @seald-io/nedb 2.1.0
// Project: https://github.com/seald/nedb forked from https://github.com/louischatriot/nedb
// Definitions by: Timothée Rebours <https://github.com/tex0l>
// Mehdi Kouhen <https://github.com/arantes555>
// Stefan Steinhart <https://github.com/reppners>
// Anthony Nichols <https://github.com/anthonynichols>
// Alejandro Fernandez Haro <https://github.com/afharo>
// Pierre de la Martinière <https://github.com/martpie>
// TypeScript Version: 4.9
/// <reference types="node" />
import { EventEmitter } from "events";
export default Nedb;
export type Document<Schema> = Schema & {
_id: string;
};
declare class Nedb<Schema = Record<string, any>> extends EventEmitter {
constructor(pathOrOptions?: string | Nedb.DataStoreOptions);
persistence: Nedb.Persistence;
autoloadPromise: Promise<void> | null;
loadDatabase(callback?: (err: Error | null) => void): void;
loadDatabaseAsync(): Promise<void>;
dropDatabase(callback?: (err: Error | null) => void): void;
dropDatabaseAsync(): Promise<void>;
compactDatafile(callback?: (err: Error | null) => void): void;
compactDatafileAsync(): Promise<void>;
setAutocompactionInterval(interval: number): void;
stopAutocompaction(): void;
getAllData<T extends Schema>(): Document<T>[];
ensureIndex(
options: Nedb.EnsureIndexOptions,
callback?: (err: Error | null) => void
): void;
ensureIndexAsync(options: Nedb.EnsureIndexOptions): Promise<void>;
removeIndex(fieldName: string | string[], callback?: (err: Error | null) => void): void;
removeIndexAsync(fieldName: string | string[]): Promise<void>;
insert<T extends Schema>(
newDoc: T,
callback?: (err: Error | null, document: Document<T>) => void
): void;
insert<T extends Schema>(
newDocs: T[],
callback?: (err: Error | null, documents: Document<T>[]) => void
): void;
insertAsync<T extends Schema>(newDoc: T): Promise<Document<T>>;
insertAsync<T extends Schema>(newDocs: T[]): Promise<Document<T>[]>;
count(query: any, callback: (err: Error | null, n: number) => void): void;
count(query: any): Nedb.CursorCount;
countAsync(query: any): Nedb.Cursor<number>;
find<T extends Schema>(
query: any,
projection: any,
callback?: (err: Error | null, documents: Document<T>[]) => void
): void;
find<T extends Schema>(
query: any,
projection?: any
): Nedb.Cursor<T>;
find<T extends Schema>(
query: any,
callback: (err: Error | null, documents: Document<T>[]) => void
): void;
findAsync<T extends Schema>(
query: any,
projection?: any
): Nedb.Cursor<T[]>;
findOne<T extends Schema>(
query: any,
projection: any,
callback: (err: Error | null, document: Document<T>) => void
): void;
findOne<T extends Schema>(
query: any,
callback: (err: Error | null, document: Document<T>) => void
): void;
findOneAsync<T extends Schema>(
query: any,
projection?: any
): Nedb.Cursor<T>;
update<T extends Schema, O extends Nedb.UpdateOptions>(
query: any,
updateQuery: any,
options?: O,
callback?: (
err: Error | null,
numberOfUpdated: number,
affectedDocuments: O['returnUpdatedDocs'] extends true ? O['multi'] extends true ? Document<T>[] | null : Document<T> | null : null,
upsert: boolean | null
) => void
): void;
updateAsync<T extends Schema, O extends Nedb.UpdateOptions>(
query: any,
updateQuery: any,
options?: O
): Promise<{
numAffected: number;
affectedDocuments: O['returnUpdatedDocs'] extends true ? O['multi'] extends true ? Document<T>[] | null : Document<T> | null : null;
upsert: boolean;
}>;
remove(
query: any,
options: Nedb.RemoveOptions,
callback?: (err: Error | null, n: number) => void
): void;
remove(query: any, callback?: (err: Error | null, n: number) => void): void;
removeAsync(query: any, options: Nedb.RemoveOptions): Promise<number>;
addListener(event: "compaction.done", listener: () => void): this;
on(event: "compaction.done", listener: () => void): this;
once(event: "compaction.done", listener: () => void): this;
prependListener(event: "compaction.done", listener: () => void): this;
prependOnceListener(event: "compaction.done", listener: () => void): this;
removeListener(event: "compaction.done", listener: () => void): this;
off(event: "compaction.done", listener: () => void): this;
listeners(event: "compaction.done"): Array<() => void>;
rawListeners(event: "compaction.done"): Array<() => void>;
listenerCount(type: "compaction.done"): number;
}
declare namespace Nedb {
interface Cursor<T> extends Promise<Document<T>> {
sort(query: any): Cursor<T>;
skip(n: number): Cursor<T>;
limit(n: number): Cursor<T>;
projection(query: any): Cursor<T>;
exec(callback: (err: Error | null, documents: Document<T>[]) => void): void;
execAsync(): Promise<Document<T>>;
}
interface CursorCount {
exec(callback: (err: Error | null, count: number) => void): void;
}
interface DataStoreOptions {
filename?: string;
timestampData?: boolean;
inMemoryOnly?: boolean;
autoload?: boolean;
onload?(error: Error | null): any;
beforeDeserialization?(line: string): string;
afterSerialization?(line: string): string;
corruptAlertThreshold?: number;
compareStrings?(a: string, b: string): number;
modes?: { fileMode: number; dirMode: number };
testSerializationHooks?: boolean;
}
interface UpdateOptions {
multi?: boolean;
upsert?: boolean;
returnUpdatedDocs?: boolean;
}
interface RemoveOptions {
multi?: boolean;
}
interface EnsureIndexOptions {
fieldName: string | string[];
unique?: boolean;
sparse?: boolean;
expireAfterSeconds?: number;
}
interface Persistence {
/** @deprecated */
compactDatafile(): void;
/** @deprecated */
compactDatafileAsync(): Promise<void>;
/** @deprecated */
setAutocompactionInterval(interval: number): void;
/** @deprecated */
stopAutocompaction(): void;
}
}

3
node_modules/@seald-io/nedb/index.js generated vendored Executable file
View File

@ -0,0 +1,3 @@
const Datastore = require('./lib/datastore')
module.exports = Datastore

118
node_modules/@seald-io/nedb/lib/byline.js generated vendored Normal file
View File

@ -0,0 +1,118 @@
// Forked from https://github.com/jahewson/node-byline
// Copyright (C) 2011-2015 John Hewson
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to
// deal in the Software without restriction, including without limitation the
// rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
// sell copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
// FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
// IN THE SOFTWARE.
/**
* @module byline
* @private
*/
const stream = require('stream')
const timers = require('timers')
const { Buffer } = require('buffer')
const createLineStream = (readStream, options) => {
if (!readStream) throw new Error('expected readStream')
if (!readStream.readable) throw new Error('readStream must be readable')
const ls = new LineStream(options)
readStream.pipe(ls)
return ls
}
/**
* Fork from {@link https://github.com/jahewson/node-byline}.
* @see https://github.com/jahewson/node-byline
* @alias module:byline.LineStream
* @private
*/
class LineStream extends stream.Transform {
constructor (options) {
super(options)
options = options || {}
// use objectMode to stop the output from being buffered
// which re-concatenates the lines, just without newlines.
this._readableState.objectMode = true
this._lineBuffer = []
this._keepEmptyLines = options.keepEmptyLines || false
this._lastChunkEndedWithCR = false
// take the source's encoding if we don't have one
this.once('pipe', src => {
if (!this.encoding && src instanceof stream.Readable) this.encoding = src._readableState.encoding // but we can't do this for old-style streams
})
}
_transform (chunk, encoding, done) {
// decode binary chunks as UTF-8
encoding = encoding || 'utf8'
if (Buffer.isBuffer(chunk)) {
if (encoding === 'buffer') {
chunk = chunk.toString() // utf8
encoding = 'utf8'
} else chunk = chunk.toString(encoding)
}
this._chunkEncoding = encoding
// see: http://www.unicode.org/reports/tr18/#Line_Boundaries
const lines = chunk.split(/\r\n|[\n\v\f\r\x85\u2028\u2029]/g)
// don't split CRLF which spans chunks
if (this._lastChunkEndedWithCR && chunk[0] === '\n') lines.shift()
if (this._lineBuffer.length > 0) {
this._lineBuffer[this._lineBuffer.length - 1] += lines[0]
lines.shift()
}
this._lastChunkEndedWithCR = chunk[chunk.length - 1] === '\r'
this._lineBuffer = this._lineBuffer.concat(lines)
this._pushBuffer(encoding, 1, done)
}
_pushBuffer (encoding, keep, done) {
// always buffer the last (possibly partial) line
while (this._lineBuffer.length > keep) {
const line = this._lineBuffer.shift()
// skip empty lines
if (this._keepEmptyLines || line.length > 0) {
if (!this.push(this._reencode(line, encoding))) {
// when the high-water mark is reached, defer pushes until the next tick
timers.setImmediate(() => { this._pushBuffer(encoding, keep, done) })
return
}
}
}
done()
}
_flush (done) {
this._pushBuffer(this._chunkEncoding, 0, done)
}
// see Readable::push
_reencode (line, chunkEncoding) {
if (this.encoding && this.encoding !== chunkEncoding) return Buffer.from(line, chunkEncoding).toString(this.encoding)
else if (this.encoding) return line // this should be the most common case, i.e. we're using an encoded source stream
else return Buffer.from(line, chunkEncoding)
}
}
module.exports = createLineStream

250
node_modules/@seald-io/nedb/lib/cursor.js generated vendored Executable file
View File

@ -0,0 +1,250 @@
const model = require('./model.js')
const { callbackify } = require('util')
/**
* Has a callback
* @callback Cursor~mapFn
* @param {document[]} res
* @return {*|Promise<*>}
*/
/**
* Manage access to data, be it to find, update or remove it.
*
* It behaves like a `Promise` (it implements `then`, `catch` and `finally`) so that it is awaitable and its methods (which return `this`) are chainable.
* @extends Promise
*/
class Cursor {
/**
* Create a new cursor for this collection.
* @param {Datastore} db - The datastore this cursor is bound to
* @param {query} query - The query this cursor will operate on
* @param {Cursor~mapFn} [mapFn] - Handler to be executed after cursor has found the results and before the callback passed to find/findOne/update/remove
*/
constructor (db, query, mapFn) {
/**
* @protected
* @type {Datastore}
*/
this.db = db
/**
* @protected
* @type {query}
*/
this.query = query || {}
/**
* The handler to be executed after cursor has found the results.
* @type {Cursor~mapFn}
* @protected
*/
if (mapFn) this.mapFn = mapFn
/**
* @see Cursor#limit
* @type {undefined|number}
* @private
*/
this._limit = undefined
/**
* @see Cursor#skip
* @type {undefined|number}
* @private
*/
this._skip = undefined
/**
* @see Cursor#sort
* @type {undefined|Object.<string, number>}
* @private
*/
this._sort = undefined
/**
* @see Cursor#projection
* @type {undefined|Object.<string, number>}
* @private
*/
this._projection = undefined
}
/**
* Set a limit to the number of results for the given Cursor.
* @param {Number} limit
* @return {Cursor} the same instance of Cursor, (useful for chaining).
*/
limit (limit) {
this._limit = limit
return this
}
/**
* Skip a number of results for the given Cursor.
* @param {Number} skip
* @return {Cursor} the same instance of Cursor, (useful for chaining).
*/
skip (skip) {
this._skip = skip
return this
}
/**
* Sort results of the query for the given Cursor.
* @param {Object.<string, number>} sortQuery - sortQuery is { field: order }, field can use the dot-notation, order is 1 for ascending and -1 for descending
* @return {Cursor} the same instance of Cursor, (useful for chaining).
*/
sort (sortQuery) {
this._sort = sortQuery
return this
}
/**
* Add the use of a projection to the given Cursor.
* @param {Object.<string, number>} projection - MongoDB-style projection. {} means take all fields. Then it's { key1: 1, key2: 1 } to take only key1 and key2
* { key1: 0, key2: 0 } to omit only key1 and key2. Except _id, you can't mix takes and omits.
* @return {Cursor} the same instance of Cursor, (useful for chaining).
*/
projection (projection) {
this._projection = projection
return this
}
/**
* Apply the projection.
*
* This is an internal function. You should use {@link Cursor#execAsync} or {@link Cursor#exec}.
* @param {document[]} candidates
* @return {document[]}
* @private
*/
_project (candidates) {
const res = []
let action
if (this._projection === undefined || Object.keys(this._projection).length === 0) {
return candidates
}
const keepId = this._projection._id !== 0
const { _id, ...rest } = this._projection
this._projection = rest
// Check for consistency
const keys = Object.keys(this._projection)
keys.forEach(k => {
if (action !== undefined && this._projection[k] !== action) throw new Error('Can\'t both keep and omit fields except for _id')
action = this._projection[k]
})
// Do the actual projection
candidates.forEach(candidate => {
let toPush
if (action === 1) { // pick-type projection
toPush = { $set: {} }
keys.forEach(k => {
toPush.$set[k] = model.getDotValue(candidate, k)
if (toPush.$set[k] === undefined) delete toPush.$set[k]
})
toPush = model.modify({}, toPush)
} else { // omit-type projection
toPush = { $unset: {} }
keys.forEach(k => { toPush.$unset[k] = true })
toPush = model.modify(candidate, toPush)
}
if (keepId) toPush._id = candidate._id
else delete toPush._id
res.push(toPush)
})
return res
}
/**
* Get all matching elements
* Will return pointers to matched elements (shallow copies), returning full copies is the role of find or findOne
* This is an internal function, use execAsync which uses the executor
* @return {document[]|Promise<*>}
* @private
*/
async _execAsync () {
let res = []
let added = 0
let skipped = 0
const candidates = await this.db._getCandidatesAsync(this.query)
for (const candidate of candidates) {
if (model.match(candidate, this.query)) {
// If a sort is defined, wait for the results to be sorted before applying limit and skip
if (!this._sort) {
if (this._skip && this._skip > skipped) skipped += 1
else {
res.push(candidate)
added += 1
if (this._limit && this._limit <= added) break
}
} else res.push(candidate)
}
}
// Apply all sorts
if (this._sort) {
// Sorting
const criteria = Object.entries(this._sort).map(([key, direction]) => ({ key, direction }))
res.sort((a, b) => {
for (const criterion of criteria) {
const compare = criterion.direction * model.compareThings(model.getDotValue(a, criterion.key), model.getDotValue(b, criterion.key), this.db.compareStrings)
if (compare !== 0) return compare
}
return 0
})
// Applying limit and skip
const limit = this._limit || res.length
const skip = this._skip || 0
res = res.slice(skip, skip + limit)
}
// Apply projection
res = this._project(res)
if (this.mapFn) return this.mapFn(res)
return res
}
/**
* @callback Cursor~execCallback
* @param {Error} err
* @param {document[]|*} res If a mapFn was given to the Cursor, then the type of this parameter is the one returned by the mapFn.
*/
/**
* Callback version of {@link Cursor#exec}.
* @param {Cursor~execCallback} _callback
* @see Cursor#execAsync
*/
exec (_callback) {
callbackify(() => this.execAsync())(_callback)
}
/**
* Get all matching elements.
* Will return pointers to matched elements (shallow copies), returning full copies is the role of {@link Datastore#findAsync} or {@link Datastore#findOneAsync}.
* @return {Promise<document[]|*>}
* @async
*/
execAsync () {
return this.db.executor.pushAsync(() => this._execAsync())
}
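// Promise-like interface: awaiting a Cursor (or calling then/catch/finally) runs the query through execAsync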
then (onFulfilled, onRejected) {
return this.execAsync().then(onFulfilled, onRejected)
}
catch (onRejected) {
return this.execAsync().catch(onRejected)
}
finally (onFinally) {
return this.execAsync().finally(onFinally)
}
}
// Interface
module.exports = Cursor

26
node_modules/@seald-io/nedb/lib/customUtils.js generated vendored Executable file
View File

@ -0,0 +1,26 @@
/**
* Utility functions that need to be reimplemented for each environment.
* This is the version for Node.js
* @module customUtilsNode
* @private
*/
const crypto = require('crypto')
/**
* Return a random alphanumerical string of length len
* There is a very small probability (less than 1/1,000,000) for the length to be less than len
* (if the base64 conversion yields too many pluses and slashes) but
* that's not an issue here
* The probability of a collision is extremely small (need 3*10^12 documents to have one chance in a million of a collision)
* See http://en.wikipedia.org/wiki/Birthday_problem
* @param {number} len
* @return {string}
* @alias module:customUtilsNode.uid
*/
const uid = len => crypto.randomBytes(Math.ceil(Math.max(8, len * 2)))
.toString('base64')
.replace(/[+/]/g, '')
.slice(0, len)
// Interface
module.exports.uid = uid

1137
node_modules/@seald-io/nedb/lib/datastore.js generated vendored Executable file

File diff suppressed because it is too large

79
node_modules/@seald-io/nedb/lib/executor.js generated vendored Executable file
View File

@ -0,0 +1,79 @@
const Waterfall = require('./waterfall')
/**
* Executes operations sequentially.
* Has an option for a buffer that can be triggered afterwards.
* @private
*/
class Executor {
/**
* Instantiates a new Executor.
*/
constructor () {
/**
* If this.ready is `false`, then every task pushed will be buffered until this.processBuffer is called.
* @type {boolean}
* @private
*/
this.ready = false
/**
* The main queue
* @type {Waterfall}
* @private
*/
this.queue = new Waterfall()
/**
* The buffer queue
* @type {Waterfall}
* @private
*/
this.buffer = null
/**
* Method to trigger the buffer processing.
*
* Do not use directly, use `this.processBuffer` instead.
* @function
* @private
*/
this._triggerBuffer = null
this.resetBuffer()
}
/**
* If executor is ready, queue task (and process it immediately if executor was idle)
* If not, buffer task for later processing
* @param {AsyncFunction} task Function to execute
* @param {boolean} [forceQueuing = false] Optional (defaults to false) force executor to queue task even if it is not ready
* @return {Promise<*>}
* @async
* @see Executor#push
*/
pushAsync (task, forceQueuing = false) {
if (this.ready || forceQueuing) return this.queue.waterfall(task)()
else return this.buffer.waterfall(task)()
}
/**
* Queue all tasks in buffer (in the same order they came in)
* Automatically sets executor as ready
*/
processBuffer () {
this.ready = true
this._triggerBuffer()
this.queue.waterfall(() => this.buffer.guardian)
}
/**
* Removes all tasks queued up in the buffer
*/
resetBuffer () {
this.buffer = new Waterfall()
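// Chain a pending promise at the head of the new buffer: every buffered task waits for it, and _triggerBuffer resolves it (see processBuffer)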
this.buffer.chain(new Promise(resolve => {
this._triggerBuffer = resolve
}))
if (this.ready) this._triggerBuffer()
}
}
// Interface
module.exports = Executor

333
node_modules/@seald-io/nedb/lib/indexes.js generated vendored Executable file
View File

@ -0,0 +1,333 @@
const BinarySearchTree = require('@seald-io/binary-search-tree').AVLTree
const model = require('./model.js')
const { uniq, isDate } = require('./utils.js')
/**
* Two indexed pointers are equal if they point to the same place
* @param {*} a
* @param {*} b
* @return {boolean}
* @private
*/
const checkValueEquality = (a, b) => a === b
/**
* Type-aware projection
* @param {*} elt
* @return {string|*}
* @private
*/
const projectForUnique = elt => {
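// Tag each primitive with its type so that values of different types never collide in the unique check (e.g. the number 5 vs the string '5')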
if (elt === null) return '$null'
if (typeof elt === 'string') return '$string' + elt
if (typeof elt === 'boolean') return '$boolean' + elt
if (typeof elt === 'number') return '$number' + elt
if (isDate(elt)) return '$date' + elt.getTime()
return elt // Arrays and objects, will check for pointer equality
}
/**
* Indexes on field names, with atomic operations and which can optionally enforce a unique constraint or allow indexed
* fields to be undefined
* @private
*/
class Index {
/**
* Create a new index
* All methods on an index guarantee that either the whole operation was successful and the index changed
* or the operation was unsuccessful and an error is thrown while the index is unchanged
* @param {object} options
* @param {string} options.fieldName On which field should the index apply, can use dot notation to index on sub fields, can use comma-separated notation to use compound indexes
* @param {boolean} [options.unique = false] Enforces a unique constraint
* @param {boolean} [options.sparse = false] Allows a sparse index (we can have documents for which fieldName is `undefined`)
*/
constructor (options) {
/**
* On which field the index applies to, can use dot notation to index on sub fields, can use comma-separated notation to use compound indexes.
* @type {string}
*/
this.fieldName = options.fieldName
if (typeof this.fieldName !== 'string') throw new Error('fieldName must be a string')
/**
* Internal property which is an Array representing the fieldName split with `,`, useful only for compound indexes.
* @type {string[]}
* @private
*/
this._fields = this.fieldName.split(',')
/**
* Defines if the index enforces a unique constraint for this index.
* @type {boolean}
*/
this.unique = options.unique || false
/**
* Defines if we can have documents for which fieldName is `undefined`
* @type {boolean}
*/
this.sparse = options.sparse || false
/**
* Options object given to the underlying BinarySearchTree.
* @type {{unique: boolean, checkValueEquality: (function(*, *): boolean), compareKeys: ((function(*, *, compareStrings): (number|number))|*)}}
*/
this.treeOptions = { unique: this.unique, compareKeys: model.compareThings, checkValueEquality }
/**
* Underlying BinarySearchTree for this index. Uses an AVLTree for optimization.
* @type {AVLTree}
*/
this.tree = new BinarySearchTree(this.treeOptions)
}
/**
* Reset an index
* @param {?document|?document[]} [newData] Data to initialize the index with. If an error is thrown during
* insertion, the index is not modified.
*/
reset (newData) {
this.tree = new BinarySearchTree(this.treeOptions)
if (newData) this.insert(newData)
}
/**
* Insert a new document in the index
* If an array is passed, we insert all its elements (if one insertion fails the index is not modified)
* O(log(n))
* @param {document|document[]} doc The document, or array of documents, to insert.
*/
insert (doc) {
let keys
let failingIndex
let error
if (Array.isArray(doc)) {
this.insertMultipleDocs(doc)
return
}
const key = model.getDotValues(doc, this._fields)
// We don't index documents that don't contain the field if the index is sparse
if ((key === undefined || (typeof key === 'object' && key !== null && Object.values(key).every(el => el === undefined))) && this.sparse) return
if (!Array.isArray(key)) this.tree.insert(key, doc)
else {
// If an insert fails due to a unique constraint, roll back all inserts before it
keys = uniq(key, projectForUnique)
for (let i = 0; i < keys.length; i += 1) {
try {
this.tree.insert(keys[i], doc)
} catch (e) {
error = e
failingIndex = i
break
}
}
if (error) {
for (let i = 0; i < failingIndex; i += 1) {
this.tree.delete(keys[i], doc)
}
throw error
}
}
}
/**
* Insert an array of documents in the index
* If a constraint is violated, the changes should be rolled back and an error thrown
* @param {document[]} docs Array of documents to insert.
* @private
*/
insertMultipleDocs (docs) {
let error
let failingIndex
for (let i = 0; i < docs.length; i += 1) {
try {
this.insert(docs[i])
} catch (e) {
error = e
failingIndex = i
break
}
}
if (error) {
for (let i = 0; i < failingIndex; i += 1) {
this.remove(docs[i])
}
throw error
}
}
/**
* Removes a document from the index.
* If an array is passed, we remove all its elements
* The remove operation is safe with regards to the 'unique' constraint
* O(log(n))
* @param {document[]|document} doc The document, or Array of documents, to remove.
*/
remove (doc) {
if (Array.isArray(doc)) {
doc.forEach(d => { this.remove(d) })
return
}
const key = model.getDotValues(doc, this._fields)
if (key === undefined && this.sparse) return
if (!Array.isArray(key)) {
this.tree.delete(key, doc)
} else {
uniq(key, projectForUnique).forEach(_key => {
this.tree.delete(_key, doc)
})
}
}
/**
* Update a document in the index
* If a constraint is violated, changes are rolled back and an error thrown
* Naive implementation, still in O(log(n))
* @param {document|Array.<{oldDoc: document, newDoc: document}>} oldDoc Document to update, or an `Array` of
* `{oldDoc, newDoc}` pairs.
* @param {document} [newDoc] Document to replace the oldDoc with. If the first argument is an `Array` of
* `{oldDoc, newDoc}` pairs, this second argument is ignored.
*/
update (oldDoc, newDoc) {
if (Array.isArray(oldDoc)) {
this.updateMultipleDocs(oldDoc)
return
}
this.remove(oldDoc)
try {
this.insert(newDoc)
} catch (e) {
this.insert(oldDoc)
throw e
}
}
/**
* Update multiple documents in the index
* If a constraint is violated, the changes need to be rolled back
* and an error thrown
* @param {Array.<{oldDoc: document, newDoc: document}>} pairs
*
* @private
*/
updateMultipleDocs (pairs) {
let failingIndex
let error
for (let i = 0; i < pairs.length; i += 1) {
this.remove(pairs[i].oldDoc)
}
for (let i = 0; i < pairs.length; i += 1) {
try {
this.insert(pairs[i].newDoc)
} catch (e) {
error = e
failingIndex = i
break
}
}
// If an error was raised, roll back changes in the inverse order
if (error) {
for (let i = 0; i < failingIndex; i += 1) {
this.remove(pairs[i].newDoc)
}
for (let i = 0; i < pairs.length; i += 1) {
this.insert(pairs[i].oldDoc)
}
throw error
}
}
/**
* Revert an update
* @param {document|Array.<{oldDoc: document, newDoc: document}>} oldDoc Document to revert to, or an `Array` of `{oldDoc, newDoc}` pairs.
* @param {document} [newDoc] Document to revert from. If the first argument is an Array of {oldDoc, newDoc}, this second argument is ignored.
*/
revertUpdate (oldDoc, newDoc) {
const revert = []
if (!Array.isArray(oldDoc)) this.update(newDoc, oldDoc)
else {
oldDoc.forEach(pair => {
revert.push({ oldDoc: pair.newDoc, newDoc: pair.oldDoc })
})
this.update(revert)
}
}
/**
* Get all documents in index whose key match value (if it is a Thing) or one of the elements of value (if it is an array of Things)
* @param {Array.<*>|*} value Value to match the key against
* @return {document[]}
*/
getMatching (value) {
if (!Array.isArray(value)) return this.tree.search(value)
else {
const _res = {}
const res = []
value.forEach(v => {
this.getMatching(v).forEach(doc => {
_res[doc._id] = doc
})
})
Object.keys(_res).forEach(_id => {
res.push(_res[_id])
})
return res
}
}
/**
* Get all documents in index whose key is between the bounds as they are defined by the query
* Documents are sorted by key
* @param {object} query An object with at least one matcher among $gt, $gte, $lt, $lte.
* @param {*} [query.$gt] Greater than matcher.
* @param {*} [query.$gte] Greater than or equal matcher.
* @param {*} [query.$lt] Lower than matcher.
* @param {*} [query.$lte] Lower than or equal matcher.
* @return {document[]}
*/
getBetweenBounds (query) {
return this.tree.betweenBounds(query)
}
/**
* Get all elements in the index
* @return {document[]}
*/
getAll () {
const res = []
this.tree.executeOnEveryNode(node => {
res.push(...node.data)
})
return res
}
}
// Interface
module.exports = Index

827
node_modules/@seald-io/nedb/lib/model.js generated vendored Executable file
View File

@ -0,0 +1,827 @@
/**
* Handle models (i.e. docs)
* Serialization/deserialization
* Copying
* Querying, update
* @module model
* @private
*/
const { uniq, isDate, isRegExp } = require('./utils.js')
/**
* Check a key, throw an error if the key is not valid
* @param {string} k key
* @param {document} v value, needed to treat the Date edge case
* Non-treatable edge cases here: if part of the object is of the form { $$date: number } or { $$deleted: true }
* its serialized-then-deserialized version will be transformed into a Date object
* But you really need to want it to trigger such behaviour, even when warned not to use '$' at the beginning of the field names...
* @private
*/
const checkKey = (k, v) => {
if (typeof k === 'number') k = k.toString()
if (
k[0] === '$' &&
!(k === '$$date' && typeof v === 'number') &&
!(k === '$$deleted' && v === true) &&
!(k === '$$indexCreated') &&
!(k === '$$indexRemoved')
) throw new Error('Field names cannot begin with the $ character')
if (k.indexOf('.') !== -1) throw new Error('Field names cannot contain a .')
}
/**
* Check a DB object and throw an error if it's not valid
* Works by applying the above checkKey function to all fields recursively
* @param {document|document[]} obj
* @alias module:model.checkObject
*/
const checkObject = obj => {
if (Array.isArray(obj)) {
obj.forEach(o => {
checkObject(o)
})
}
if (typeof obj === 'object' && obj !== null) {
for (const k in obj) {
if (Object.prototype.hasOwnProperty.call(obj, k)) {
checkKey(k, obj[k])
checkObject(obj[k])
}
}
}
}
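/*
 * Illustrative sketch of which keys are rejected (hypothetical objects):
 *
 *   checkObject({ 'a.b': 1 })       // throws: field names cannot contain a .
 *   checkObject({ $set: { a: 1 } }) // throws: field names cannot begin with $
 *   checkObject({ a: { b: 1 } })    // passes: nesting without dots is fine
 */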
/**
* Serialize an object to be persisted to a one-line string
* For serialization/deserialization, we use the native JSON parser and not eval or Function
* That gives us less freedom but data entered in the database may come from users
* so eval and the like are not safe
* Accepted primitive types: Number, String, Boolean, Date, null
* Accepted secondary types: Objects, Arrays
* @param {document} obj
* @return {string}
* @alias module:model.serialize
*/
const serialize = obj => {
return JSON.stringify(obj, function (k, v) {
checkKey(k, v)
if (v === undefined) return undefined
if (v === null) return null
// Hackish way of checking if object is Date (this way it works between execution contexts in node-webkit).
// We can't use value directly because for dates it is already a string in this function (date.toJSON was already called), so we use this
if (typeof this[k].getTime === 'function') return { $$date: this[k].getTime() }
return v
})
}
/**
* From a one-line representation of an object generated by the serialize function
* Return the object itself
* @param {string} rawData
* @return {document}
* @alias module:model.deserialize
*/
const deserialize = rawData => JSON.parse(rawData, function (k, v) {
if (k === '$$date') return new Date(v)
if (
typeof v === 'string' ||
typeof v === 'number' ||
typeof v === 'boolean' ||
v === null
) return v
if (v && v.$$date) return v.$$date
return v
})
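/*
 * Illustrative round trip (hypothetical document): dates survive persistence
 * through the { $$date: <timestamp> } intermediate representation.
 *
 *   const line = serialize({ _id: '1', at: new Date(0) })
 *   // line === '{"_id":"1","at":{"$$date":0}}'
 *   deserialize(line).at.getTime() // => 0, restored as a genuine Date instance
 */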
/**
* Deep copy a DB object
* The optional strictKeys flag (defaulting to false) indicates whether to copy everything or only fields
* where the keys are valid, i.e. don't begin with $ and don't contain a .
* @param {?document} obj
* @param {boolean} [strictKeys=false]
* @return {?document}
* @alias module:model.deepCopy
*/
function deepCopy (obj, strictKeys) {
if (
typeof obj === 'boolean' ||
typeof obj === 'number' ||
typeof obj === 'string' ||
obj === null ||
(isDate(obj))
) return obj
if (Array.isArray(obj)) return obj.map(o => deepCopy(o, strictKeys))
if (typeof obj === 'object') {
const res = {}
for (const k in obj) {
if (
Object.prototype.hasOwnProperty.call(obj, k) &&
(!strictKeys || (k[0] !== '$' && k.indexOf('.') === -1))
) {
res[k] = deepCopy(obj[k], strictKeys)
}
}
return res
}
return undefined // For now everything else is undefined. We should probably throw an error instead
}
/**
* Tells if an object is a primitive type or a "real" object
* Arrays are considered primitive
* @param {*} obj
* @return {boolean}
* @alias module:model.isPrimitiveType
*/
const isPrimitiveType = obj => (
typeof obj === 'boolean' ||
typeof obj === 'number' ||
typeof obj === 'string' ||
obj === null ||
isDate(obj) ||
Array.isArray(obj)
)
/**
* Utility functions for comparing things
* Assumes type checking was already done (a and b already have the same type)
* compareNSB works for numbers, strings and booleans
* @param {number|string|boolean} a
* @param {number|string|boolean} b
* @return {number} 0 if a == b, 1 if a > b, -1 if a < b
* @private
*/
const compareNSB = (a, b) => {
if (a < b) return -1
if (a > b) return 1
return 0
}
/**
* Utility function for comparing arrays
* Assumes type checking was already done (a and b already have the same type)
* compareNSB works for numbers, strings and booleans
* @param {Array} a
* @param {Array} b
* @return {number} 0 if arrays have the same length and all elements equal one another. Else either 1 or -1.
* @private
*/
const compareArrays = (a, b) => {
const minLength = Math.min(a.length, b.length)
for (let i = 0; i < minLength; i += 1) {
const comp = compareThings(a[i], b[i])
if (comp !== 0) return comp
}
// Common section was identical, longest one wins
return compareNSB(a.length, b.length)
}
/**
* Compare { things U undefined }
* Things are defined as any native types (string, number, boolean, null, date) and objects
* We need to compare with undefined as it will be used in indexes
* In the case of objects and arrays, we deep-compare
* If two things don't have the same type, the (arbitrary) type hierarchy is: undefined, null, numbers, strings, booleans, dates, arrays, objects
* Return -1 if a < b, 1 if a > b and 0 if a = b (note that equality here is NOT the same as defined in areThingsEqual!)
* @param {*} a
* @param {*} b
* @param {compareStrings} [_compareStrings] String comparing function, returning -1, 0 or 1, overriding default string comparison (useful for languages with accented letters)
* @return {number}
* @alias module:model.compareThings
*/
const compareThings = (a, b, _compareStrings) => {
const compareStrings = _compareStrings || compareNSB
// undefined
if (a === undefined) return b === undefined ? 0 : -1
if (b === undefined) return 1 // no need to test if a === undefined
// null
if (a === null) return b === null ? 0 : -1
if (b === null) return 1 // no need to test if a === null
// Numbers
if (typeof a === 'number') return typeof b === 'number' ? compareNSB(a, b) : -1
if (typeof b === 'number') return typeof a === 'number' ? compareNSB(a, b) : 1
// Strings
if (typeof a === 'string') return typeof b === 'string' ? compareStrings(a, b) : -1
if (typeof b === 'string') return typeof a === 'string' ? compareStrings(a, b) : 1
// Booleans
if (typeof a === 'boolean') return typeof b === 'boolean' ? compareNSB(a, b) : -1
if (typeof b === 'boolean') return typeof a === 'boolean' ? compareNSB(a, b) : 1
// Dates
if (isDate(a)) return isDate(b) ? compareNSB(a.getTime(), b.getTime()) : -1
if (isDate(b)) return isDate(a) ? compareNSB(a.getTime(), b.getTime()) : 1
// Arrays (first element is most significant and so on)
if (Array.isArray(a)) return Array.isArray(b) ? compareArrays(a, b) : -1
if (Array.isArray(b)) return Array.isArray(a) ? compareArrays(a, b) : 1
// Objects
const aKeys = Object.keys(a).sort()
const bKeys = Object.keys(b).sort()
for (let i = 0; i < Math.min(aKeys.length, bKeys.length); i += 1) {
const comp = compareThings(a[aKeys[i]], b[bKeys[i]])
if (comp !== 0) return comp
}
return compareNSB(aKeys.length, bKeys.length)
}
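/*
 * Illustrative sketch of the resulting order (type hierarchy first, then
 * value comparison within a type):
 *
 *   compareThings(undefined, null)    // => -1 (undefined sorts first)
 *   compareThings(3, 'a')             // => -1 (numbers before strings)
 *   compareThings([1, 2], [1, 3])     // => -1 (element-wise, then by length)
 *   compareThings({ a: 1 }, { a: 1 }) // => 0  (deep comparison on sorted keys)
 */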
// ==============================================================
// Updating documents
// ==============================================================
/**
* @callback modifierFunction
* The signature of modifier functions is as follows
* Their structure is always the same: recursively follow the dot notation while creating
* the nested documents if needed, then apply the "last step modifier"
* @param {Object} obj The model to modify
* @param {String} field Can contain dots; in that case we set a subfield recursively
* @param {document} value
*/
/**
* Create the complete modifier function
* @param {modifierFunction} lastStepModifierFunction a lastStepModifierFunction
* @param {boolean} [unset = false] Bad-looking specific fix, needs to be generalized once modifiers that behave like $unset are implemented
* @return {modifierFunction}
* @private
*/
const createModifierFunction = (lastStepModifierFunction, unset = false) => (obj, field, value) => {
const func = (obj, field, value) => {
const fieldParts = typeof field === 'string' ? field.split('.') : field
if (fieldParts.length === 1) lastStepModifierFunction(obj, field, value)
else {
if (obj[fieldParts[0]] === undefined) {
if (unset) return
obj[fieldParts[0]] = {}
}
func(obj[fieldParts[0]], fieldParts.slice(1), value)
}
}
return func(obj, field, value)
}
const $addToSetPartial = (obj, field, value) => {
// Create the array if it doesn't exist
if (!Object.prototype.hasOwnProperty.call(obj, field)) { obj[field] = [] }
if (!Array.isArray(obj[field])) throw new Error('Can\'t $addToSet an element on non-array values')
if (value !== null && typeof value === 'object' && value.$each) {
if (Object.keys(value).length > 1) throw new Error('Can\'t use another field in conjunction with $each')
if (!Array.isArray(value.$each)) throw new Error('$each requires an array value')
value.$each.forEach(v => {
$addToSetPartial(obj, field, v)
})
} else {
let addToSet = true
obj[field].forEach(v => {
if (compareThings(v, value) === 0) addToSet = false
})
if (addToSet) obj[field].push(value)
}
}
/**
* @enum {modifierFunction}
*/
const modifierFunctions = {
/**
* Set a field to a new value
*/
$set: createModifierFunction((obj, field, value) => {
obj[field] = value
}),
/**
* Unset a field
*/
$unset: createModifierFunction((obj, field, value) => {
delete obj[field]
}, true),
/**
* Updates the value of the field, only if the specified value is smaller than the current value of the field
*/
$min: createModifierFunction((obj, field, value) => {
if (typeof obj[field] === 'undefined') obj[field] = value
else if (value < obj[field]) obj[field] = value
}),
/**
* Updates the value of the field, only if the specified value is greater than the current value of the field
*/
$max: createModifierFunction((obj, field, value) => {
if (typeof obj[field] === 'undefined') obj[field] = value
else if (value > obj[field]) obj[field] = value
}),
/**
* Increment a numeric field's value
*/
$inc: createModifierFunction((obj, field, value) => {
if (typeof value !== 'number') throw new Error(`${value} must be a number`)
if (typeof obj[field] !== 'number') {
if (!Object.prototype.hasOwnProperty.call(obj, field)) obj[field] = value
else throw new Error('Don\'t use the $inc modifier on non-number fields')
} else obj[field] += value
}),
/**
* Removes all instances of a value from an existing array
*/
$pull: createModifierFunction((obj, field, value) => {
if (!Array.isArray(obj[field])) throw new Error('Can\'t $pull an element from non-array values')
const arr = obj[field]
for (let i = arr.length - 1; i >= 0; i -= 1) {
if (match(arr[i], value)) arr.splice(i, 1)
}
}),
/**
* Remove the first or last element of an array
*/
$pop: createModifierFunction((obj, field, value) => {
if (!Array.isArray(obj[field])) throw new Error('Can\'t $pop an element from non-array values')
if (typeof value !== 'number') throw new Error(`${value} isn't an integer, can't use it with $pop`)
if (value === 0) return
if (value > 0) obj[field] = obj[field].slice(0, obj[field].length - 1)
else obj[field] = obj[field].slice(1)
}),
/**
* Add an element to an array field only if it is not already in it
* No modification if the element is already in the array
* Note that it doesn't check whether the original array contains duplicates
*/
$addToSet: createModifierFunction($addToSetPartial),
/**
* Push an element to the end of an array field
* Optional modifier $each instead of value to push several values
* Optional modifier $slice to slice the resulting array, see https://docs.mongodb.org/manual/reference/operator/update/slice/
* Difference with MongoDB: if $slice is specified and not $each, we act as if value is an empty array
*/
$push: createModifierFunction((obj, field, value) => {
// Create the array if it doesn't exist
if (!Object.prototype.hasOwnProperty.call(obj, field)) obj[field] = []
if (!Array.isArray(obj[field])) throw new Error('Can\'t $push an element on non-array values')
if (
value !== null &&
typeof value === 'object' &&
value.$slice &&
value.$each === undefined
) value.$each = []
if (value !== null && typeof value === 'object' && value.$each) {
if (
Object.keys(value).length >= 3 ||
(Object.keys(value).length === 2 && value.$slice === undefined)
) throw new Error('Can only use $slice in conjunction with $each when $push to array')
if (!Array.isArray(value.$each)) throw new Error('$each requires an array value')
value.$each.forEach(v => {
obj[field].push(v)
})
if (value.$slice === undefined || typeof value.$slice !== 'number') return
if (value.$slice === 0) obj[field] = []
else {
let start
let end
const n = obj[field].length
if (value.$slice < 0) {
start = Math.max(0, n + value.$slice)
end = n
} else if (value.$slice > 0) {
start = 0
end = Math.min(n, value.$slice)
}
obj[field] = obj[field].slice(start, end)
}
} else {
obj[field].push(value)
}
})
}
/**
* Modify a DB object according to an update query
* @param {document} obj
* @param {query} updateQuery
* @return {document}
* @alias module:model.modify
*/
const modify = (obj, updateQuery) => {
const keys = Object.keys(updateQuery)
const firstChars = keys.map(item => item[0])
const dollarFirstChars = firstChars.filter(c => c === '$')
let newDoc
let modifiers
if (keys.indexOf('_id') !== -1 && updateQuery._id !== obj._id) throw new Error('You cannot change a document\'s _id')
if (dollarFirstChars.length !== 0 && dollarFirstChars.length !== firstChars.length) throw new Error('You cannot mix modifiers and normal fields')
if (dollarFirstChars.length === 0) {
// Simply replace the object with the update query contents
newDoc = deepCopy(updateQuery)
newDoc._id = obj._id
} else {
// Apply modifiers
modifiers = uniq(keys)
newDoc = deepCopy(obj)
modifiers.forEach(m => {
if (!modifierFunctions[m]) throw new Error(`Unknown modifier ${m}`)
// Can't rely on Object.keys throwing on non objects since ES6
// Not 100% satisfying as non objects can be interpreted as objects but no false negatives so we can live with it
if (typeof updateQuery[m] !== 'object') throw new Error(`Modifier ${m}'s argument must be an object`)
const keys = Object.keys(updateQuery[m])
keys.forEach(k => {
modifierFunctions[m](newDoc, k, updateQuery[m][k])
})
})
}
// Check result is valid and return it
checkObject(newDoc)
if (obj._id !== newDoc._id) throw new Error('You can\'t change a document\'s _id')
return newDoc
}
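/*
 * Illustrative sketch (hypothetical documents):
 *
 *   modify({ _id: '1', n: 1 }, { $inc: { n: 2 } })              // => { _id: '1', n: 3 }
 *   modify({ _id: '1', tags: ['a'] }, { $push: { tags: 'b' } }) // => { _id: '1', tags: ['a', 'b'] }
 *   modify({ _id: '1', n: 1 }, { name: 'x' })                   // => { name: 'x', _id: '1' } (full replacement)
 */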
// ==============================================================
// Finding documents
// ==============================================================
/**
* Get a value from object with dot notation
* @param {object} obj
* @param {string} field
* @return {*}
* @alias module:model.getDotValue
*/
const getDotValue = (obj, field) => {
const fieldParts = typeof field === 'string' ? field.split('.') : field
if (!obj) return undefined // field cannot be empty, so return undefined so that nothing can match
if (fieldParts.length === 0) return obj
if (fieldParts.length === 1) return obj[fieldParts[0]]
if (Array.isArray(obj[fieldParts[0]])) {
// If the next field is an integer, return only this item of the array
const i = parseInt(fieldParts[1], 10)
if (typeof i === 'number' && !isNaN(i)) return getDotValue(obj[fieldParts[0]][i], fieldParts.slice(2))
// Return the array of values
return obj[fieldParts[0]].map(el => getDotValue(el, fieldParts.slice(1)))
} else return getDotValue(obj[fieldParts[0]], fieldParts.slice(1))
}
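/*
 * Illustrative sketch (hypothetical document):
 *
 *   const doc = { a: { b: 2 }, list: [{ n: 1 }, { n: 2 }] }
 *   getDotValue(doc, 'a.b')      // => 2
 *   getDotValue(doc, 'list.n')   // => [1, 2] (mapped over the array)
 *   getDotValue(doc, 'list.0.n') // => 1 (an integer picks one element)
 */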
/**
* Get dot values for either a bunch of fields or just one.
*/
const getDotValues = (obj, fields) => {
if (!Array.isArray(fields)) throw new Error('fields must be an Array')
if (fields.length > 1) {
const key = {}
for (const field of fields) {
key[field] = getDotValue(obj, field)
}
return key
} else return getDotValue(obj, fields[0])
}
/**
* Check whether 'things' are equal
* Things are defined as any native types (string, number, boolean, null, date) and objects
* In the case of object, we check deep equality
* Returns true if they are, false otherwise
* @param {*} a
* @param {*} b
* @return {boolean}
* @alias module:model.areThingsEqual
*/
const areThingsEqual = (a, b) => {
// Strings, booleans, numbers, null
if (
a === null ||
typeof a === 'string' ||
typeof a === 'boolean' ||
typeof a === 'number' ||
b === null ||
typeof b === 'string' ||
typeof b === 'boolean' ||
typeof b === 'number'
) return a === b
// Dates
if (isDate(a) || isDate(b)) return isDate(a) && isDate(b) && a.getTime() === b.getTime()
// Arrays (no match since arrays are used as a $in)
// undefined (no match since they mean field doesn't exist and can't be serialized)
if (
(!(Array.isArray(a) && Array.isArray(b)) && (Array.isArray(a) || Array.isArray(b))) ||
a === undefined || b === undefined
) return false
// General objects (check for deep equality)
// a and b should be objects at this point
let aKeys
let bKeys
try {
aKeys = Object.keys(a)
bKeys = Object.keys(b)
} catch (e) {
return false
}
if (aKeys.length !== bKeys.length) return false
for (const el of aKeys) {
if (bKeys.indexOf(el) === -1) return false
if (!areThingsEqual(a[el], b[el])) return false
}
return true
}
/**
* Check that two values are comparable
* @param {*} a
* @param {*} b
* @return {boolean}
* @private
*/
const areComparable = (a, b) => {
if (
typeof a !== 'string' &&
typeof a !== 'number' &&
!isDate(a) &&
typeof b !== 'string' &&
typeof b !== 'number' &&
!isDate(b)
) return false
if (typeof a !== typeof b) return false
return true
}
/**
* @callback comparisonOperator
* Arithmetic and comparison operators
* @param {*} a Value in the object
* @param {*} b Value in the query
* @return {boolean}
*/
/**
* @enum {comparisonOperator}
*/
const comparisonFunctions = {
/** Lower than */
$lt: (a, b) => areComparable(a, b) && a < b,
/** Lower than or equals */
$lte: (a, b) => areComparable(a, b) && a <= b,
/** Greater than */
$gt: (a, b) => areComparable(a, b) && a > b,
/** Greater than or equals */
$gte: (a, b) => areComparable(a, b) && a >= b,
/** Does not equal */
$ne: (a, b) => a === undefined || !areThingsEqual(a, b),
/** Is in Array */
$in: (a, b) => {
if (!Array.isArray(b)) throw new Error('$in operator called with a non-array')
for (const el of b) {
if (areThingsEqual(a, el)) return true
}
return false
},
/** Is not in Array */
$nin: (a, b) => {
if (!Array.isArray(b)) throw new Error('$nin operator called with a non-array')
return !comparisonFunctions.$in(a, b)
},
/** Matches Regexp */
$regex: (a, b) => {
if (!isRegExp(b)) throw new Error('$regex operator called with non regular expression')
if (typeof a !== 'string') return false
else return b.test(a)
},
/** Returns true if field exists */
$exists: (a, b) => {
// This will be true for all values of b except false, null, undefined and 0
// That's strange behaviour (we should only use true/false) but that's the way Mongo does it...
if (b || b === '') b = true
else b = false
if (a === undefined) return !b
else return b
},
/** Specific to Arrays, returns true if a length equals b */
$size: (a, b) => {
if (!Array.isArray(a)) return false
if (b % 1 !== 0) throw new Error('$size operator called without an integer')
return a.length === b
},
/** Specific to Arrays, returns true if some elements of a match the query b */
$elemMatch: (a, b) => {
if (!Array.isArray(a)) return false
return a.some(el => match(el, b))
}
}
const arrayComparisonFunctions = { $size: true, $elemMatch: true }
/**
* @enum
*/
const logicalOperators = {
/**
* Match any of the subqueries
* @param {document} obj
* @param {query[]} query
* @return {boolean}
*/
$or: (obj, query) => {
if (!Array.isArray(query)) throw new Error('$or operator used without an array')
for (let i = 0; i < query.length; i += 1) {
if (match(obj, query[i])) return true
}
return false
},
/**
* Match all of the subqueries
* @param {document} obj
* @param {query[]} query
* @return {boolean}
*/
$and: (obj, query) => {
if (!Array.isArray(query)) throw new Error('$and operator used without an array')
for (let i = 0; i < query.length; i += 1) {
if (!match(obj, query[i])) return false
}
return true
},
/**
* Inverted match of the query
* @param {document} obj
* @param {query} query
* @return {boolean}
*/
$not: (obj, query) => !match(obj, query),
/**
* @callback whereCallback
* @param {document} obj
* @return {boolean}
*/
/**
* Use a function to match
* @param {document} obj
* @param {whereCallback} fn
* @return {boolean}
*/
$where: (obj, fn) => {
if (typeof fn !== 'function') throw new Error('$where operator used without a function')
const result = fn.call(obj)
if (typeof result !== 'boolean') throw new Error('$where function must return boolean')
return result
}
}
/**
* Tell if a given document matches a query
* @param {document} obj Document to check
* @param {query} query
* @return {boolean}
* @alias module:model.match
*/
const match = (obj, query) => {
// Primitive query against a primitive type
// This is a bit of a hack since we construct an object with an arbitrary key only to dereference it later
// But I don't have time for a cleaner implementation now
if (isPrimitiveType(obj) || isPrimitiveType(query)) return matchQueryPart({ needAKey: obj }, 'needAKey', query)
// Normal query
for (const queryKey in query) {
if (Object.prototype.hasOwnProperty.call(query, queryKey)) {
const queryValue = query[queryKey]
if (queryKey[0] === '$') {
if (!logicalOperators[queryKey]) throw new Error(`Unknown logical operator ${queryKey}`)
if (!logicalOperators[queryKey](obj, queryValue)) return false
} else if (!matchQueryPart(obj, queryKey, queryValue)) return false
}
}
return true
}
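/*
 * Illustrative sketch (hypothetical documents and queries):
 *
 *   match({ age: 25 }, { age: { $gte: 18 } })      // => true
 *   match({ tags: ['a', 'b'] }, { tags: 'a' })     // => true (any array element may match)
 *   match({ name: 'nedb' }, { name: /db$/ })       // => true ($regex shorthand)
 *   match({ a: 1 }, { $or: [{ a: 2 }, { a: 1 }] }) // => true
 */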
/**
* Match an object against a specific { key: value } part of a query
* if the treatObjAsValue flag is set, don't try to match every element separately, but match the array as a whole
* @param {object} obj
* @param {string} queryKey
* @param {*} queryValue
* @param {boolean} [treatObjAsValue=false]
* @return {boolean}
* @private
*/
function matchQueryPart (obj, queryKey, queryValue, treatObjAsValue) {
const objValue = getDotValue(obj, queryKey)
// Check if the value is an array if we don't force a treatment as value
if (Array.isArray(objValue) && !treatObjAsValue) {
// If the queryValue is an array, try to perform an exact match
if (Array.isArray(queryValue)) return matchQueryPart(obj, queryKey, queryValue, true)
// Check if we are using an array-specific comparison function
if (queryValue !== null && typeof queryValue === 'object' && !isRegExp(queryValue)) {
for (const key in queryValue) {
if (Object.prototype.hasOwnProperty.call(queryValue, key) && arrayComparisonFunctions[key]) { return matchQueryPart(obj, queryKey, queryValue, true) }
}
}
// If not, treat it as an array of { obj, query } where there needs to be at least one match
for (const el of objValue) {
if (matchQueryPart({ k: el }, 'k', queryValue)) return true // k here could be any string
}
return false
}
// queryValue is an actual object. Determine whether it contains comparison operators
// or only normal fields. Mixed objects are not allowed
if (queryValue !== null && typeof queryValue === 'object' && !isRegExp(queryValue) && !Array.isArray(queryValue)) {
const keys = Object.keys(queryValue)
const firstChars = keys.map(item => item[0])
const dollarFirstChars = firstChars.filter(c => c === '$')
if (dollarFirstChars.length !== 0 && dollarFirstChars.length !== firstChars.length) throw new Error('You cannot mix operators and normal fields')
// queryValue is an object of this form: { $comparisonOperator1: value1, ... }
if (dollarFirstChars.length > 0) {
for (const key of keys) {
if (!comparisonFunctions[key]) throw new Error(`Unknown comparison function ${key}`)
if (!comparisonFunctions[key](objValue, queryValue[key])) return false
}
return true
}
}
// Using regular expressions with basic querying
if (isRegExp(queryValue)) return comparisonFunctions.$regex(objValue, queryValue)
// queryValue is either a native value or a normal object
// Basic matching is possible
return areThingsEqual(objValue, queryValue)
}
// Interface
module.exports.serialize = serialize
module.exports.deserialize = deserialize
module.exports.deepCopy = deepCopy
module.exports.checkObject = checkObject
module.exports.isPrimitiveType = isPrimitiveType
module.exports.modify = modify
module.exports.getDotValue = getDotValue
module.exports.getDotValues = getDotValues
module.exports.match = match
module.exports.areThingsEqual = areThingsEqual
module.exports.compareThings = compareThings

380
node_modules/@seald-io/nedb/lib/persistence.js generated vendored Executable file
View File

@ -0,0 +1,380 @@
const { deprecate } = require('util')
const byline = require('./byline')
const customUtils = require('./customUtils.js')
const Index = require('./indexes.js')
const model = require('./model.js')
const storage = require('./storage.js')
const DEFAULT_DIR_MODE = 0o755
const DEFAULT_FILE_MODE = 0o644
/**
* Under the hood, NeDB's persistence uses an append-only format, meaning that all
* updates and deletes actually result in lines added at the end of the datafile,
* for performance reasons. The database is automatically compacted (i.e. put back
* in the one-line-per-document format) every time you load each database within
* your application.
*
* Persistence handles the compaction exposed in the Datastore {@link Datastore#compactDatafileAsync},
* {@link Datastore#setAutocompactionInterval}.
*
* Since version 3.0.0, using {@link Datastore.persistence} methods manually is deprecated.
*
* Compaction takes a bit of time (not too much: 130ms for 50k
* records on a typical development machine) and no other operation can happen when
* it does, so most projects actually don't need to use it.
*
* Compaction will also immediately remove any documents whose data line has become
* corrupted, assuming that the total percentage of all corrupted documents in that
* database still falls below the specified `corruptAlertThreshold` option's value.
*
* Durability works similarly to major databases: compaction forces the OS to
* physically flush data to disk, while appends to the data file do not (the OS is
* responsible for flushing the data). That guarantees that a server crash can
* never cause complete data loss, while preserving performance. The worst that can
* happen is a crash between two syncs, causing a loss of all data between the two
* syncs. Usually syncs are 30 seconds apart, so that's at most 30 seconds of
* data. [This post by Antirez on Redis persistence](http://oldblog.antirez.com/post/redis-persistence-demystified.html)
* explains this in more detail; NeDB is very close to Redis AOF persistence
* with `appendfsync` option set to `no`.
*/
class Persistence {
/**
* Create a new Persistence object for database options.db
* @param {Datastore} options.db
* @param {Number} [options.corruptAlertThreshold] Optional, threshold after which an alert is thrown if too much data is corrupt
* @param {serializationHook} [options.afterSerialization] Hook you can use to transform data after it was serialized and before it is written to disk.
* @param {serializationHook} [options.beforeDeserialization] Inverse of `afterSerialization`: transforms data read from disk before it is deserialized.
* @param {object} [options.modes] Modes to use for FS permissions. Will not work on Windows.
* @param {number} [options.modes.fileMode=0o644] Mode to use for files.
* @param {number} [options.modes.dirMode=0o755] Mode to use for directories.
* @param {boolean} [options.testSerializationHooks=true] Whether to test the serialization hooks or not, might be CPU-intensive
*/
constructor (options) {
this.db = options.db
this.inMemoryOnly = this.db.inMemoryOnly
this.filename = this.db.filename
this.corruptAlertThreshold = options.corruptAlertThreshold !== undefined ? options.corruptAlertThreshold : 0.1
this.modes = options.modes !== undefined ? options.modes : { fileMode: DEFAULT_FILE_MODE, dirMode: DEFAULT_DIR_MODE }
if (this.modes.fileMode === undefined) this.modes.fileMode = DEFAULT_FILE_MODE
if (this.modes.dirMode === undefined) this.modes.dirMode = DEFAULT_DIR_MODE
if (
!this.inMemoryOnly &&
this.filename &&
this.filename.charAt(this.filename.length - 1) === '~'
) throw new Error('The datafile name can\'t end with a ~, which is reserved for crash safe backup files')
// After serialization and before deserialization hooks with some basic sanity checks
if (
options.afterSerialization &&
!options.beforeDeserialization
) throw new Error('Serialization hook defined but deserialization hook undefined, cautiously refusing to start NeDB to prevent dataloss')
if (
!options.afterSerialization &&
options.beforeDeserialization
) throw new Error('Serialization hook undefined but deserialization hook defined, cautiously refusing to start NeDB to prevent dataloss')
this.afterSerialization = options.afterSerialization || (s => s)
this.beforeDeserialization = options.beforeDeserialization || (s => s)
if (options.testSerializationHooks === undefined || options.testSerializationHooks) {
for (let i = 1; i < 30; i += 1) {
for (let j = 0; j < 10; j += 1) {
const randomString = customUtils.uid(i)
if (this.beforeDeserialization(this.afterSerialization(randomString)) !== randomString) {
throw new Error('beforeDeserialization is not the reverse of afterSerialization, cautiously refusing to start NeDB to prevent dataloss')
}
}
}
}
}
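/*
 * Illustrative sketch: Persistence is normally configured through the public
 * Datastore constructor, which forwards these options. The base64 hooks below
 * are a hypothetical (and reversible) transformation.
 *
 *   const Datastore = require('@seald-io/nedb')
 *   const db = new Datastore({
 *     filename: 'data.db',
 *     afterSerialization: s => Buffer.from(s, 'utf8').toString('base64'),
 *     beforeDeserialization: s => Buffer.from(s, 'base64').toString('utf8'),
 *     corruptAlertThreshold: 0 // refuse to load if any line is corrupt
 *   })
 */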
/**
* Internal version of {@link Datastore#compactDatafileAsync} which does not use the {@link Datastore#executor}, use the latter instead.
* @return {Promise<void>}
* @private
*/
async persistCachedDatabaseAsync () {
const lines = []
if (this.inMemoryOnly) return
this.db.getAllData().forEach(doc => {
lines.push(this.afterSerialization(model.serialize(doc)))
})
Object.keys(this.db.indexes).forEach(fieldName => {
if (fieldName !== '_id') { // The special _id index is managed by datastore.js, the others need to be persisted
lines.push(this.afterSerialization(model.serialize({
$$indexCreated: {
fieldName: this.db.indexes[fieldName].fieldName,
unique: this.db.indexes[fieldName].unique,
sparse: this.db.indexes[fieldName].sparse
}
})))
}
})
await storage.crashSafeWriteFileLinesAsync(this.filename, lines, this.modes)
this.db.emit('compaction.done')
}
/**
* @see Datastore#compactDatafile
* @deprecated
* @param {NoParamCallback} [callback = () => {}]
* @see Persistence#compactDatafileAsync
*/
compactDatafile (callback) {
deprecate(_callback => this.db.compactDatafile(_callback), '@seald-io/nedb: calling Datastore#persistence#compactDatafile is deprecated, please use Datastore#compactDatafile, it will be removed in the next major version.')(callback)
}
/**
* @see Datastore#setAutocompactionInterval
* @deprecated
*/
setAutocompactionInterval (interval) {
deprecate(_interval => this.db.setAutocompactionInterval(_interval), '@seald-io/nedb: calling Datastore#persistence#setAutocompactionInterval is deprecated, please use Datastore#setAutocompactionInterval, it will be removed in the next major version.')(interval)
}
/**
* @see Datastore#stopAutocompaction
* @deprecated
*/
stopAutocompaction () {
deprecate(() => this.db.stopAutocompaction(), '@seald-io/nedb: calling Datastore#persistence#stopAutocompaction is deprecated, please use Datastore#stopAutocompaction, it will be removed in the next major version.')()
}
/**
* Persist new state for the given newDocs (can be insertion, update or removal)
* Use an append-only format
*
* Do not use directly, it should only be used by a {@link Datastore} instance.
* @param {document[]} newDocs Can be empty if no doc was updated/removed
* @return {Promise}
* @private
*/
async persistNewStateAsync (newDocs) {
let toPersist = ''
// In-memory only datastore
if (this.inMemoryOnly) return
newDocs.forEach(doc => {
toPersist += this.afterSerialization(model.serialize(doc)) + '\n'
})
if (toPersist.length === 0) return
await storage.appendFileAsync(this.filename, toPersist, { encoding: 'utf8', mode: this.modes.fileMode })
}
/**
* @typedef rawIndex
* @property {string} fieldName
* @property {boolean} [unique]
* @property {boolean} [sparse]
*/
/**
* From a database's raw data, return the corresponding machine-understandable collection.
*
* Do not use directly, it should only be used by a {@link Datastore} instance.
* @param {string} rawData database file
* @return {{data: document[], indexes: Object.<string, rawIndex>}}
* @private
*/
treatRawData (rawData) {
const data = rawData.split('\n')
const dataById = {}
const indexes = {}
let dataLength = data.length
// Last line of every data file is usually blank so not really corrupt
let corruptItems = 0
for (const datum of data) {
if (datum === '') { dataLength--; continue }
try {
const doc = model.deserialize(this.beforeDeserialization(datum))
if (doc._id) {
if (doc.$$deleted === true) delete dataById[doc._id]
else dataById[doc._id] = doc
} else if (doc.$$indexCreated && doc.$$indexCreated.fieldName != null) indexes[doc.$$indexCreated.fieldName] = doc.$$indexCreated
else if (typeof doc.$$indexRemoved === 'string') delete indexes[doc.$$indexRemoved]
} catch (e) {
corruptItems += 1
}
}
// A bit lenient on corruption
if (dataLength > 0) {
const corruptionRate = corruptItems / dataLength
if (corruptionRate > this.corruptAlertThreshold) {
const error = new Error(`${Math.floor(100 * corruptionRate)}% of the data file is corrupt, more than given corruptAlertThreshold (${Math.floor(100 * this.corruptAlertThreshold)}%). Cautiously refusing to start NeDB to prevent dataloss.`)
error.corruptionRate = corruptionRate
error.corruptItems = corruptItems
error.dataLength = dataLength
throw error
}
}
const tdata = Object.values(dataById)
return { data: tdata, indexes }
}
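/*
 * Illustrative sketch of the append-only format this parses (hypothetical
 * datafile contents, one serialized document or index action per line):
 *
 *   {"_id":"1","planet":"Earth"}
 *   {"_id":"1","planet":"Earth","visited":true}   <- later version of doc 1 wins
 *   {"_id":"2","$$deleted":true}                  <- doc 2 is dropped if present
 *   {"$$indexCreated":{"fieldName":"planet"}}     <- recorded in indexes
 *
 * => { data: [{ _id: '1', planet: 'Earth', visited: true }], indexes: { planet: { fieldName: 'planet' } } }
 */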
/**
* From a database's raw data stream, return the corresponding machine-understandable collection.
*
* Is only used in the Node.js version, since [React-Native]{@link module:storageReactNative} &
* [browser]{@link module:storageBrowser} storage modules don't provide an equivalent of
* {@link module:storage.readFileStream}.
*
* Do not use directly, it should only be used by a {@link Datastore} instance.
* @param {Readable} rawStream
* @return {Promise<{data: document[], indexes: Object.<string, rawIndex>}>}
* @async
* @private
*/
treatRawStreamAsync (rawStream) {
return new Promise((resolve, reject) => {
const dataById = {}
const indexes = {}
let corruptItems = 0
const lineStream = byline(rawStream)
let dataLength = 0
lineStream.on('data', (line) => {
if (line === '') return
try {
const doc = model.deserialize(this.beforeDeserialization(line))
if (doc._id) {
if (doc.$$deleted === true) delete dataById[doc._id]
else dataById[doc._id] = doc
} else if (doc.$$indexCreated && doc.$$indexCreated.fieldName != null) indexes[doc.$$indexCreated.fieldName] = doc.$$indexCreated
else if (typeof doc.$$indexRemoved === 'string') delete indexes[doc.$$indexRemoved]
} catch (e) {
corruptItems += 1
}
dataLength++
})
lineStream.on('end', () => {
// A bit lenient on corruption
if (dataLength > 0) {
const corruptionRate = corruptItems / dataLength
if (corruptionRate > this.corruptAlertThreshold) {
const error = new Error(`${Math.floor(100 * corruptionRate)}% of the data file is corrupt, more than given corruptAlertThreshold (${Math.floor(100 * this.corruptAlertThreshold)}%). Cautiously refusing to start NeDB to prevent dataloss.`)
error.corruptionRate = corruptionRate
error.corruptItems = corruptItems
error.dataLength = dataLength
reject(error)
return
}
}
const data = Object.values(dataById)
resolve({ data, indexes })
})
lineStream.on('error', function (err) {
reject(err)
})
})
}
/**
* Load the database
* 1) Create all indexes
* 2) Insert all data
* 3) Compact the database
*
* This means pulling data out of the data file or creating it if it doesn't exist
* Also, all data is persisted right away, which has the effect of compacting the database file
* This operation is very quick at startup for a big collection (60ms for ~10k docs)
*
* Do not use directly as it does not use the [Executor]{@link Datastore.executor}, use {@link Datastore#loadDatabaseAsync} instead.
* @return {Promise<void>}
* @private
*/
async loadDatabaseAsync () {
this.db._resetIndexes()
// In-memory only datastore
if (this.inMemoryOnly) return
await Persistence.ensureParentDirectoryExistsAsync(this.filename, this.modes.dirMode)
await storage.ensureDatafileIntegrityAsync(this.filename, this.modes.fileMode)
let treatedData
if (storage.readFileStream) {
// Server side
const fileStream = storage.readFileStream(this.filename, { encoding: 'utf8', mode: this.modes.fileMode })
treatedData = await this.treatRawStreamAsync(fileStream)
} else {
// Browser
const rawData = await storage.readFileAsync(this.filename, { encoding: 'utf8', mode: this.modes.fileMode })
treatedData = this.treatRawData(rawData)
}
// Recreate all indexes in the datafile
Object.keys(treatedData.indexes).forEach(key => {
this.db.indexes[key] = new Index(treatedData.indexes[key])
})
// Fill cached database (i.e. all indexes) with data
try {
this.db._resetIndexes(treatedData.data)
} catch (e) {
this.db._resetIndexes() // Rollback any index which didn't fail
throw e
}
await this.db.persistence.persistCachedDatabaseAsync()
this.db.executor.processBuffer()
}
/**
* See {@link Datastore#dropDatabaseAsync}. This function uses {@link Datastore#executor} internally. Decorating this
* function with an {@link Executor#pushAsync} will result in a deadlock.
* @return {Promise<void>}
* @private
* @see Datastore#dropDatabaseAsync
*/
async dropDatabaseAsync () {
this.db.stopAutocompaction() // stop autocompaction
this.db.executor.ready = false // prevent queuing new tasks
this.db.executor.resetBuffer() // remove pending buffered tasks
await this.db.executor.queue.guardian // wait for the ongoing tasks to end
// remove indexes (which means remove data from memory)
this.db.indexes = {}
// add back _id index, otherwise it will fail
this.db.indexes._id = new Index({ fieldName: '_id', unique: true })
// reset TTL on indexes
this.db.ttlIndexes = {}
// remove datastore file
if (!this.db.inMemoryOnly) {
await this.db.executor.pushAsync(async () => {
if (await storage.existsAsync(this.filename)) await storage.unlinkAsync(this.filename)
}, true)
}
}
/**
* Check if the parent directory of `dir` exists and create it on the fly if it is not the case.
* @param {string} dir
* @param {number} [mode=0o777]
* @return {Promise<void>}
* @private
*/
static async ensureParentDirectoryExistsAsync (dir, mode = DEFAULT_DIR_MODE) {
return storage.ensureParentDirectoryExistsAsync(dir, mode)
}
}
// Interface
module.exports = Persistence

317
node_modules/@seald-io/nedb/lib/storage.js generated vendored Executable file
View File

@ -0,0 +1,317 @@
/**
* Way data is stored for this database.
* This version is the Node.js version.
* It's essentially fs, mkdirp and crash-safe write and read functions.
*
* @see module:storageBrowser
* @see module:storageReactNative
* @module storage
* @private
*/
const fs = require('fs')
const fsPromises = fs.promises
const path = require('path')
const { Readable } = require('stream')
const DEFAULT_DIR_MODE = 0o755
const DEFAULT_FILE_MODE = 0o644
/**
* Returns true if file exists.
* @param {string} file
* @return {Promise<boolean>}
* @async
* @alias module:storage.existsAsync
* @see module:storage.exists
*/
const existsAsync = file => fsPromises.access(file, fs.constants.F_OK).then(() => true, () => false)
/**
* Node.js' [fsPromises.rename]{@link https://nodejs.org/api/fs.html#fspromisesrenameoldpath-newpath}
* @function
* @param {string} oldPath
* @param {string} newPath
* @return {Promise<void>}
* @alias module:storage.renameAsync
* @async
*/
const renameAsync = fsPromises.rename
/**
* Node.js' [fsPromises.writeFile]{@link https://nodejs.org/api/fs.html#fspromiseswritefilefile-data-options}.
* @function
* @param {string} path
* @param {string} data
* @param {object} [options]
* @return {Promise<void>}
* @alias module:storage.writeFileAsync
* @async
*/
const writeFileAsync = fsPromises.writeFile
/**
* Node.js' [fs.createWriteStream]{@link https://nodejs.org/api/fs.html#fscreatewritestreampath-options}.
* @function
* @param {string} path
* @param {Object} [options]
* @return {fs.WriteStream}
* @alias module:storage.writeFileStream
*/
const writeFileStream = fs.createWriteStream
/**
* Node.js' [fsPromises.unlink]{@link https://nodejs.org/api/fs.html#fspromisesunlinkpath}.
* @function
* @param {string} path
* @return {Promise<void>}
* @async
* @alias module:storage.unlinkAsync
*/
const unlinkAsync = fsPromises.unlink
/**
* Node.js' [fsPromises.appendFile]{@link https://nodejs.org/api/fs.html#fspromisesappendfilepath-data-options}.
* @function
* @param {string} path
* @param {string} data
* @param {object} [options]
* @return {Promise<void>}
* @alias module:storage.appendFileAsync
* @async
*/
const appendFileAsync = fsPromises.appendFile
/**
* Node.js' [fsPromises.readFile]{@link https://nodejs.org/api/fs.html#fspromisesreadfilepath-options}.
* @function
* @param {string} path
* @param {object} [options]
* @return {Promise<Buffer>}
* @alias module:storage.readFileAsync
* @async
*/
const readFileAsync = fsPromises.readFile
/**
* Node.js' [fs.createReadStream]{@link https://nodejs.org/api/fs.html#fscreatereadstreampath-options}.
* @function
* @param {string} path
* @param {Object} [options]
* @return {fs.ReadStream}
* @alias module:storage.readFileStream
*/
const readFileStream = fs.createReadStream
/**
* Node.js' [fsPromises.mkdir]{@link https://nodejs.org/api/fs.html#fspromisesmkdirpath-options}.
* @function
* @param {string} path
* @param {object} options
* @return {Promise<void|string>}
* @alias module:storage.mkdirAsync
* @async
*/
const mkdirAsync = fsPromises.mkdir
/**
* Removes file if it exists.
* @param {string} file
* @return {Promise<void>}
* @alias module:storage.ensureFileDoesntExistAsync
* @async
*/
const ensureFileDoesntExistAsync = async file => {
if (await existsAsync(file)) await unlinkAsync(file)
}
/**
* Flush data in OS buffer to storage if corresponding option is set.
* @param {object|string} options If options is a string, it is treated as the filename of the file (not dir) whose flush was requested
* @param {string} [options.filename]
* @param {boolean} [options.isDir = false] Optional, defaults to false
* @param {number} [options.mode = 0o644] Optional, defaults to 0o644
* @return {Promise<void>}
* @alias module:storage.flushToStorageAsync
* @async
*/
const flushToStorageAsync = async (options) => {
let filename
let flags
let mode
if (typeof options === 'string') {
filename = options
flags = 'r+'
mode = DEFAULT_FILE_MODE
} else {
filename = options.filename
flags = options.isDir ? 'r' : 'r+'
mode = options.mode !== undefined ? options.mode : DEFAULT_FILE_MODE
}
/**
* Some OSes and/or storage backends (augmented node fs) do not support fsync (FlushFileBuffers) on directories,
* or calling open() on directories at all. Flushing fails silently in these cases, detected by the following heuristics:
* + isDir === true
* |-- open(<dir>) -> (err.code === 'EISDIR'): can't call open() on directories (eg. BrowserFS)
* `-- fsync(<dir>) -> (errFS.code === 'EPERM' || errFS.code === 'EISDIR'): can't fsync directory: permissions are checked
* on open(); EPERM error should only occur on fsync incapability and not for general lack of permissions (e.g. Windows)
*
* We can live with this as it cannot cause 100% data loss except in the very rare event of a crash happening
* the first time the database is loaded.
*/
let filehandle, errorOnFsync, errorOnClose
try {
filehandle = await fsPromises.open(filename, flags, mode)
try {
await filehandle.sync()
} catch (errFS) {
errorOnFsync = errFS
}
} catch (error) {
if (error.code !== 'EISDIR' || !options.isDir) throw error
} finally {
try {
await filehandle.close()
} catch (errC) {
errorOnClose = errC
}
}
if ((errorOnFsync || errorOnClose) && !((errorOnFsync.code === 'EPERM' || errorOnClose.code === 'EISDIR') && options.isDir)) {
const e = new Error('Failed to flush to storage')
e.errorOnFsync = errorOnFsync
e.errorOnClose = errorOnClose
throw e
}
}
/**
* Fully write or rewrite the datafile.
* @param {string} filename
* @param {string[]} lines
* @param {number} [mode=0o644]
* @return {Promise<void>}
* @alias module:storage.writeFileLinesAsync
* @async
*/
const writeFileLinesAsync = (filename, lines, mode = DEFAULT_FILE_MODE) => new Promise((resolve, reject) => {
try {
const stream = writeFileStream(filename, { mode })
const readable = Readable.from(lines)
readable.on('data', (line) => {
try {
stream.write(line + '\n')
} catch (err) {
reject(err)
}
})
readable.on('end', () => {
stream.close(err => {
if (err) reject(err)
else resolve()
})
})
readable.on('error', err => {
reject(err)
})
stream.on('error', err => {
reject(err)
})
} catch (err) {
reject(err)
}
})
/**
* Fully write or rewrite the datafile, immune to crashes during the write operation (data will not be lost).
* @param {string} filename
* @param {string[]} lines
* @param {object} [modes={ fileMode: 0o644, dirMode: 0o755 }]
* @param {number} modes.dirMode
* @param {number} modes.fileMode
* @return {Promise<void>}
* @alias module:storage.crashSafeWriteFileLinesAsync
*/
const crashSafeWriteFileLinesAsync = async (filename, lines, modes = { fileMode: DEFAULT_FILE_MODE, dirMode: DEFAULT_DIR_MODE }) => {
const tempFilename = filename + '~'
await flushToStorageAsync({ filename: path.dirname(filename), isDir: true, mode: modes.dirMode })
const exists = await existsAsync(filename)
if (exists) await flushToStorageAsync({ filename, mode: modes.fileMode })
await writeFileLinesAsync(tempFilename, lines, modes.fileMode)
await flushToStorageAsync({ filename: tempFilename, mode: modes.fileMode })
await renameAsync(tempFilename, filename)
await flushToStorageAsync({ filename: path.dirname(filename), isDir: true, mode: modes.dirMode })
}
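/*
 * Crash-safety sketch: if the process dies while the temporary file is being
 * written, `filename` is untouched and `ensureDatafileIntegrityAsync` (below)
 * keeps the old version; if it dies after the rename, the new file is already
 * complete. Flushing the containing directory before and after makes the
 * rename itself durable.
 */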
/**
* Ensure the datafile contains all the data, even if there was a crash during a full file write.
* @param {string} filename
* @param {number} [mode=0o644]
* @return {Promise<void>}
* @alias module:storage.ensureDatafileIntegrityAsync
*/
const ensureDatafileIntegrityAsync = async (filename, mode = DEFAULT_FILE_MODE) => {
const tempFilename = filename + '~'
const filenameExists = await existsAsync(filename)
// Write was successful
if (filenameExists) return
const oldFilenameExists = await existsAsync(tempFilename)
// New database
if (!oldFilenameExists) await writeFileAsync(filename, '', { encoding: 'utf8', mode })
// Write failed, use old version
else await renameAsync(tempFilename, filename)
}
/**
* Check if a file's parent directory exists and create it on the fly if it is not the case.
* @param {string} filename
* @param {number} mode
* @return {Promise<void>}
* @private
*/
const ensureParentDirectoryExistsAsync = async (filename, mode) => {
const dir = path.dirname(filename)
const parsedDir = path.parse(path.resolve(dir))
// this is because on Windows mkdir throws a permission error when called on the root directory of a volume
if (process.platform !== 'win32' || parsedDir.dir !== parsedDir.root || parsedDir.base !== '') {
await mkdirAsync(dir, { recursive: true, mode })
}
}
// Interface
module.exports.existsAsync = existsAsync
module.exports.renameAsync = renameAsync
module.exports.writeFileAsync = writeFileAsync
module.exports.writeFileLinesAsync = writeFileLinesAsync
module.exports.crashSafeWriteFileLinesAsync = crashSafeWriteFileLinesAsync
module.exports.appendFileAsync = appendFileAsync
module.exports.readFileAsync = readFileAsync
module.exports.unlinkAsync = unlinkAsync
module.exports.mkdirAsync = mkdirAsync
module.exports.readFileStream = readFileStream
module.exports.flushToStorageAsync = flushToStorageAsync
module.exports.ensureDatafileIntegrityAsync = ensureDatafileIntegrityAsync
module.exports.ensureFileDoesntExistAsync = ensureFileDoesntExistAsync
module.exports.ensureParentDirectoryExistsAsync = ensureParentDirectoryExistsAsync

84
node_modules/@seald-io/nedb/lib/utils.js generated vendored Normal file
View File

@ -0,0 +1,84 @@
/**
* Utility functions for all environments.
* This replaces the underscore dependency.
*
* @module utils
* @private
*/
/**
* @callback IterateeFunction
* @param {*} arg
* @return {*}
*/
/**
* Produces a duplicate-free version of the array, using === to test object equality; only the first
* occurrence of each value is kept. If you want to compute unique items based on a transformation, pass an iteratee
* function: deduplication is then done on the computed keys, and the last item for each key is kept.
*
* Heavily inspired by {@link https://underscorejs.org/#uniq}.
* @param {Array} array
* @param {IterateeFunction} [iteratee] transformation applied to every element before checking for duplicates. This will not
* transform the items in the result.
* @return {Array}
* @alias module:utils.uniq
*/
const uniq = (array, iteratee) => {
if (iteratee) return [...(new Map(array.map(x => [iteratee(x), x]))).values()]
else return [...new Set(array)]
}
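/*
 * Illustrative sketch:
 *
 *   uniq([1, 2, 2, 3])                          // => [1, 2, 3]
 *   uniq(['a', 'A', 'b'], s => s.toLowerCase()) // => ['A', 'b'] (last item per computed key)
 */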
/**
* Returns true if arg is an Object. Note that JavaScript arrays and functions are objects, while (normal) strings
* and numbers are not.
*
* Heavily inspired by {@link https://underscorejs.org/#isObject}.
* @param {*} arg
* @return {boolean}
*/
const isObject = arg => typeof arg === 'object' && arg !== null
/**
* Returns true if d is a Date.
*
* Heavily inspired by {@link https://underscorejs.org/#isDate}.
* @param {*} d
* @return {boolean}
* @alias module:utils.isDate
*/
const isDate = d => isObject(d) && Object.prototype.toString.call(d) === '[object Date]'
/**
* Returns true if re is a RegExp.
*
* Heavily inspired by {@link https://underscorejs.org/#isRegExp}.
* @param {*} re
* @return {boolean}
* @alias module:utils.isRegExp
*/
const isRegExp = re => isObject(re) && Object.prototype.toString.call(re) === '[object RegExp]'
/**
* Return a copy of the object filtered using the given keys.
*
* @param {object} object
* @param {string[]} keys
* @return {object}
*/
const pick = (object, keys) => {
return keys.reduce((obj, key) => {
if (object && Object.prototype.hasOwnProperty.call(object, key)) {
obj[key] = object[key]
}
return obj
}, {})
}
const filterIndexNames = (indexNames) => ([k, v]) => !!(typeof v === 'string' || typeof v === 'number' || typeof v === 'boolean' || isDate(v) || v === null) &&
indexNames.includes(k)
module.exports.uniq = uniq
module.exports.isDate = isDate
module.exports.isRegExp = isRegExp
module.exports.pick = pick
module.exports.filterIndexNames = filterIndexNames

48
node_modules/@seald-io/nedb/lib/waterfall.js generated vendored Normal file
View File

@ -0,0 +1,48 @@
/**
* Responsible for sequentially executing actions on the database
* @private
*/
class Waterfall {
/**
* Instantiate a new Waterfall.
*/
constructor () {
/**
* This is the internal Promise object which resolves when all the tasks of the `Waterfall` are done.
*
* It will change any time `this.waterfall` is called.
*
* @type {Promise}
*/
this.guardian = Promise.resolve()
}
/**
* Returns a version of the given async function which waits for every previously chained task to settle before running.
* @param {AsyncFunction} func
* @return {AsyncFunction}
*/
waterfall (func) {
return (...args) => {
this.guardian = this.guardian.then(() => {
return func(...args)
.then(result => ({ error: false, result }), result => ({ error: true, result }))
})
return this.guardian.then(({ error, result }) => {
if (error) return Promise.reject(result)
else return Promise.resolve(result)
})
}
}
/**
* Shorthand for chaining a promise to the Waterfall
* @param {Promise} promise
* @return {Promise}
*/
chain (promise) {
return this.waterfall(() => promise)()
}
}
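/*
 * Illustrative sketch (hypothetical tasks): calls chained through the same
 * Waterfall run strictly one after another, in call order.
 *
 *   const w = new Waterfall()
 *   const run = w.waterfall(async n => console.log('task', n))
 *   run(1)
 *   run(2) // always logs after task 1, even though both were started at once
 */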
module.exports = Waterfall

121
node_modules/@seald-io/nedb/package.json generated vendored Executable file
View File

@ -0,0 +1,121 @@
{
"name": "@seald-io/nedb",
"version": "4.0.4",
"files": [
"lib/**/*.js",
"browser-version/**/*.js",
"index.js",
"index.d.ts"
],
"types": "index.d.ts",
"author": {
"name": "Timothée Rebours",
"email": "tim@seald.io",
"url": "https://www.seald.io/"
},
"contributors": [
{
"name": "Louis Chatriot",
"email": "louis.chatriot@gmail.com"
},
{
"name": "Timothée Rebours",
"email": "tim@seald.io",
"url": "https://www.seald.io/"
},
{
"name": "Eliot Akira",
"email": "me@eliotakira.com",
"url": "https://eliotakira.com/"
},
{
"name": " Loïc Hermann",
"email": "loic.hermann@outlook.fr"
}
],
"description": "File-based embedded data store for node.js",
"keywords": [
"database",
"datastore",
"embedded"
],
"homepage": "https://github.com/seald/nedb",
"repository": {
"type": "git",
"url": "git@github.com:seald/nedb.git"
},
"dependencies": {
"@seald-io/binary-search-tree": "^1.0.3",
"localforage": "^1.9.0",
"util": "^0.12.4"
},
"devDependencies": {
"@react-native-async-storage/async-storage": "^1.17.11",
"@types/jest": "^27.5.2",
"browser-resolve": "^2.0.0",
"chai": "^4.3.7",
"commander": "^7.2.0",
"events": "^3.3.0",
"jest": "^27.5.1",
"jsdoc-to-markdown": "^8.0.0",
"karma": "^6.4.1",
"karma-chai": "^0.1.0",
"karma-chrome-launcher": "^3.1.1",
"karma-junit-reporter": "^2.0.1",
"karma-mocha": "^2.0.1",
"karma-source-map-support": "^1.4.0",
"mocha": "^10.2.0",
"mocha-junit-reporter": "^2.2.0",
"path-browserify": "^1.0.1",
"process": "^0.11.10",
"react": "^18.2.0",
"react-native": "^0.71.2",
"semver": "^7.3.8",
"source-map-loader": "^4.0.1",
"standard": "^17.0.0",
"terser-webpack-plugin": "^5.3.6",
"timers-browserify": "^2.0.12",
"ts-jest": "^27.1.5",
"typescript": "^4.9.5",
"webpack": "^5.75.0",
"webpack-cli": "^5.0.1",
"xvfb-maybe": "^0.2.1"
},
"scripts": {
"lint": "standard",
"test": "mocha --reporter spec --timeout 10000",
"build:browser": "webpack && webpack --optimization-minimize",
"pretest:browser": "npm run build:browser",
"test:browser": "xvfb-maybe karma start karma.conf.local.js",
"test:react-native": "jest test/react-native",
"test:typings": "tsc ./typings-tests.ts",
"prepublishOnly": "npm run build:browser",
"generateDocs:markdown": "jsdoc2md --no-cache -c jsdoc.conf.js --param-list-format list --files ./lib/*.js > API.md"
},
"main": "index.js",
"browser": {
"./lib/customUtils.js": "./browser-version/lib/customUtils.js",
"./lib/storage.js": "./browser-version/lib/storage.browser.js",
"./lib/byline.js": "./browser-version/lib/byline.js"
},
"react-native": {
"./lib/customUtils.js": "./browser-version/lib/customUtils.js",
"./lib/storage.js": "./browser-version/lib/storage.react-native.js",
"./lib/byline.js": "./browser-version/lib/byline.js"
},
"license": "MIT",
"publishConfig": {
"access": "public"
},
"standard": {
"ignore": [
"browser-version/out",
"**/*.ts"
]
},
"jest": {
"preset": "ts-jest",
"testEnvironment": "node",
"resolver": "<rootDir>/test/react-native/resolver.js"
}
}