Initial import with skill sheet working
node_modules/@foundryvtt/foundryvtt-cli/CHANGELOG.md (71 lines, generated, vendored, Normal file)
@@ -0,0 +1,71 @@
## 1.0.3

### Fixes
- Fixed packing and unpacking ignoring Scene Regions.

## 1.0.2

### Improvements
- Added the `--clean` command-line flag for unpack operations that will delete the destination directory before unpacking.
- Throw an error when packing if an entry with the same `_id` would overwrite an entry that had already been packed as part of the same operation.

## 1.0.1

### Fixes
- Fixed issue with YAML serialization.

## 1.0.0

### Improvements
- (Ryan Walder) Appropriately detect files with `.yaml` extensions as YAML files.
- (Ryan Walder) Added the `yamlOptions` parameter to `extractPack` to allow configuring how YAML is serialized.
- Added the `--recursive` command-line flag for pack operations that will recurse into subdirectories when finding source files.
- Added the `jsonOptions` parameter to `extractPack` to allow configuring how JSON is serialized.
- Removed unused `nedb-core` dependency.

## 1.0.0-rc.4

### Fixes
- Fixed issues with packing/unpacking as YAML.

## 1.0.0-rc.3

### Improvements
- (phenomen) Added Cyrillic characters to the set of filesystem-safe characters that a Document name can be written as.
- Factored out `extractPack` and `compilePack` functionality into an external API.

## 1.0.0-rc.2

### Fixes
- (DJ4ddi) Fixed path name resolution on Windows.
- Fixed compendium name not being appropriately appended to the `--in` or `--out` options depending on operation.

### Improvements
- (DJ4ddi) Added minimum node engine version.
- (DJ4ddi) Re-throw parse errors to provide stack trace output in the console.

## 1.0.0-rc.1

### Fixes
- (DJ4ddi) Log more specific errors when an operation fails.
- (DJ4ddi) Compress LevelDB after packing it.
- Fixed NeDB operations throwing a TypeError.

### Breaking Changes
- (DJ4ddi) Renamed the `--inputDirectory` shorthand option from `--id` to `--in` to fix a conflict with the package ID `--id` option.
- Renamed the `--outputDirectory` shorthand option from `--od` to `--out` to better align with the above change.
- NeDB unpack operations now write source data to the same directory as LevelDB unpack operations by default (`packs/{compendiumName}/_source`). This fixes an issue whereby sequential NeDB unpack operations would mix all their output together into the same `packs/_source` directory, and allows for better interoperability with LevelDB operations.
- Corresponding to the above change, NeDB pack operations by default will look for source files under `packs/{compendiumName}/_source` instead of `packs/_source`.
- Unpack operations are now consistent between NeDB and LevelDB: both will unpack primary Document entries to a single file, with all embedded Documents included, rather than LevelDB unpack operations writing every embedded Document to its own file.

### Improvements
- Improved JSDoc annotations across the project.
- Improved NeDB document type inference to check the manifest of the package the compendium belongs to rather than searching all packages for a compendium with a matching name.
- The CLI should be slightly better-behaved and exit with a non-zero error code if it does encounter an error in most cases.
- When writing JSON files, a newline character is appended to the end of the file to make it more git-friendly.

### Miscellaneous
- Removed IDE-specific project data from git tracking.
- Refactored codebase to conform with Foundry VTT code style guidelines.
- Added .editorconfig and .eslintrc.json to enforce code style.
- Added .npmignore to strip development-only files from final NPM package.

node_modules/@foundryvtt/foundryvtt-cli/LICENSE (21 lines, generated, vendored, Normal file)
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2023 Foundry Virtual Tabletop

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

node_modules/@foundryvtt/foundryvtt-cli/README.md (171 lines, generated, vendored, Normal file)
@@ -0,0 +1,171 @@
# foundryvtt-cli
The official Foundry VTT CLI.

## Installation
```bash
npm install -g @foundryvtt/foundryvtt-cli
```

## Usage

### Help
```bash
fvtt --help
```

### Current CLI Version
```bash
fvtt --version
```

### Configuration
```bash
fvtt configure
```
Determines whether your configuration is correct and, if not, prompts you to fix it.

#### View
```bash
fvtt configure view
```
View your current configuration.

#### Set
```bash
fvtt configure set "key" "value"
```
Set a configuration value.
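
For example, the two keys most commands depend on are `installPath` and `dataPath`. The values below mirror the hints printed by `fvtt configure` and should be adjusted to your own system:
```bash
fvtt configure set "installPath" "C:/Program Files/Foundry Virtual Tabletop"
fvtt configure set "dataPath" "C:/Users/Example/AppData/Local/FoundryVTT"
```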

#### Get
```bash
fvtt configure get "key"
```
Get a configuration value.

#### Path
```bash
fvtt configure path
```
Outputs the path to your configuration file.

### Launch
```bash
fvtt launch
```
Launches Foundry VTT. Available options are:
* `--demo` - Launches Foundry VTT in demo mode.
* `--port 30000` - Launches Foundry VTT on a specific port.
* `--world my-world` - Launches Foundry VTT straight into a specific world.
* `--noupnp` - Disables UPnP port forwarding.
* `--noupdate` - Disables automatic update checking.
* `--adminKey "ABC123"` - The admin key to secure Foundry VTT's Setup screen with.
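
For example, to launch a specific world on an alternate port with UPnP disabled:
```bash
fvtt launch --world my-world --port 30001 --noupnp
```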

### Package
```bash
fvtt package
```
Outputs the current working package, if any.

#### Workon
```bash
fvtt package workon "1001-fish" --type "Module"
```
Swaps to working on a specific package, eliminating the need to pass `--type` and `--id` to other package commands.

#### Clear
```bash
fvtt package clear
```
Clears the current working package.

#### Unpack
```bash
fvtt package unpack "compendiumName"
```
Reads a database from the current Package's `packs/` directory and writes each document as a serialized Object to its own file.
There are a number of options available to customize the output; see `fvtt package unpack --help` for more information.
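
For example, unpacking a legacy NeDB pack requires `--nedb`, and the document type must be supplied with `--compendiumType` (alias `-t`) when it cannot be inferred. The pack name `monsters` here is hypothetical, and a working package is assumed to be set:
```bash
fvtt package unpack "monsters" --nedb --compendiumType "Actor"
```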

#### Pack
```bash
fvtt package pack "compendiumName"
```
Reads a directory of serialized Objects and writes them to a database in the current Package's `packs/` directory. There are a number of options available to customize the operation; see `fvtt package pack --help` for more information.
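
For example, explicit input and output directories can be passed with `--in` and `--out` instead of the configured defaults (the paths below are illustrative; the compendium name is appended to the `--out` directory):
```bash
fvtt package pack "fish" --in ./src/packs/fish --out ./packs
```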

## Example Workflow

```bash
fvtt configure
fvtt package workon "1001-fish"
fvtt package unpack "fish"
# ... make some updates to the files ...
fvtt package pack "fish"
```

## Development
```bash
git clone https://github.com/foundryvtt/foundryvtt-cli.git
cd foundryvtt-cli
npm install
npm run build
npm link
```

## API

Certain internal functionality of the CLI is exposed as an API that can be imported into other projects.

### Example Usage

```js
import { compilePack, extractPack } from "@foundryvtt/foundryvtt-cli";

// Extract a NeDB compendium pack.
await extractPack("mymodule/packs/actors.db", "mymodule/packs/src/actors", { nedb: true });

// Compile a LevelDB compendium pack.
await compilePack("mymodule/packs/src/actors", "mymodule/packs/actors");
```

### `compilePack(src: string, dest: string, options?: object): Promise<void>`

Compile source files into a compendium pack.

#### Parameters

* **src:** *string* The directory containing the source files.
* **dest:** *string* The target compendium pack.
* **options:** *object*
  * **nedb:** *boolean = false* Whether to operate on a NeDB database, otherwise a LevelDB database is assumed.
  * **yaml:** *boolean = false* Whether the source files are in YAML format, otherwise JSON is assumed.
  * **log:** *boolean = false* Whether to log operation progress to the console.
  * **recursive:** *boolean = false* Whether to recurse into child directories under **src**, otherwise only source files located directly under **src** will be used.
  * **transformEntry:** *(entry: object): Promise<false|void>* A function that is called on every entry. Returning *false* indicates that the entry should be discarded.
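
As a minimal sketch of **transformEntry**, the following filters entries out while packing. The `draft` flag checked here is a hypothetical convention in the source data, not part of the CLI:
```js
import { compilePack } from "@foundryvtt/foundryvtt-cli";

await compilePack("mymodule/packs/src/actors", "mymodule/packs/actors", {
  recursive: true,
  log: true,
  // Returning false discards the entry; any other return value keeps it.
  transformEntry: async entry => {
    if ( entry.flags?.mymodule?.draft ) return false; // Hypothetical draft marker.
  }
});
```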

### `extractPack(src: string, dest: string, options?: object): Promise<void>`

Extract the contents of a compendium pack into individual source files for each primary Document.

#### Parameters

* **src:** *string* The source compendium pack.
* **dest:** *string* The directory to write the extracted files into.
* **options:** *object*
  * **nedb:** *boolean = false* Whether to operate on a NeDB database, otherwise a LevelDB database is assumed.
  * **yaml:** *boolean = false* Whether the source files are in YAML format, otherwise JSON is assumed.
  * **yamlOptions:** *object = {}* Options to pass to `yaml.dump` when serializing Documents.
  * **log:** *boolean = false* Whether to log operation progress to the console.
  * **documentType:** *string* For NeDB operations, a **documentType** must be provided. This should be the same as the pack's *type* field in the *module.json* or *system.json*.
  * **transformEntry:** *(entry: object): Promise<false|void>* A function that is called on every entry. Returning *false* indicates that the entry should be discarded.
  * **transformName:** *(entry: object): Promise<string|void>* A function that is called on every entry. The value returned from this will be used as the entry's filename and must include the appropriate file extension. If nothing is returned, an auto-generated name will be used instead.
  * **jsonOptions:** *object*
    * **replacer:** *(key: string, value: any): any|Array<string|number>* A replacer function, or an array of property names in the object to include in the resulting string.
    * **space:** *string|number* A number of spaces or a string to use as indentation.
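
As a minimal sketch combining **transformName** and **jsonOptions**, the following names each extracted file after the Document's `_id` (an illustrative scheme, not the CLI's default) and indents the JSON with two spaces:
```js
import { extractPack } from "@foundryvtt/foundryvtt-cli";

await extractPack("mymodule/packs/actors", "mymodule/packs/src/actors", {
  log: true,
  jsonOptions: { space: 2 },
  // The returned name must include the file extension.
  transformName: async entry => `${entry._id}.json`
});
```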

## Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

node_modules/@foundryvtt/foundryvtt-cli/commands/configuration.mjs (72 lines, generated, vendored, Normal file)
@@ -0,0 +1,72 @@
import Config from "../config.mjs";

/**
 * Get the command object for the configuration command
 * @returns {CommandModule}
 */
export function getCommand() {
  return {
    command: "configure [action] [key] [value]",
    describe: "Manage configuration",
    builder: yargs => {
      yargs.positional("action", {
        describe: "The action to perform",
        type: "string",
        choices: ["get", "set", "path", "view"]
      }).positional("key", {
        describe: "The configuration key",
        type: "string"
      }).positional("value", {
        describe: "The configuration value",
        type: "string"
      });
      return yargs;
    },
    handler: async argv => {

      // Handle actions
      switch ( argv.action ) {
        case "get": {
          console.log(Config.instance.get(argv.key));
          break;
        }

        case "set": {
          Config.instance.set(argv.key, argv.value);
          console.log(`Set ${argv.key} to ${argv.value}`);
          break;
        }

        case "view": {
          // Output the current configuration
          console.log("Current Configuration:", Config.instance.getAll());
          break;
        }

        case "path": {
          // Output the current configuration file path
          console.log("Current Configuration File:", Config.instance.configPath);
          break;
        }

        default: {
          // Determine if the dataPath and installPath are set
          const installPath = Config.instance.get("installPath");
          if ( !installPath ) {
            console.error("The installation path is not set. Use `configure set installPath <path>` to set it. "
              + "Install paths look like `C:/Program Files/Foundry Virtual Tabletop`");
          }

          const dataPath = Config.instance.get("dataPath");
          if ( !dataPath ) {
            console.error("The data path is not set. Use `configure set dataPath <path>` to set it. "
              + "Data paths look like `C:/Users/Example/AppData/Local/FoundryVTT`");
          }

          // If both are set, configuration is complete
          if ( installPath && dataPath ) console.log("Configuration complete!");
        }
      }
    }
  }
}

node_modules/@foundryvtt/foundryvtt-cli/commands/launch.mjs (91 lines, generated, vendored, Normal file)
@@ -0,0 +1,91 @@
import Config from "../config.mjs";
import { spawn } from "child_process";
import path from "path";

/**
 * Get the command object for the launch command
 * @returns {CommandModule}
 */
export function getCommand() {
  return {
    command: "launch",
    describe: "Launch Foundry VTT",
    builder: yargs => {
      yargs.option("demo", {
        describe: "Launch Foundry VTT in demo mode",
        type: "boolean",
        default: false
      });

      yargs.option("port", {
        describe: "The port to launch Foundry VTT on",
        type: "number",
        default: 30000
      });

      yargs.option("world", {
        describe: "The world to launch Foundry VTT with",
        type: "string"
      });

      yargs.option("noupnp", {
        describe: "Disable UPnP port forwarding",
        type: "boolean",
        default: false
      });

      yargs.option("noupdate", {
        describe: "Disable automatic update checking",
        type: "boolean",
        default: false
      });

      yargs.option("adminKey", {
        describe: "The admin key to secure Foundry VTT's Setup screen with",
        type: "string"
      });

      return yargs;
    },
    handler: async argv => {

      // Collect the options to forward to `node main.js`
      const { demo, port, world, noupnp, noupdate, adminKey } = argv;

      // Determine the installation path
      const installPath = Config.instance.get("installPath");
      if ( !installPath ) {
        console.error("The installation path is not set. Use `configure set installPath <path>` to set it. "
          + "Install paths look like `C:/Program Files/Foundry Virtual Tabletop`");
        process.exitCode = 1;
        return;
      }

      // Determine the data path
      const dataPath = Config.instance.get("dataPath");
      if ( !dataPath ) {
        console.error("The data path is not set. Use `configure set dataPath <path>` to set it. "
          + "Data paths look like `C:/Users/Example/AppData/Local/FoundryVTT/Data`");
        process.exitCode = 1;
        return;
      }

      // Launch Foundry VTT
      const foundry = spawn("node", [
        path.normalize(path.join(installPath, "resources", "app", "main.js")),
        `--dataPath=${dataPath}`,
        `--port=${port}`,
        demo ? "--demo" : "",
        world ? `--world=${world}` : "",
        noupnp ? "--noupnp" : "",
        noupdate ? "--noupdate" : "",
        adminKey ? `--adminKey=${adminKey}` : ""
      ]);

      foundry.stdout.on("data", data => console.log(data.toString()));
      foundry.stderr.on("data", data => console.error(data.toString()));
      foundry.on("close", code => console.log(`Foundry VTT exited with code ${code}`));
    }
  }
}

node_modules/@foundryvtt/foundryvtt-cli/commands/package.mjs (438 lines, generated, vendored, Normal file)
@@ -0,0 +1,438 @@
import Config from "../config.mjs";
import path from "path";
import fs from "fs";
import chalk from "chalk";
import { compilePack, extractPack, TYPE_COLLECTION_MAP } from "../lib/package.mjs";

/**
 * @typedef {"Module"|"System"|"World"} PackageType
 */

/**
 * @typedef {object} CLIArgs
 * @property {"workon"|"clear"|"unpack"|"pack"} action  The action to perform.
 * @property {string} value                    The action value.
 * @property {string} [id]                     Optionally provide the package ID if we are using explicit paths.
 * @property {PackageType} type                The package type.
 * @property {string} [compendiumName]         The compendium name for pack-based actions.
 * @property {DocumentType} [compendiumType]   The type of Documents that the compendium houses. Only required
 *                                             for NeDB operations.
 * @property {string} [inputDirectory]         An explicit input directory for pack-based actions.
 * @property {string} [outputDirectory]        An explicit output directory for pack-based actions.
 * @property {boolean} [yaml]                  Whether to use YAML instead of JSON for serialization.
 * @property {boolean} [verbose]               Whether to output verbose logging.
 * @property {boolean} [nedb]                  Use NeDB instead of ClassicLevel for database operations.
 * @property {boolean} [recursive]             When packing, recurse down through all directories in the input
 *                                             directory to find source files.
 * @property {boolean} [clean]                 When unpacking, delete the destination directory first.
 */

/**
 * The working package ID.
 * @type {string|null}
 */
let currentPackageId = Config.instance.get("currentPackageId");

/**
 * The working package type.
 * @type {PackageType|null}
 */
let currentPackageType = Config.instance.get("currentPackageType");

/**
 * Get the command object for the package command
 * @returns {CommandModule}
 */
export function getCommand() {
  return {
    command: "package [action] [value]",
    describe: "Manage packages",
    builder: yargs => {
      yargs.positional("action", {
        describe: "The action to perform",
        type: "string",
        choices: ["workon", "clear", "unpack", "pack"]
      });

      yargs.positional("value", {
        describe: "The value to use for the action",
        type: "string"
      });

      // currentPackageId is only needed if the data path has to be built with it.
      yargs.option("id", {
        describe: "The package ID",
        type: "string",
      });

      yargs.option("type", {
        describe: "The package type",
        type: "string",
        choices: ["Module", "System", "World"]
      });

      yargs.option("compendiumName", {
        alias: "n",
        describe: "The Compendium name, for Compendium Pack based Actions.",
        type: "string"
      });

      yargs.option("compendiumType", {
        alias: "t",
        describe: "The type of document that the compendium pack stores. Only necessary for NeDB operations.",
        type: "string",
        choices: Object.keys(TYPE_COLLECTION_MAP)
      });

      yargs.option("inputDirectory", {
        alias: "in",
        describe: "The directory to read from, for Pack based Actions.",
        type: "string"
      });

      yargs.option("outputDirectory", {
        alias: "out",
        describe: "The directory to write to, for Pack based Actions.",
        type: "string"
      });

      yargs.option("yaml", {
        describe: "Whether to use YAML instead of JSON for serialization.",
        type: "boolean"
      });

      yargs.option("verbose", {
        alias: "v",
        describe: "Whether to output verbose logging.",
        type: "boolean"
      });

      yargs.option("nedb", {
        describe: "Whether to use NeDB instead of ClassicLevel for database operations.",
        type: "boolean"
      });

      yargs.option("recursive", {
        alias: "r",
        describe: "When packing, recurse down through all directories in the input directory to find source files.",
        type: "boolean"
      });

      yargs.option("clean", {
        alias: "c",
        describe: "When unpacking, delete the destination directory first.",
        type: "boolean"
      });

      return yargs;
    },
    handler: async argv => {
      if ( argv.id ) currentPackageId = argv.id;
      if ( argv.type ) currentPackageType = argv.type;

      // Handle actions
      switch ( argv.action ) {
        case "workon": handleWorkon(argv); break;
        case "clear": handleClear(); break;
        case "unpack": await handleUnpack(argv); break;
        case "pack": await handlePack(argv); break;

        default:
          if ( !currentPackageId ) {
            console.error(chalk.red("No package ID is currently set. Use `package workon <id>` to set it."));
            return;
          }

          console.log(`Currently in ${chalk.magenta(currentPackageType)} ${chalk.cyan(currentPackageId)}`);
          break;
      }
    }
  }
}

/* -------------------------------------------- */
/*  Helpers                                     */
/* -------------------------------------------- */

/**
 * Read all the package manifests for a given package type, and return a Map of them, keyed by ID.
 * @param {string} dataPath                  The root data path.
 * @param {PackageType} type                 The package type.
 * @param {object} [options]
 * @param {boolean} [options.verbose=false]  Log errors verbosely.
 * @returns {Map<string, object>}
 */
function readPackageManifests(dataPath, type, { verbose=false }={}) {
  const typeLC = type.toLowerCase();
  const typePl = `${typeLC}s`;
  const dir = `${dataPath}/Data/${typePl}`;
  const map = new Map();

  for ( const file of fs.readdirSync(dir, { withFileTypes: true }) ) {
    const manifestPath = path.normalize(`${dir}/${file.name}/${typeLC}.json`);
    try {
      const data = JSON.parse(fs.readFileSync(manifestPath, "utf8"));
      data.type = type;
      data.path = manifestPath;
      map.set(data.id ?? data.name, data);
    } catch ( err ) {
      if ( verbose ) console.error(chalk.red(`Error reading ${typeLC}.json for ${chalk.blue(file.name)}: ${err}`));
    }
  }

  return map;
}

/* -------------------------------------------- */

/**
 * @typedef {object} DiscoveredPackages
 * @property {Map<string, object>} modules  A map of module manifest data by ID.
 * @property {Map<string, object>} systems  A map of system manifest data by ID.
 * @property {Map<string, object>} worlds   A map of world manifest data by ID.
 * @property {object[]} packages            A list of all packages in the data path.
 */

/**
 * Discover the list of all Packages in the dataPath
 * @param {CLIArgs} argv  The command line arguments
 * @returns {DiscoveredPackages|void}
 */
function discoverPackageDirectory(argv) {
  const dataPath = Config.instance.get("dataPath");
  if ( !dataPath ) {
    console.error(chalk.red(`No dataPath configured. Call ${chalk.yellow("`configure set dataPath <path>`")} first.`));
    return;
  }
  const modules = readPackageManifests(dataPath, "Module", { verbose: argv.verbose });
  const systems = readPackageManifests(dataPath, "System", { verbose: argv.verbose });
  const worlds = readPackageManifests(dataPath, "World", { verbose: argv.verbose });
  return { modules, systems, worlds, packages: [...modules.values(), ...systems.values(), ...worlds.values()] };
}

/* -------------------------------------------- */

/**
 * Determine the document type of an NeDB database from the command-line arguments, if specified, or from the
 * database's containing package.
 * @param {string} packFile  The path to the NeDB database.
 * @param {CLIArgs} argv     The command-line arguments.
 * @returns {string|void}    The document type of the NeDB database if it could be determined.
 */
function determineDocumentType(packFile, argv) {
  // Case 1 - The document type has been explicitly provided.
  if ( argv.compendiumType ) return argv.compendiumType;

  // Case 2 - The type can be inferred from the pack name.
  const packName = path.basename(packFile, ".db");
  for ( const [type, collection] of Object.entries(TYPE_COLLECTION_MAP) ) {
    if ( packName === collection ) return type;
  }

  // Case 3 - Attempt to locate this pack's metadata in the manifest of the package that owns it.
  const game = discoverPackageDirectory(argv);
  const pkg = game.packages.find(p => packFile.startsWith(path.dirname(p.path)));
  const pack = pkg?.packs.find(pack => path.resolve(path.dirname(pkg.path), pack.path) === packFile);
  if ( !pack ) {
    console.error(`Unable to determine document type for NeDB compendium at '${packFile}'. `
      + "Set this manually with -t <type>.");
    return;
  }
  return pack.type ?? pack.entity;
}

/* -------------------------------------------- */

/**
 * Determines whether a file is locked by another process
 * @param {string} filepath  The file path to test.
 * @returns {boolean}
 */
function isFileLocked(filepath) {
  try {
    // Try to open the file with the 'w' flag, which requests write access
    const fd = fs.openSync(filepath, 'w');

    // If the file was successfully opened, it is not locked
    fs.closeSync(fd);
    return false;
  } catch ( err ) {
    if ( err.code === "EBUSY" ) return true;        // If the file could not be opened, it is locked
    else if ( err.code === "ENOENT" ) return false; // If the file can't be found, it's not locked
    throw err;
  }
}

/* -------------------------------------------- */

/**
 * @typedef {object} OperationPaths
 * @property {string} source  The source data files.
 * @property {string} pack    The path to the compendium pack.
 */

/**
 * Determine compendium pack and source data paths based on the current configuration or command-line arguments.
 * @param {CLIArgs} argv               The command-line arguments.
 * @param {"pack"|"unpack"} operation  The operation.
 * @returns {OperationPaths|{}}        The paths required for the operation, or nothing if they could not be determined.
 */
function determinePaths(argv, operation) {
  const usingDefault = !argv.outputDirectory || !argv.inputDirectory;
  if ( usingDefault && (!currentPackageId || !currentPackageType) ) {
    console.error("Package ID or type could not be determined. Use `package workon <id>` to set it.");
    return {};
  }

  const dataPath = Config.instance.get("dataPath");
  if ( usingDefault && !dataPath ) {
    console.error("No dataPath configured. Use `configure set dataPath <path>` to set it.");
    return {};
  }

  const typeDir = `${currentPackageType.toLowerCase()}s`;
  const compendiumName = argv.compendiumName ?? argv.value;
  if ( !compendiumName ) {
    console.error("No compendium name provided. Use `-n <name>` to supply it.");
    return {};
  }

  let pack = operation === "pack" ? argv.outputDirectory : argv.inputDirectory;
  let source = operation === "pack" ? argv.inputDirectory : argv.outputDirectory;
  if ( pack ) pack = path.join(pack, compendiumName);
  else pack = path.join(dataPath, "Data", typeDir, currentPackageId, "packs", compendiumName);
  source ??= path.join(pack, "_source");
  if ( argv.nedb ) pack += ".db";
  return { source: path.resolve(path.normalize(source)), pack: path.resolve(path.normalize(pack)) };
}

/* -------------------------------------------- */
/*  Workon                                      */
/* -------------------------------------------- */

/**
 * Set the current package ID and type
 * @param {CLIArgs} argv  The command line arguments
 */
function handleWorkon(argv) {
  if ( argv.value ) currentPackageId = argv.value;
  Config.instance.set("currentPackageId", currentPackageId);

  // Attempt to automatically determine the package type.
  if ( !argv.type ) {
    const game = discoverPackageDirectory(argv);
    const pkgCount = game.packages.filter(p => p.id === currentPackageId).length;
    if ( pkgCount > 1 ) {
      console.error(chalk.red(`Multiple packages with ID ${chalk.cyan(currentPackageId)} found. `
        + `Please specify the package type with ${chalk.yellow("--type")}`));
      process.exitCode = 1;
      return;
    }
    const pkg = game.worlds.get(currentPackageId)
      ?? game.systems.get(currentPackageId)
      ?? game.modules.get(currentPackageId);
    if ( !pkg ) {
      console.error(chalk.red(`No package with ID ${chalk.cyan(currentPackageId)} found.`));
      process.exitCode = 1;
      return;
    }
    currentPackageType = pkg.type;
  }

  Config.instance.set("currentPackageType", currentPackageType);
  console.log(`Swapped to ${chalk.magenta(currentPackageType)} ${chalk.cyan(currentPackageId)}`);
}

/* -------------------------------------------- */

/**
 * Clear the current package ID and type
 */
function handleClear() {
  currentPackageId = null;
  currentPackageType = null;
  Config.instance.set("currentPackageId", currentPackageId);
  Config.instance.set("currentPackageType", currentPackageType);
  console.log("Cleared current Package");
}

/* -------------------------------------------- */
/*  Unpacking                                   */
/* -------------------------------------------- */

/**
 * Load a compendium pack and serialize the DB entries, each to their own file
 * @param {CLIArgs} argv  The command line arguments
 * @returns {Promise<void>}
 */
async function handleUnpack(argv) {
  const { source, pack } = determinePaths(argv, "unpack");
  if ( !source || !pack ) {
    process.exitCode = 1;
    return;
  }

  let documentType;
  const { nedb, yaml, clean } = argv;
  if ( nedb ) {
    documentType = determineDocumentType(pack, argv);
    if ( !documentType ) {
      process.exitCode = 1;
      return;
    }
  }

  if ( !nedb && isFileLocked(path.join(pack, "LOCK")) ) {
    console.error(chalk.red(`The pack "${chalk.blue(pack)}" is currently in use by Foundry VTT. `
      + "Please close Foundry VTT and try again."));
    process.exitCode = 1;
    return;
  }

  const dbMode = nedb ? "nedb" : "classic-level";
  console.log(`[${dbMode}] Unpacking "${chalk.blue(pack)}" to "${chalk.blue(source)}"`);

  try {
    await extractPack(pack, source, { nedb, yaml, documentType, clean, log: true });
  } catch ( err ) {
    console.error(err);
    process.exitCode = 1;
  }
}

/* -------------------------------------------- */
/*  Packing                                     */
/* -------------------------------------------- */

/**
 * Read serialized files from a directory and write them to a compendium pack.
 * @param {CLIArgs} argv  The command line arguments
 * @returns {Promise<void>}
 * @private
 */
async function handlePack(argv) {
  const { source, pack } = determinePaths(argv, "pack");
  if ( !source || !pack ) {
    process.exitCode = 1;
    return;
  }

  const { nedb, yaml, recursive } = argv;
  if ( !nedb && isFileLocked(path.join(pack, "LOCK")) ) {
    console.error(chalk.red(`The pack "${chalk.blue(pack)}" is currently in use by Foundry VTT. `
      + "Please close Foundry VTT and try again."));
    process.exitCode = 1;
    return;
  }

  const dbMode = nedb ? "nedb" : "classic-level";
  console.log(`[${dbMode}] Packing "${chalk.blue(source)}" into "${chalk.blue(pack)}"`);

  try {
    await compilePack(source, pack, { nedb, yaml, recursive, log: true });
  } catch ( err ) {
    console.error(err);
    process.exitCode = 1;
  }
}

node_modules/@foundryvtt/foundryvtt-cli/config.example.yml (4 lines, generated, vendored, Normal file)
@@ -0,0 +1,4 @@
installPath: C:\Program Files\Foundry Virtual Tabletop
dataPath: C:\Users\Example\AppData\Local\FoundryVTT\Data
currentPackageId: example-id
currentPackageType: World

node_modules/@foundryvtt/foundryvtt-cli/config.mjs (107 lines, generated, vendored, Normal file)
@@ -0,0 +1,107 @@
import fs from "fs";
import yaml from "js-yaml";
import path from "path";
import * as os from "os";

/**
 * Manages the configuration of the CLI. Stored as config.yml
 */
export default class Config {

  /**
   * The singleton instance.
   * @type {Config|null}
   */
  static #instance = null;

  /* -------------------------------------------- */

  /**
   * Get the singleton instance of the Config class
   * @returns {Config}
   */
  static get instance() {
    if ( !this.#instance ) this.#instance = new Config();
    return this.#instance;
  }

  /* -------------------------------------------- */

  constructor() {

    // Set the config file path to the appData directory
    let basePath = os.homedir();
    switch ( process.platform ) {
      case "win32": basePath = process.env.APPDATA || path.join(basePath, "AppData", "Roaming"); break;
      case "darwin": basePath = path.join(basePath, "Library", "Preferences"); break;
      case "linux": basePath = process.env.XDG_DATA_HOME || path.join(basePath, ".local", "share"); break;
    }

    fs.mkdirSync(basePath, { recursive: true });
    this.configPath = path.join(basePath, ".fvttrc.yml");

    // Ensure the config file exists
    if ( !fs.existsSync(this.configPath) ) fs.writeFileSync(this.configPath, yaml.dump({}));
    this.#config = yaml.load(fs.readFileSync(this.configPath, "utf8"));
  }

  /* -------------------------------------------- */

  /**
   * The configuration data.
   * @type {Record<string, any>}
   */
  #config = {};

  /* -------------------------------------------- */

  /**
   * The path to the configuration file.
   * @type {string}
   */
  configPath = "";

  /* -------------------------------------------- */

  /**
   * Get the entire configuration object
   * @returns {Record<string, any>}
   */
  getAll() {
    return this.#config;
  }

  /* -------------------------------------------- */

  /**
   * Get a specific configuration value
   * @param {string} key  The configuration key
   * @returns {any}
   */
  get(key) {
    return this.#config[key];
  }

  /* -------------------------------------------- */

  /**
   * Set a specific configuration value
   * @param {string} key  The configuration key
   * @param {any} value   The configuration value
   */
  set(key, value) {
    this.#config[key] = value;

    // Write to disk
    this.#writeConfig();
  }

  /* -------------------------------------------- */

  /**
   * Write the configuration to disk
   */
  #writeConfig() {
    fs.writeFileSync(this.configPath, yaml.dump(this.#config));
  }
}

node_modules/@foundryvtt/foundryvtt-cli/fvtt.mjs (15 lines, generated, vendored, Executable file)
@@ -0,0 +1,15 @@
#!/usr/bin/env node

import yargs from "yargs";
import { hideBin } from "yargs/helpers";
import { getCommand as configureCommand } from "./commands/configuration.mjs";
import { getCommand as packageCommand } from "./commands/package.mjs";
import { getCommand as launchCommand } from "./commands/launch.mjs";

const argv = yargs(hideBin(process.argv))
  .usage("Usage: $0 <command> [options]")
  .command(configureCommand())
  .command(packageCommand())
  .command(launchCommand())
  .help().alias("help", "h")
  .argv;

node_modules/@foundryvtt/foundryvtt-cli/index.mjs (1 line, generated, vendored, Normal file)
@@ -0,0 +1 @@
export { compilePack, extractPack } from "./lib/package.mjs";

node_modules/@foundryvtt/foundryvtt-cli/lib/package.mjs (552 lines, generated, vendored, Normal file)
@@ -0,0 +1,552 @@
import fs from "fs";
import path from "path";
import Datastore from "nedb-promises";
import chalk from "chalk";
import { default as YAML } from "js-yaml";
import { ClassicLevel } from "classic-level";

/* -------------------------------------------- */
/*  Configuration                               */
/* -------------------------------------------- */

/**
 * @typedef {
 *   "Actor"|"Adventure"|"Cards"|"ChatMessage"|"Combat"|"FogExploration"|"Folder"|"Item"|"JournalEntry"|"Macro"|
 *   "Playlist"|"RollTable"|"Scene"|"Setting"|"User"
 * } DocumentType
 */

/**
 * @typedef {
 *   "actors"|"adventures"|"cards"|"messages"|"combats"|"fog"|"folders"|"items"|"journal"|"macros"|"playlists"|
 *   "tables"|"scenes"|"settings"|"users"
 * } DocumentCollection
 */

/**
 * @typedef {object} PackageOptions
 * @property {boolean} [nedb=false]               Whether to operate on a NeDB database, otherwise a LevelDB database
 *                                                is assumed.
 * @property {boolean} [yaml=false]               Whether the source files are in YAML format, otherwise JSON is
 *                                                assumed.
 * @property {boolean} [log=false]                Whether to log operation progress to the console.
 * @property {EntryTransformer} [transformEntry]  A function that is called on every entry to transform it.
 */

/**
 * @typedef {PackageOptions} CompileOptions
 * @property {boolean} [recursive=false]  Whether to recurse into child directories to locate source files, otherwise
 *                                        only source files located in the root directory will be used.
 */

/**
 * @typedef {PackageOptions} ExtractOptions
 * @property {object} [yamlOptions]             Options to pass to yaml.dump when serializing Documents.
 * @property {JSONOptions} [jsonOptions]        Options to pass to JSON.stringify when serializing Documents.
 * @property {DocumentType} [documentType]      Required only for NeDB packs in order to generate a correct key.
 * @property {boolean} [clean]                  Delete the destination directory before unpacking.
 * @property {DocumentCollection} [collection]  Required only for NeDB packs in order to generate a correct key. Can be
 *                                              used instead of documentType if known.
 * @property {NameTransformer} [transformName]  A function that is used to generate a filename for the extracted
 *                                              Document. If used, the generated name must include the appropriate file
 *                                              extension. The generated name will be resolved against the root path
 *                                              provided to the operation, and the entry will be written to that
 *                                              resolved location.
 */

/**
 * @typedef {object} JSONOptions
 * @property {JSONReplacer|Array<string|number>} [replacer]  A replacer function or an array of property names in the
 *                                                           object to include in the resulting string.
 * @property {string|number} [space]                         A number of spaces or a string to use as indentation.
 */

/**
 * @callback JSONReplacer
 * @param {string} key  The key being stringified.
 * @param {any} value   The value being stringified.
 * @returns {any}       The value returned is substituted instead of the current property's value.
 */

/**
 * @callback EntryTransformer
 * @param {object} entry           The entry data.
 * @returns {Promise<false|void>}  Return boolean false to indicate that this entry should be discarded.
 */

/**
 * @callback NameTransformer
 * @param {object} entry            The entry data.
 * @returns {Promise<string|void>}  If a string is returned, it is used as the filename that the entry will be
 *                                  written to.
 */

/**
 * @callback HierarchyApplyCallback
 * @param {object} doc              The Document being operated on.
 * @param {string} collection       The Document's collection.
 * @param {object} [options]        Additional options supplied by the invocation on the level above this one.
 * @returns {Promise<object|void>}  Options to supply to the next level of the hierarchy.
 */

/**
 * @callback HierarchyMapCallback
 * @param {any} entry          The element stored in the collection.
 * @param {string} collection  The collection name.
 * @returns {Promise<any>}
 */

/**
 * A flattened view of the Document hierarchy. The type of the value determines what type of collection it is. Arrays
 * represent embedded collections, while objects represent embedded documents.
 * @type {Record<string, Record<string, object|Array>>}
 */
const HIERARCHY = {
  actors: {
    items: [],
    effects: []
  },
  cards: {
    cards: []
  },
  combats: {
    combatants: []
  },
  delta: {
    items: [],
    effects: []
  },
  items: {
    effects: []
  },
  journal: {
    pages: []
  },
  playlists: {
    sounds: []
  },
  regions: {
    behaviors: []
  },
  tables: {
    results: []
  },
  tokens: {
    delta: {}
  },
  scenes: {
    drawings: [],
    tokens: [],
    lights: [],
    notes: [],
    regions: [],
    sounds: [],
    templates: [],
    tiles: [],
    walls: []
  }
};

/**
 * A mapping of primary document types to collection names.
 * @type {Record<DocumentType, DocumentCollection>}
 */
export const TYPE_COLLECTION_MAP = {
  Actor: "actors",
  Adventure: "adventures",
  Cards: "cards",
  ChatMessage: "messages",
  Combat: "combats",
  FogExploration: "fog",
  Folder: "folders",
  Item: "items",
  JournalEntry: "journal",
  Macro: "macros",
  Playlist: "playlists",
  RollTable: "tables",
  Scene: "scenes",
  Setting: "settings",
  User: "users"
};

/* -------------------------------------------- */
/*  Compiling                                   */
/* -------------------------------------------- */

/**
 * Compile source files into a compendium pack.
 * @param {string} src   The directory containing the source files.
 * @param {string} dest  The target compendium pack. This should be a directory for LevelDB packs, or a .db file for
 *                       NeDB packs.
 * @param {CompileOptions} [options]
 * @returns {Promise<void>}
 */
export async function compilePack(src, dest, {
  nedb=false, yaml=false, recursive=false, log=false, transformEntry
}={}) {
  if ( nedb && (path.extname(dest) !== ".db") ) {
    throw new Error("The nedb option was passed to compilePacks, but the target pack does not have a .db extension.");
  }
  const files = findSourceFiles(src, { yaml, recursive });
  if ( nedb ) return compileNedb(dest, files, { log, transformEntry });
  return compileClassicLevel(dest, files, { log, transformEntry });
}

/* -------------------------------------------- */

/**
 * Compile a set of files into a NeDB compendium pack.
 * @param {string} pack     The target compendium pack.
 * @param {string[]} files  The source files.
 * @param {Partial<PackageOptions>} [options]
 * @returns {Promise<void>}
 */
async function compileNedb(pack, files, { log, transformEntry }={}) {
  // Delete the existing NeDB file if it exists.
  try {
    fs.unlinkSync(pack);
  } catch ( err ) {
    if ( err.code !== "ENOENT" ) throw err;
  }

  // Create a new NeDB Datastore.
  const db = Datastore.create(pack);
  const seenKeys = new Set();
  const packDoc = applyHierarchy(doc => {
    if ( seenKeys.has(doc._key) ) {
      throw new Error(`An entry with key '${doc._key}' was already packed and would be overwritten by this entry.`);
    }
    seenKeys.add(doc._key);
    delete doc._key;
  });

  // Iterate over all source files, writing them to the DB.
  for ( const file of files ) {
    try {
      const contents = fs.readFileSync(file, "utf8");
      const ext = path.extname(file);
      const isYaml = ext === ".yml" || ext === ".yaml";
      const doc = isYaml ? YAML.load(contents) : JSON.parse(contents);
      const key = doc._key;
      const [, collection] = key.split("!");
      // If the key starts with !folders, we should skip packing it as NeDB doesn't support folders.
      if ( key.startsWith("!folders") ) continue;
      if ( await transformEntry?.(doc) === false ) continue;
      await packDoc(doc, collection);
      await db.insert(doc);
      if ( log ) console.log(`Packed ${chalk.blue(doc._id)}${chalk.blue(doc.name ? ` (${doc.name})` : "")}`);
    } catch ( err ) {
      if ( log ) console.error(`Failed to pack ${chalk.red(file)}. See error below.`);
      throw err;
    }
  }

  // Compact the DB.
  db.stopAutocompaction();
  await new Promise(resolve => db.compactDatafile(resolve));
}

/* -------------------------------------------- */

/**
 * Compile a set of files into a LevelDB compendium pack.
 * @param {string} pack     The target compendium pack.
 * @param {string[]} files  The source files.
 * @param {Partial<PackageOptions>} [options]
 * @returns {Promise<void>}
 */
async function compileClassicLevel(pack, files, { log, transformEntry }={}) {
  // Create the classic level directory if it doesn't already exist.
  fs.mkdirSync(pack, { recursive: true });

  // Load the directory as a ClassicLevel DB.
  const db = new ClassicLevel(pack, { keyEncoding: "utf8", valueEncoding: "json" });
  const batch = db.batch();
  const seenKeys = new Set();

  const packDoc = applyHierarchy(async (doc, collection) => {
    const key = doc._key;
    delete doc._key;
    if ( seenKeys.has(key) ) {
      throw new Error(`An entry with key '${key}' was already packed and would be overwritten by this entry.`);
    }
    seenKeys.add(key);
    const value = structuredClone(doc);
    await mapHierarchy(value, collection, d => d._id);
    batch.put(key, value);
  });

  // Iterate over all files in the input directory, writing them to the DB.
  for ( const file of files ) {
    try {
      const contents = fs.readFileSync(file, "utf8");
      const ext = path.extname(file);
      const isYaml = ext === ".yml" || ext === ".yaml";
      const doc = isYaml ? YAML.load(contents) : JSON.parse(contents);
      const [, collection] = doc._key.split("!");
      if ( await transformEntry?.(doc) === false ) continue;
      await packDoc(doc, collection);
      if ( log ) console.log(`Packed ${chalk.blue(doc._id)}${chalk.blue(doc.name ? ` (${doc.name})` : "")}`);
    } catch ( err ) {
      if ( log ) console.error(`Failed to pack ${chalk.red(file)}. See error below.`);
      throw err;
    }
  }

  // Remove any entries in the DB that are not part of the source set.
  for ( const key of await db.keys().all() ) {
    if ( !seenKeys.has(key) ) {
      batch.del(key);
      if ( log ) console.log(`Removed ${chalk.blue(key)}`);
    }
  }

  await batch.write();
  await compactClassicLevel(db);
  await db.close();
}

/* -------------------------------------------- */

/**
 * Flushes the log of the given database to create compressed binary tables.
 * @param {ClassicLevel} db  The database to compress.
 * @returns {Promise<void>}
 */
async function compactClassicLevel(db) {
  const forwardIterator = db.keys({ limit: 1, fillCache: false });
  const firstKey = await forwardIterator.next();
  await forwardIterator.close();

  const backwardIterator = db.keys({ limit: 1, reverse: true, fillCache: false });
  const lastKey = await backwardIterator.next();
  await backwardIterator.close();

  if ( firstKey && lastKey ) return db.compactRange(firstKey, lastKey, { keyEncoding: "utf8" });
}

/* -------------------------------------------- */
/*  Extracting                                  */
/* -------------------------------------------- */

/**
 * Extract the contents of a compendium pack into individual source files for each primary Document.
 * @param {string} src   The source compendium pack. This should be a directory for LevelDB packs, or a .db file for
 *                       NeDB packs.
 * @param {string} dest  The directory to write the extracted files into.
 * @param {ExtractOptions} [options]
 * @returns {Promise<void>}
 */
export async function extractPack(src, dest, {
  nedb=false, yaml=false, yamlOptions={}, jsonOptions={}, log=false, documentType, collection, clean, transformEntry,
  transformName
}={}) {
  if ( nedb && (path.extname(src) !== ".db") ) {
    throw new Error("The nedb option was passed to extractPacks, but the target pack does not have a .db extension.");
  }
  collection ??= TYPE_COLLECTION_MAP[documentType];
  if ( nedb && !collection ) {
    throw new Error("For NeDB operations, a documentType or collection must be provided.");
  }
  if ( clean ) fs.rmSync(dest, { force: true, recursive: true, maxRetries: 10 });
  // Create the output directory if it doesn't exist already.
  fs.mkdirSync(dest, { recursive: true });
  if ( nedb ) {
    return extractNedb(src, dest, { yaml, yamlOptions, jsonOptions, log, collection, transformEntry, transformName });
  }
  return extractClassicLevel(src, dest, { yaml, log, yamlOptions, jsonOptions, transformEntry, transformName });
}

/* -------------------------------------------- */

/**
 * Extract a NeDB compendium pack into individual source files for each primary Document.
 * @param {string} pack  The source compendium pack.
 * @param {string} dest  The root output directory.
 * @param {Partial<ExtractOptions>} [options]
 * @returns {Promise<void>}
 */
async function extractNedb(pack, dest, {
  yaml, yamlOptions, jsonOptions, log, collection, transformEntry, transformName
}={}) {
  // Load the NeDB file.
  const db = new Datastore({ filename: pack, autoload: true });

  const unpackDoc = applyHierarchy((doc, collection, { sublevelPrefix, idPrefix }={}) => {
    const sublevel = keyJoin(sublevelPrefix, collection);
    const id = keyJoin(idPrefix, doc._id);
    doc._key = `!${sublevel}!${id}`;
    return { sublevelPrefix: sublevel, idPrefix: id };
  });

  // Iterate over all entries in the DB, writing them as source files.
  const docs = await db.find({});
  for ( const doc of docs ) {
    await unpackDoc(doc, collection);
    if ( await transformEntry?.(doc) === false ) continue;
    let name = await transformName?.(doc);
    if ( !name ) {
      name = `${doc.name ? `${getSafeFilename(doc.name)}_${doc._id}` : doc._id}.${yaml ? "yml" : "json"}`;
    }
    const filename = path.join(dest, name);
    serializeDocument(doc, filename, { yaml, yamlOptions, jsonOptions });
    if ( log ) console.log(`Wrote ${chalk.blue(name)}`);
  }
}

/* -------------------------------------------- */

/**
 * Extract a LevelDB pack into individual source files for each primary Document.
 * @param {string} pack  The source compendium pack.
 * @param {string} dest  The root output directory.
 * @param {Partial<ExtractOptions>} [options]
 * @returns {Promise<void>}
 */
async function extractClassicLevel(pack, dest, {
  yaml, yamlOptions, jsonOptions, log, transformEntry, transformName
}) {
  // Load the directory as a ClassicLevel DB.
  const db = new ClassicLevel(pack, { keyEncoding: "utf8", valueEncoding: "json" });

  const unpackDoc = applyHierarchy(async (doc, collection, { sublevelPrefix, idPrefix }={}) => {
    const sublevel = keyJoin(sublevelPrefix, collection);
    const id = keyJoin(idPrefix, doc._id);
    doc._key = `!${sublevel}!${id}`;
    await mapHierarchy(doc, collection, (embeddedId, embeddedCollectionName) => {
      return db.get(`!${sublevel}.${embeddedCollectionName}!${id}.${embeddedId}`);
    });
    return { sublevelPrefix: sublevel, idPrefix: id };
  });

  // Iterate over all entries in the DB, writing them as source files.
  for await ( const [key, doc] of db.iterator() ) {
    const [, collection, id] = key.split("!");
    if ( collection.includes(".") ) continue; // This is not a primary document, skip it.
    await unpackDoc(doc, collection);
    if ( await transformEntry?.(doc) === false ) continue;
    let name = await transformName?.(doc);
    if ( !name ) {
      name = `${doc.name ? `${getSafeFilename(doc.name)}_${id}` : key}.${yaml ? "yml" : "json"}`;
    }
    const filename = path.join(dest, name);
    serializeDocument(doc, filename, { yaml, yamlOptions, jsonOptions });
    if ( log ) console.log(`Wrote ${chalk.blue(name)}`);
  }

  await db.close();
}

/* -------------------------------------------- */
/*  Utilities                                   */
/* -------------------------------------------- */

/**
 * Wrap a function so that it can be applied recursively to a Document's hierarchy.
 * @param {HierarchyApplyCallback} fn  The function to wrap.
 * @returns {HierarchyApplyCallback}   The wrapped function.
 */
function applyHierarchy(fn) {
  const apply = async (doc, collection, options={}) => {
    const newOptions = await fn(doc, collection, options);
    for ( const [embeddedCollectionName, type] of Object.entries(HIERARCHY[collection] ?? {}) ) {
      const embeddedValue = doc[embeddedCollectionName];
      if ( Array.isArray(type) && Array.isArray(embeddedValue) ) {
        for ( const embeddedDoc of embeddedValue ) await apply(embeddedDoc, embeddedCollectionName, newOptions);
      }
      else if ( embeddedValue ) await apply(embeddedValue, embeddedCollectionName, newOptions);
    }
  };
  return apply;
}

/* -------------------------------------------- */

/**
 * Transform a Document's embedded collections by applying a function to them.
 * @param {object} doc               The Document being operated on.
 * @param {string} collection        The Document's collection.
 * @param {HierarchyMapCallback} fn  The function to invoke.
 */
async function mapHierarchy(doc, collection, fn) {
  for ( const [embeddedCollectionName, type] of Object.entries(HIERARCHY[collection] ?? {}) ) {
    const embeddedValue = doc[embeddedCollectionName];
    if ( Array.isArray(type) ) {
      if ( Array.isArray(embeddedValue) ) {
        doc[embeddedCollectionName] = await Promise.all(embeddedValue.map(entry => {
          return fn(entry, embeddedCollectionName);
        }));
      }
      else doc[embeddedCollectionName] = [];
    } else {
      if ( embeddedValue ) doc[embeddedCollectionName] = await fn(embeddedValue, embeddedCollectionName);
      else doc[embeddedCollectionName] = null;
    }
  }
}

/* -------------------------------------------- */

/**
 * Locate all source files in the given directory.
 * @param {string} root  The root directory to search in.
 * @param {Partial<CompileOptions>} [options]
 * @returns {string[]}
 */
function findSourceFiles(root, { yaml=false, recursive=false }={}) {
  const files = [];
  for ( const entry of fs.readdirSync(root, { withFileTypes: true }) ) {
    const name = path.join(root, entry.name);
    if ( entry.isDirectory() && recursive ) {
      files.push(...findSourceFiles(name, { yaml, recursive }));
      continue;
    }
    if ( !entry.isFile() ) continue;
    const ext = path.extname(name);
    const isYaml = (ext === ".yml") || (ext === ".yaml");
    if ( yaml && isYaml ) files.push(name);
    else if ( !yaml && (ext === ".json") ) files.push(name);
  }
  return files;
}

/* -------------------------------------------- */

/**
 * Serialize a Document and write it to the filesystem.
 * @param {object} doc        The Document to serialize.
 * @param {string} filename   The filename to write it to.
 * @param {Partial<ExtractOptions>} [options]  Options to configure serialization behavior.
 */
function serializeDocument(doc, filename, { yaml, yamlOptions, jsonOptions }={}) {
  fs.mkdirSync(path.dirname(filename), { recursive: true });
  let serialized;
  if ( yaml ) serialized = YAML.dump(doc, yamlOptions);
  else {
    const { replacer=null, space=2 } = jsonOptions;
    serialized = JSON.stringify(doc, replacer, space);
  }
  fs.writeFileSync(filename, serialized + "\n");
}

/* -------------------------------------------- */

/**
 * Join non-blank key parts.
 * @param {...string} args  Key parts.
 * @returns {string}
 */
function keyJoin(...args) {
  return args.filter(_ => _).join(".");
}

/* -------------------------------------------- */

/**
 * Ensure a string is safe for use as a filename.
 * @param {string} filename  The filename to sanitize
 * @returns {string}         The sanitized filename
 */
function getSafeFilename(filename) {
  return filename.replace(/[^a-zA-Z0-9А-я]/g, '_');
}

node_modules/@foundryvtt/foundryvtt-cli/package.json (34 lines, generated, vendored, Normal file)
@@ -0,0 +1,34 @@
{
  "name": "@foundryvtt/foundryvtt-cli",
  "productName": "Foundry VTT CLI",
  "description": "The Official CLI for Foundry VTT",
  "version": "1.0.3",
  "author": {
    "name": "Foundry Gaming LLC",
    "email": "admin@foundryvtt.com",
    "url": "https://foundryvtt.com"
  },
  "main": "index.mjs",
  "bin": {
    "fvtt": "fvtt.mjs"
  },
  "homepage": "https://foundryvtt.com",
  "license": "MIT",
  "private": false,
  "dependencies": {
    "chalk": "^5.2.0",
    "classic-level": "^1.2.0",
    "esm": "^3.2.25",
    "js-yaml": "^4.1.0",
    "mkdirp": "^3.0.0",
    "nedb-promises": "^6.2.1",
    "yargs": "^17.7.1"
  },
  "devDependencies": {
    "eslint": "^8.47.0",
    "eslint-plugin-jsdoc": "^46.4.6"
  },
  "engines": {
    "node": ">17.0.0"
  }
}