Change endpoint from persons to people

xfarrow
2025-03-23 21:00:08 +01:00
parent 4ae263662c
commit d005193f63
7158 changed files with 700476 additions and 735 deletions

2380
backend/apis/nodejs/node_modules/knex/CHANGELOG.md generated vendored Normal file

File diff suppressed because it is too large

194
backend/apis/nodejs/node_modules/knex/CONTRIBUTING.md generated vendored Normal file

@@ -0,0 +1,194 @@
## How to contribute to Knex.js
- Make changes in the `/lib` directory.
- Before sending a pull request for a feature or bug fix, be sure to have
[tests](https://github.com/knex/knex/tree/master/test). Every pull request that changes the queries should also have
**integration tests that are run against a real database** (in addition to unit tests, which check what kind of queries
are being created).
- Use the same coding style as the rest of the
[codebase](https://github.com/knex/knex/blob/master/knex.js).
- All pull requests should be made to the `master` branch.
- The pull request description should link to the corresponding PR of the documentation branch.
- All pull requests that modify the public API should also update [types/index.d.ts](https://github.com/knex/knex/blob/master/types/index.d.ts)
## Documentation
Documentation is no longer maintained in the knex master repository. All documentation pull requests should be sent to https://github.com/knex/documentation
Documentation pull requests should not be merged before the knex version containing the newly documented feature is released.
## I would like to add support for a new dialect to knex, is it possible?
There are already too many dialects supported in `knex`. Instead of adding new dialects to the central codebase, dialects should be moved out of the `knex` core library into separate npm packages, with their respective maintainers and test suites.
So if you would like to write your own dialect, you can inherit your dialect from the knex base classes and use it by passing the dialect to knex in the knex configuration (https://runkit.com/embed/90b3cpyr4jh2):
```js
// Simple dialect overriding the sqlite3 dialect to use the sqlite3-offline driver
const Knex = require('knex');

const Dialect = require('knex/lib/dialects/sqlite3/index.js');
Dialect.prototype._driver = () => require('sqlite3-offline');

const knex = Knex({
  client: Dialect,
  connection: ':memory:',
});

console.log(knex.select(knex.raw(1)).toSQL());

// Top-level await is not available in CommonJS, so wrap the async calls
(async () => {
  await knex.schema.createTable('fooobar', (t) => {
    t.bigincrements('id');
    t.string('data');
  });
  await knex('fooobar').insert({ data: 'nomnom' });
  console.log('Gimme all the data:', await knex('fooobar'));
})();
```
## What is the minimal code to reproduce a bug, and why do I have to provide it when I could just describe the problem?
Writing a minimal reproduction for a problem is time-consuming and sometimes genuinely hard, for
example when the original code where the bug occurs is written using express or mocha. So why is it necessary
to commit so much time to it when the problem is in `knex`? Shouldn't contributors be grateful that
the bug was reported at all?
The point of runnable code that reproduces the problem is to make it easy to verify that there really is a problem and that the
reporter did nothing wrong (surprisingly often the problem is in the user's code). Complete code, rather than just a description
of what to do, encourages developers to actually verify that the problem exists and start solving it, which
saves a lot of time.
tl;dr list:
1. In most cases the reporter already figures out the actual problem while writing the minimal test case;
and if the problem was in how things were initialized or how async code was written, it becomes easy to point out.
2. It motivates developers to actually try the bug out, instead of having to figure out from an incomplete example
which environment the bug manifests in and how.
3. There are currently very few people fixing knex issues, and if one has to spend 15-30 minutes on an issue just
to find that it cannot be reproduced, that wastes development hours that could have gone into improving knex.
A test case should initialize the needed tables, insert the needed data, and fail:
```js
const knex = require('knex')({
  client: 'pg',
  connection: 'postgres:///knex_test'
});

async function main() {
  // initialize the tables and data needed to trigger the bug
  await knex.schema.createTable(/* ... */);
  await knex('table').insert({ foo: 'bar' });
  await knex.destroy();
}

main();
```
Usually issues without reproduction code available are just closed; if the same issue is reported multiple
times, maybe someone will look into it.
One easy way to set up a database for your reproduction is to use the database from knex's docker-compose setup (`npm run db:start`) and to check the connection settings from the tests' `test/knexfile.js`.
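For example, a sketch of a reproduction config; the connection values below are placeholders, take the real ones from `test/knexfile.js`:
```js
// Hypothetical settings; copy the actual values from test/knexfile.js
const knex = require('knex')({
  client: 'pg',
  connection: {
    host: '127.0.0.1',
    user: 'postgres',
    password: 'postgres',
    database: 'knex_test',
  },
});
```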
## Integration Tests
### The Easy Way
By default, Knex runs tests against the sqlite3, postgresql, mysql, mysql2, mssql and oracledb drivers. All databases can be initialized and run with Docker.
Docker databases can be started and initialized with:
```bash
npm run db:start
```
and stopped with:
```bash
npm run db:stop
```
In case you don't need all of the databases, you can use a simplified dev Docker configuration that only runs PostgreSQL, by running `npm run db:start:postgres` and `npm run db:stop:postgres` accordingly, as shown below.
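For example:
```bash
npm run db:start:postgres
# ... run the tests ...
npm run db:stop:postgres
```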
### Installing support for oracledb
Oracle has started providing precompiled driver libs for all platforms, which makes it viable to also run the oracle tests locally against an oracledb instance running in Docker.
Check the message printed when running
```bash
npm install oracledb
```
and download the driver library binary packages and unzip them into the `~/lib` directory.
### Specifying Databases
You can optionally specify which dialects to test using the `DB` environment variable. Values should be space separated and can include:
- mysql
- mysql2
- postgres
- sqlite3
- oracledb
- mssql
```bash
$ DB='postgres mysql' npm test
```
### Custom Configuration
If you'd like to override the database configuration (to use a different host, for example), you can override the path to the [default test configuration](https://github.com/knex/knex/blob/master/test/knexfile.js) using the `KNEX_TEST` environment variable.
```bash
$ KNEX_TEST='./path/to/my/config.js' npm test
```
### Creating Postgres User
If you are running tests against your own local database, you might need to set up a test user and database for knex to connect to.
To create a new user, log in to Postgres and use the following queries to add the user. This assumes you've already created the `knex_test` database.
```sql
CREATE ROLE postgres WITH LOGIN PASSWORD '';
GRANT ALL PRIVILEGES ON DATABASE "knex_test" TO postgres;
```
Once this is done, check that it works by attempting to log in:
```bash
psql -h localhost -U postgres -d knex_test
```
## Typescript source files
> TL;DR: Starting with release 2.0.0, Knex adds support for Typescript source files. Thus, to develop in this repo you will need to run `npm run build` each time you edit `.ts` files, to generate the resulting `.js` files. This is run automatically whenever you run `npm install` or check out a new Git branch, so when developing in Javascript you don't have to worry about it. It is encouraged that new functionality and sources be written in Typescript, but this is not required.
Starting with release 2.0.0, Knex supports source additions in Typescript! This allows for better type safety in the code being added. However, pre-2.0.0 Knex was always written in pure Javascript, so a "hybrid" approach is used for 2.0.0 to allow the new `.ts` files to exist alongside the `.js` files that make up the majority of this repository.
To develop in this repository, use the `npm run build` and `npm run clean` commands to compile the `.ts` files and to delete the generated `.js` and related files, respectively. If you wish to have `tsc` watch and recompile on changes, run `npm run build:ts -- --watch`. Note that for easy integration with Javascript the output files are emitted "side-by-side", meaning that `lib/foo/bar.ts` results in `lib/foo/bar.js`. Compilation runs automatically via the npm `"prepare"` script whenever you run `npm install`, and via a Git `post-checkout` hook (added by Husky) that executes when you run commands like `git checkout`, so you don't have to worry about this when working in pure Javascript.
The script `./scripts/update_gitignore_for_tsc_output.js` is called as part of the `npm run build` command; it updates the `lib/.gitignore` file, which ensures that the `.js` and related files generated by `tsc` compilation are not checked into the git repo.
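The typical loop, using the commands described above:
```bash
npm run build                  # compile .ts sources into side-by-side .js files
npm run clean                  # delete the generated .js and related files
npm run build:ts -- --watch    # recompile automatically on changes
```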
## Want to be a Collaborator?
There is always room for more collaborators. Be active in resolving GitHub issues, sending pull requests, and reviewing code, and we will ask you to join.
### Etiquette (/ˈɛtᵻkɛt/ or /ˈɛtᵻkɪt/, French: [e.ti.kɛt])
Make pull requests for your changes; do not commit directly to master (release chores like fixing the changelog are ok though).
All pull requests must be peer reviewed by another collaborator, so don't merge your request before that. If there is no response, ping the others.
If you are going to add a new feature to knex (not just a bugfix), it should be discussed with others first to agree on the details.
Join the Gitter chat if you feel like chatting outside of GitHub issues.

22
backend/apis/nodejs/node_modules/knex/LICENSE generated vendored Normal file

@@ -0,0 +1,22 @@
Copyright (c) 2013-present Tim Griesser
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use,
copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.

149
backend/apis/nodejs/node_modules/knex/README.md generated vendored Normal file

@@ -0,0 +1,149 @@
# [knex.js](https://knex.github.io/documentation/)
[![npm version](http://img.shields.io/npm/v/knex.svg)](https://npmjs.org/package/knex)
[![npm downloads](https://img.shields.io/npm/dm/knex.svg)](https://npmjs.org/package/knex)
![](https://github.com/knex/knex/workflows/CI/badge.svg)
[![Coverage Status](https://coveralls.io/repos/knex/knex/badge.svg?branch=master)](https://coveralls.io/r/knex/knex?branch=master)
[![Dependencies Status](https://img.shields.io/librariesio/github/knex/knex)](https://libraries.io/npm/knex)
[![Gitter chat](https://badges.gitter.im/tgriesser/knex.svg)](https://gitter.im/tgriesser/knex)
> **A SQL query builder that is _flexible_, _portable_, and _fun_ to use!**
A batteries-included, multi-dialect (PostgreSQL, MySQL, CockroachDB, MSSQL, SQLite3, Oracle (including Oracle Wallet Authentication)) query builder for
Node.js, featuring:
- [transactions](https://knex.github.io/documentation/#Transactions)
- [connection pooling](https://knex.github.io/documentation/#Installation-pooling)
- [streaming queries](https://knex.github.io/documentation/#Interfaces-Streams)
- both a [promise](https://knex.github.io/documentation/#Interfaces-Promises) and [callback](https://knex.github.io/documentation/#Interfaces-Callbacks) API
- a [thorough test suite](https://github.com/knex/knex/actions)
Node.js versions 12+ are supported.
- Take a look at the [full documentation](https://knex.github.io/documentation) to get started!
- Browse the [list of plugins and tools](https://github.com/knex/knex/blob/master/ECOSYSTEM.md) built for knex
- Check out our [recipes wiki](https://github.com/knex/knex/wiki/Recipes) to search for solutions to some specific problems
- In case you are upgrading from an older version, see the [migration guide](https://github.com/knex/knex/blob/master/UPGRADING.md)
You can report bugs and discuss features on the [GitHub issues page](https://github.com/knex/knex/issues) or send tweets to [@kibertoad](http://twitter.com/kibertoad).
For support and questions, join our [Gitter channel](https://gitter.im/tgriesser/knex).
For a knex-based Object Relational Mapper, see:
- https://github.com/Vincit/objection.js
- https://github.com/mikro-orm/mikro-orm
- https://bookshelfjs.org
To see the SQL that Knex will generate for a given query, you can use the [Knex Query Lab](https://michaelavila.com/knex-querylab/).
## Examples
We have several examples [on the website](http://knexjs.org). Here is the first one to get you started:
```js
const knex = require('knex')({
  client: 'sqlite3',
  connection: {
    filename: './data.db',
  },
});

// Top-level await is not available with require(), so wrap the example
(async () => {
  try {
    // Create a table
    await knex.schema
      .createTable('users', (table) => {
        table.increments('id');
        table.string('user_name');
      })
      // ...and another
      .createTable('accounts', (table) => {
        table.increments('id');
        table.string('account_name');
        table.integer('user_id').unsigned().references('users.id');
      });

    // Then query the table...
    const insertedRows = await knex('users').insert({ user_name: 'Tim' });

    // ...and using the insert id, insert into the other table.
    await knex('accounts').insert({
      account_name: 'knex',
      user_id: insertedRows[0],
    });

    // Query both of the rows.
    const selectedRows = await knex('users')
      .join('accounts', 'users.id', 'accounts.user_id')
      .select('users.user_name as user', 'accounts.account_name as account');

    // map over the results
    const enrichedRows = selectedRows.map((row) => ({ ...row, active: true }));

    // Finally, add a catch statement
  } catch (e) {
    console.error(e);
  }
})();
```
## TypeScript example
```ts
import { Knex, knex } from 'knex';

interface User {
  id: number;
  age: number;
  name: string;
  active: boolean;
  departmentId: number;
}

const config: Knex.Config = {
  client: 'sqlite3',
  connection: {
    filename: './data.db',
  },
};

const knexInstance = knex(config);

// await is only valid inside an async function (or an ES module)
async function main() {
  try {
    const users = await knexInstance<User>('users').select('id', 'age');
  } catch (err) {
    // error handling
  }
}
```
## Usage as ESM module
If you are launching your Node application with `--experimental-modules`, `knex.mjs` should be picked up automatically and named ESM imports should work out of the box.
Otherwise, if you want to use named imports, you'll have to import knex like this:
```js
import { knex } from 'knex/knex.mjs';
```
You can also just do the default import:
```js
import knex from 'knex';
```
If you are not using TypeScript and would like the IntelliSense of your IDE to work correctly, it is recommended to set the type explicitly:
```js
/**
* @type {Knex}
*/
const database = knex({
client: 'mysql',
connection: {
host: '127.0.0.1',
user: 'your_database_user',
password: 'your_database_password',
database: 'myapp_test',
},
});
database.migrate.latest();
```

245
backend/apis/nodejs/node_modules/knex/UPGRADING.md generated vendored Normal file

@@ -0,0 +1,245 @@
## Upgrading to new knex.js versions
### Upgrading to version 2.0.0+
- Since `sqlite3` is maintained again, we have switched back to it. If you are using the `@vscode/sqlite3` driver dependency, please replace it with `sqlite3` in your `package.json`, as sketched below;
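One way to do the swap (a sketch; adjust to your package manager):
```bash
npm uninstall @vscode/sqlite3
npm install sqlite3
```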
### Upgrading to version 1.0.0+
- Node.js older than 12 is no longer supported; make sure to update your environment;
- If you are using the `sqlite3` driver dependency, please replace it with `@vscode/sqlite3` in your `package.json`;
- `RETURNING` operations now always return objects with column names (see the sketch after this list);
- The migrator now returns the list of migrations as objects.
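A minimal sketch of the new `RETURNING` behavior (table and column names are illustrative):
```js
// Previously insert(..., ['id']) could resolve to plain values like [1];
// from 1.0.0 on it resolves to objects keyed by column name.
async function insertUser(knex) {
  const rows = await knex('users').insert({ name: 'Tim' }, ['id']);
  return rows; // e.g. [{ id: 1 }] rather than [1]
}
```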
### Upgrading to version 0.95.0+
- TypeScript type exports changed significantly. While `import Knex from 'knex';` used to import the knex instantiation function, the namespace, and the interface for the knex instantiation function/object, there is now a clear distinction between them:
```typescript
import { knex } from 'knex'; // this is a function that you call to instantiate knex
import { Knex } from 'knex'; // this is a namespace, and a type of a knex object
import KnexTimeoutError = Knex.KnexTimeoutError; // this is a class from the Knex namespace
const config: Knex.Config = {}; // this is a type from the Knex namespace
const knexInstance: Knex = knex(config);
```
If your code looked like this:
```typescript
import knex from 'knex';
const config: knex.Config = {}; // this is a type from the Knex namespace
const knexInstance = knex(config);
```
Change it to
```typescript
import { knex, Knex } from 'knex';
const config: Knex.Config = {}; // this is a type from the Knex namespace
const knexInstance = knex(config);
```
- If you were importing types such as `Config` or `QueryBuilder` directly, use `Knex` namespace instead.
So change this:
```ts
import { QueryBuilder } from 'knex';
const qb: QueryBuilder = knex('table').select('*');
```
to this:
```ts
import { Knex } from 'knex';
const qb: Knex.QueryBuilder = knex('table').select('*');
```
- IDE autocomplete may stop working if you are using JavaScript (not TypeScript). There are reports of autocomplete still working correctly if knex is used this way:
```js
const knex = require('knex').knex({
//connection parameters
});
```
It also works when using ESM imports:
```js
import { knex } from 'knex';
const kn = knex({
//connection parameters
});
```
When knex is used as a parameter, its type can be addressed like this:
```js
/**
* @param {import("knex").Knex} db
*/
function up(db) {
// Your code
}
```
- Syntax for QueryBuilder augmentation changed. Previously it looked like this:
```ts
declare module 'knex' {
interface QueryBuilder {
paginate<TResult = any[]>(
params: IPaginateParams
): KnexQB<any, IWithPagination<TResult>>;
}
}
```
This should be changed into this:
```ts
declare module 'knex' {
namespace Knex {
interface QueryBuilder {
paginate<TResult = any[]>(
params: IPaginateParams
): KnexQB<any, IWithPagination<TResult>>;
}
}
}
```
- TypeScript version 4.1+ is now needed when using knex types.
- The MSSQL driver was completely reworked in order to address a multitude of connection pool, error handling and performance issues. Since the new implementation uses the `tedious` library directly instead of `mssql`, please replace `mssql` with `tedious` in your dependencies if you are using an MSSQL database. For example:
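A sketch of the dependency swap (adjust to your package manager):
```bash
npm uninstall mssql
npm install tedious
```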
- Transaction rollback no longer triggers a promise rejection for transactions with a specified handler. If you want to preserve the previous behavior, pass a `config` object with `doNotRejectOnRollback: false`:
```js
await knex.transaction(
async (trx) => {
const ids = await trx('catalogues').insert({ name: 'Old Books' }, 'id');
},
{ doNotRejectOnRollback: false }
);
```
- Connection url parsing changed from the legacy [url.parse](https://nodejs.org/docs/latest-v10.x/api/url.html#url_legacy_url_api) to the [WHATWG URL](https://nodejs.org/docs/latest-v10.x/api/url.html#url_the_whatwg_url_api) API. If your connection string contains symbols that are unusual for a URL (not A-z, not digits, not dot, not dash), check the [Node.js docs](https://nodejs.org/docs/latest-v10.x/api/url.html#url_percent_encoding_in_urls) for details on percent-encoding. For example:
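A sketch of percent-encoding unusual characters in credentials (the values are illustrative):
```js
// WHATWG URL parsing is stricter, so escape special characters explicitly
const password = encodeURIComponent('p@ss/word#1');
const connection = `postgres://user:${password}@localhost:5432/knex_test`;
```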
- Global static `Knex.raw` support was dropped; use the instance `knex.raw` instead (`require('knex').raw()` won't work anymore). For example:
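A minimal before/after sketch:
```js
// old (no longer works):
// require('knex').raw('select 1');

// new: call raw on an instantiated knex instance
const knex = require('knex')({ client: 'pg' });
knex.raw('select 1');
```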
- v8 flags are no longer supported in the CLI. To pass these flags, use the [`NODE_OPTIONS` environment variable](https://nodejs.org/api/cli.html#cli_node_options_options).
For example: `NODE_OPTIONS="--max-old-space-size=1536" npm run knex`
- Clients are now classes instead of new-able functions. Please migrate your custom clients to classes.
```js
const Client = require('knex');
const { inherits } = require('util');
// old
function CustomClient(config) {
Client.call(this, config);
// construction logic
}
inherits(CustomClient, Client);
CustomClient.prototype.methodOverride = function () {
// logic
};
// new
class CustomClient extends Client {
// node 12+
driverName = 'abcd';
constructor(config) {
super(config);
this.driverName = 'abcd'; // bad way, will not work
// construction logic
}
methodOverride() {
// logic
}
}
// alternative to declare driverName
CustomClient.prototype.driverName = 'abcd';
```
- There was a major internal restructuring and renaming effort. Most dialect-specific compilers/builders now have the dialect name as a prefix, and some files were moved. Make sure to adjust accordingly if you were referencing specific knex library files directly from your code.
- "first" and "pluck" can no longer be both chained on the same operation. Previously only the last one chained was used, now this would throw an error.
- Trying to execute an operation that results in an empty query, such as inserting an empty array, will now throw an error on all database drivers.
### Upgrading to version 0.21.0+
- Node.js older than 10 is no longer supported; make sure to update your environment;
### Upgrading to version 0.19.0+
- Passing unknown properties to connection pool configuration now throws errors (see https://github.com/Vincit/tarn.js/issues/19 for details);
- `beforeDestroy` pool configuration option was removed. You should use tarn.js event handlers if you still need similar functionality.
### Upgrading to version 0.18.0+
- Node.js older than 8 is no longer supported; make sure to update your environment;
- Knex now returns native promises instead of bluebird ones. You will need to update your code not to rely on bluebird-specific functionality;
- Knex.Promise was removed; use native promises;
- Promise is no longer passed to migrations and seeds; use the native one;
- If you are using TypeScript, make sure to include 'es6' in compilerOptions.lib, otherwise you may get errors for the methods `.catch()` and `.then()` not being recognized.
### Upgrading to version 0.17.0+
- Generic support was implemented for TypeScript bindings, which may break TS builds in some edge cases. Please refer to https://knexjs.org/#typescript-support for more elaborate documentation.
### Upgrading to version 0.16.0+
- MSSQL: DB versions older than 2008 are no longer supported; make sure to update your DB;
- PostgreSQL|MySQL: it is recommended to use an options object for the `table.datetime` and `table.timestamp` methods instead of argument options. See the documentation for these methods for more details;
- Node 6: There are known issues with duplicate event listeners when using knex.js with Node.js 6 (resulting in MaxListenersExceededWarning under certain use-cases (such as reusing single knex instance to run migrations or seeds multiple times)). Please upgrade to Node.js 8+ as soon as possible (knex 0.17.0 will be dropping Node.js 6 support altogether);
### Upgrading to version 0.15.0+
- Node.js older than 6 is no longer supported; make sure to update your environment;
- MSSQL: Creating a unique index on the table targeted by stored procedures that were created with QUOTED_IDENTIFIER = OFF fails.
You can use this query to identify all affected stored procedures:
```sql
SELECT name = OBJECT_NAME([object_id]), uses_quoted_identifier
FROM sys.sql_modules
WHERE uses_quoted_identifier = 0;
```
The only known solution is to recreate all affected stored procedures with QUOTED_IDENTIFIER = ON.
- MariaDB: the `mariadb` dialect is no longer supported. Instead, use the "mysql" or "mysql2" dialects.
### Upgrading to version 0.14.4+
- Including the schema in the tableName parameter in migrations no longer works, so this is invalid:
```js
await knex.migrate.latest({
directory: 'src/services/orders/database/migrations',
tableName: 'orders.orders_migrations',
});
```
Instead, starting from 0.14.5 you should use the new parameter schemaName:
```js
await knex.migrate.latest({
directory: 'src/services/orders/database/migrations',
tableName: 'orders_migrations',
schemaName: 'orders',
});
```

475
backend/apis/nodejs/node_modules/knex/bin/cli.js generated vendored Executable file

@@ -0,0 +1,475 @@
#!/usr/bin/env node
const rechoir = require('rechoir');
const merge = require('lodash/merge');
const interpret = require('interpret');
const resolveFrom = require('resolve-from');
const path = require('path');
const tildify = require('tildify');
const commander = require('commander');
const color = require('colorette');
const argv = require('getopts')(process.argv.slice(2));
const cliPkg = require('../package');
const {
parseConfigObj,
mkConfigObj,
resolveEnvironmentConfig,
exit,
success,
checkLocalModule,
checkConfigurationOptions,
getMigrationExtension,
getSeedExtension,
getStubPath,
findUpModulePath,
findUpConfig,
} = require('./utils/cli-config-utils');
const {
existsSync,
readFile,
writeFile,
} = require('../lib/migrations/util/fs');
const { listMigrations } = require('./utils/migrationsLister');
async function openKnexfile(configPath) {
const importFile = require('../lib/migrations/util/import-file'); // require me late!
let config = await importFile(configPath);
if (config && config.default) {
config = config.default;
}
if (typeof config === 'function') {
config = await config();
}
return config;
}
async function initKnex(env, opts, useDefaultClientIfNotSpecified) {
checkLocalModule(env);
if (process.cwd() !== env.cwd) {
process.chdir(env.cwd);
console.log(
'Working directory changed to',
color.magenta(tildify(env.cwd))
);
}
if (!useDefaultClientIfNotSpecified) {
checkConfigurationOptions(env, opts);
}
env.configuration = env.configPath
? await openKnexfile(env.configPath)
: mkConfigObj(opts);
const resolvedConfig = resolveEnvironmentConfig(
opts,
env.configuration,
env.configPath
);
const optionsConfig = parseConfigObj(opts);
const config = merge(resolvedConfig, optionsConfig);
// Migrations directory gets defaulted if it is undefined.
if (!env.configPath && !config.migrations.directory) {
config.migrations.directory = null;
}
// Client gets defaulted if undefined and it's allowed
if (useDefaultClientIfNotSpecified && config.client === undefined) {
config.client = 'sqlite3';
}
const knex = require(env.modulePath);
return knex(config);
}
function invoke() {
const filetypes = ['js', 'mjs', 'coffee', 'ts', 'eg', 'ls'];
const cwd = argv.knexfile
? path.dirname(path.resolve(argv.knexfile))
: process.cwd();
// TODO add knexpath here eventually
const modulePath =
resolveFrom.silent(cwd, 'knex') ||
findUpModulePath(cwd, 'knex') ||
process.env.KNEX_PATH;
const configPath =
argv.knexfile && existsSync(argv.knexfile)
? path.resolve(argv.knexfile)
: findUpConfig(cwd, 'knexfile', filetypes);
if (configPath) {
const autoloads = rechoir.prepare(
interpret.jsVariants,
configPath,
cwd,
true
);
if (autoloads instanceof Error) {
// Only errors
autoloads.failures.forEach(function (failed) {
console.log(
color.red('Failed to load external module'),
color.magenta(failed.moduleName)
);
});
} else if (Array.isArray(autoloads)) {
const succeeded = autoloads[autoloads.length - 1];
console.log(
'Requiring external module',
color.magenta(succeeded.moduleName)
);
}
}
const env = {
cwd,
modulePath,
configPath,
configuration: null,
};
let modulePackage = {};
try {
modulePackage = require(path.join(
path.dirname(env.modulePath),
'package.json'
));
} catch (e) {
/* empty */
}
const cliVersion = [
color.blue('Knex CLI version:'),
color.green(cliPkg.version),
].join(' ');
const localVersion = [
color.blue('Knex Local version:'),
color.green(modulePackage.version || 'None'),
].join(' ');
commander
.version(`${cliVersion}\n${localVersion}`)
.option('--debug', 'Run with debugging.')
.option('--knexfile [path]', 'Specify the knexfile path.')
.option('--knexpath [path]', 'Specify the path to knex instance.')
.option('--cwd [path]', 'Specify the working directory.')
.option('--client [name]', 'Set DB client.')
.option('--connection [address]', 'Set DB connection.')
.option('--migrations-directory [path]', 'Set migrations directory.')
.option('--migrations-table-name [path]', 'Set migrations table name.')
.option(
'--env [name]',
'environment, default: process.env.NODE_ENV || development'
)
.option('--esm', 'Enable ESM interop.')
.option('--specific [path]', 'Specify one seed file to execute.')
.option(
'--timestamp-filename-prefix',
'Enable a timestamp prefix on name of generated seed files.'
);
commander
.command('init')
.description(' Create a fresh knexfile.')
.option(
`-x [${filetypes.join('|')}]`,
'Specify the knexfile extension (default js)'
)
.action(() => {
const type = (argv.x || 'js').toLowerCase();
if (filetypes.indexOf(type) === -1) {
exit(`Invalid filetype specified: ${type}`);
}
if (env.configuration) {
exit(`Error: ${env.knexfile} already exists`);
}
checkLocalModule(env);
const stubPath = `./knexfile.${type}`;
readFile(
path.dirname(env.modulePath) +
'/lib/migrations/migrate/stub/knexfile-' +
type +
'.stub'
)
.then((code) => {
return writeFile(stubPath, code);
})
.then(() => {
success(color.green(`Created ${stubPath}`));
})
.catch(exit);
});
commander
.command('migrate:make <name>')
.description(' Create a named migration file.')
.option(
`-x [${filetypes.join('|')}]`,
'Specify the stub extension (default js)'
)
.option(
`--stub [<relative/path/from/knexfile>|<name>]`,
'Specify the migration stub to use. If using <name> the file must be located in config.migrations.directory'
)
.action(async (name) => {
try {
const opts = commander.opts();
const instance = await initKnex(env, opts, true); // Skip config check, we don't really care about client when creating migrations
const ext = getMigrationExtension(env, opts);
const configOverrides = { extension: ext };
const stub = getStubPath('migrations', env, opts);
if (stub) {
configOverrides.stub = stub;
}
instance.migrate
.make(name, configOverrides)
.then((name) => {
success(color.green(`Created Migration: ${name}`));
})
.catch(exit);
} catch (err) {
exit(err);
}
});
commander
.command('migrate:latest')
.description(' Run all migrations that have not yet been run.')
.option('--verbose', 'verbose')
.action(async () => {
try {
const instance = await initKnex(env, commander.opts());
const [batchNo, log] = await instance.migrate.latest();
if (log.length === 0) {
success(color.cyan('Already up to date'));
}
success(
color.green(`Batch ${batchNo} run: ${log.length} migrations`) +
(argv.verbose ? `\n${color.cyan(log.join('\n'))}` : '')
);
} catch (err) {
exit(err);
}
});
commander
.command('migrate:up [<name>]')
.description(
' Run the next or the specified migration that has not yet been run.'
)
.action((name) => {
initKnex(env, commander.opts())
.then((instance) => instance.migrate.up({ name }))
.then(([batchNo, log]) => {
if (log.length === 0) {
success(color.cyan('Already up to date'));
}
success(
color.green(
`Batch ${batchNo} ran the following migrations:\n${log.join(
'\n'
)}`
)
);
})
.catch(exit);
});
commander
.command('migrate:rollback')
.description(' Rollback the last batch of migrations performed.')
.option('--all', 'rollback all completed migrations')
.option('--verbose', 'verbose')
.action((cmd) => {
const { all } = cmd;
initKnex(env, commander.opts())
.then((instance) => instance.migrate.rollback(null, all))
.then(([batchNo, log]) => {
if (log.length === 0) {
success(color.cyan('Already at the base migration'));
}
success(
color.green(
`Batch ${batchNo} rolled back: ${log.length} migrations`
) + (argv.verbose ? `\n${color.cyan(log.join('\n'))}` : '')
);
})
.catch(exit);
});
commander
.command('migrate:down [<name>]')
.description(
' Undo the last or the specified migration that was already run.'
)
.action((name) => {
initKnex(env, commander.opts())
.then((instance) => instance.migrate.down({ name }))
.then(([batchNo, log]) => {
if (log.length === 0) {
success(color.cyan('Already at the base migration'));
}
success(
color.green(
`Batch ${batchNo} rolled back the following migrations:\n${log.join(
'\n'
)}`
)
);
})
.catch(exit);
});
commander
.command('migrate:currentVersion')
.description(' View the current version for the migration.')
.action(() => {
initKnex(env, commander.opts())
.then((instance) => instance.migrate.currentVersion())
.then((version) => {
success(color.green('Current Version: ') + color.blue(version));
})
.catch(exit);
});
commander
.command('migrate:list')
.alias('migrate:status')
.description(' List all migrations files with status.')
.action(() => {
initKnex(env, commander.opts())
.then((instance) => {
return instance.migrate.list();
})
.then(([completed, newMigrations]) => {
listMigrations(completed, newMigrations);
})
.catch(exit);
});
commander
.command('migrate:unlock')
.description(' Forcibly unlocks the migrations lock table.')
.action(() => {
initKnex(env, commander.opts())
.then((instance) => instance.migrate.forceFreeMigrationsLock())
.then(() => {
success(
color.green(`Successfully unlocked the migrations lock table`)
);
})
.catch(exit);
});
commander
.command('seed:make <name>')
.description(' Create a named seed file.')
.option(
`-x [${filetypes.join('|')}]`,
'Specify the stub extension (default js)'
)
.option(
`--stub [<relative/path/from/knexfile>|<name>]`,
'Specify the seed stub to use. If using <name> the file must be located in config.seeds.directory'
)
.option(
'--timestamp-filename-prefix',
'Enable a timestamp prefix on name of generated seed files.',
false
)
.action(async (name) => {
try {
const opts = commander.opts();
const instance = await initKnex(env, opts, true); // Skip config check, we don't really care about client when creating seeds
const ext = getSeedExtension(env, opts);
const configOverrides = { extension: ext };
const stub = getStubPath('seeds', env, opts);
if (stub) {
configOverrides.stub = stub;
}
if (opts.timestampFilenamePrefix) {
configOverrides.timestampFilenamePrefix =
opts.timestampFilenamePrefix;
}
instance.seed
.make(name, configOverrides)
.then((name) => {
success(color.green(`Created seed file: ${name}`));
})
.catch(exit);
} catch (err) {
exit(err);
}
});
commander
.command('seed:run')
.description(' Run seed files.')
.option('--verbose', 'verbose')
.option('--specific', 'run specific seed file')
.action(() => {
initKnex(env, commander.opts())
.then((instance) => instance.seed.run({ specific: argv.specific }))
.then(([log]) => {
if (log.length === 0) {
success(color.cyan('No seed files exist'));
}
success(
color.green(`Ran ${log.length} seed files`) +
(argv.verbose ? `\n${color.cyan(log.join('\n'))}` : '')
);
})
.catch(exit);
});
commander.parse(process.argv);
}
// FYI: The handling for the `--cwd` and `--knexfile` arguments is a bit strange,
// but we decided to retain the behavior for backwards-compatibility. In
// particular: if `--knexfile` is a relative path, then it will be resolved
// relative to `--cwd` instead of the shell's CWD.
//
// So, the easiest way to replicate this behavior is to have the CLI change
// its CWD to `--cwd` immediately before initializing everything else. This
// ensures that path.resolve will then resolve the path to `--knexfile` correctly.
if (argv.cwd) {
process.chdir(argv.cwd);
}
// Initialize 'esm' before cli.launch
if (argv.esm) {
// enable esm interop via 'esm' module
// eslint-disable-next-line no-global-assign
require = require('esm')(module);
// https://github.com/standard-things/esm/issues/868
const ext = require.extensions['.js'];
require.extensions['.js'] = (m, fileName) => {
try {
// default to the original extension
// this fails if target file parent is of type='module'
return ext(m, fileName);
} catch (err) {
if (err && err.code === 'ERR_REQUIRE_ESM') {
return m._compile(
require('fs').readFileSync(fileName, 'utf8'),
fileName
);
}
throw err;
}
};
}
invoke();

backend/apis/nodejs/node_modules/knex/bin/utils/cli-config-utils.js generated vendored Normal file

@@ -0,0 +1,212 @@
const { DEFAULT_EXT, DEFAULT_TABLE_NAME } = require('./constants');
const { resolveClientNameWithAliases } = require('../../lib/util/helpers');
const path = require('path');
const escalade = require('escalade/sync');
const tildify = require('tildify');
const color = require('colorette');
const argv = require('getopts')(process.argv.slice(2));
function parseConfigObj(opts) {
const config = { migrations: {} };
if (opts.client) {
config.client = opts.client;
}
if (opts.connection) {
config.connection = opts.connection;
}
if (opts.migrationsDirectory) {
config.migrations.directory = opts.migrationsDirectory;
}
if (opts.migrationsTableName) {
config.migrations.tableName = opts.migrationsTableName;
}
return config;
}
function mkConfigObj(opts) {
const envName = opts.env || process.env.NODE_ENV || 'development';
const resolvedClientName = resolveClientNameWithAliases(opts.client);
const useNullAsDefault = resolvedClientName === 'sqlite3';
const parsedConfig = parseConfigObj(opts);
return {
ext: DEFAULT_EXT,
[envName]: {
...parsedConfig,
useNullAsDefault,
tableName: parsedConfig.tableName || DEFAULT_TABLE_NAME,
},
};
}
function resolveEnvironmentConfig(opts, allConfigs, configFilePath) {
const environment = opts.env || process.env.NODE_ENV || 'development';
const result = allConfigs[environment] || allConfigs;
if (allConfigs[environment]) {
console.log('Using environment:', color.magenta(environment));
}
if (!result) {
console.log(color.red('Warning: unable to read knexfile config'));
process.exit(1);
}
if (argv.debug !== undefined) {
result.debug = argv.debug;
}
// It is safe to assume that unless explicitly specified, we would want
// migrations, seeds etc. to be generated with same extension
if (configFilePath) {
result.ext = result.ext || path.extname(configFilePath).replace('.', '');
}
return result;
}
function exit(text) {
if (text instanceof Error) {
if (text.message) {
console.error(color.red(text.message));
}
console.error(
color.red(`${text.detail ? `${text.detail}\n` : ''}${text.stack}`)
);
} else {
console.error(color.red(text));
}
process.exit(1);
}
function success(text) {
console.log(text);
process.exit(0);
}
function checkLocalModule(env) {
if (!env.modulePath) {
console.log(
color.red('No local knex install found in:'),
color.magenta(tildify(env.cwd))
);
exit('Try running: npm install knex');
}
}
function checkConfigurationOptions(env, opts) {
if (!env.configPath && !opts.client) {
throw new Error(
`No configuration file found and no commandline connection parameters passed`
);
}
}
function getMigrationExtension(env, opts) {
const config = resolveEnvironmentConfig(
opts,
env.configuration,
env.configPath
);
let ext = DEFAULT_EXT;
if (argv.x) {
ext = argv.x;
} else if (config.migrations && config.migrations.extension) {
ext = config.migrations.extension;
} else if (config.ext) {
ext = config.ext;
}
return ext.toLowerCase();
}
function getSeedExtension(env, opts) {
const config = resolveEnvironmentConfig(
opts,
env.configuration,
env.configPath
);
let ext = DEFAULT_EXT;
if (argv.x) {
ext = argv.x;
} else if (config.seeds && config.seeds.extension) {
ext = config.seeds.extension;
} else if (config.ext) {
ext = config.ext;
}
return ext.toLowerCase();
}
function getStubPath(configKey, env, opts) {
const config = resolveEnvironmentConfig(opts, env.configuration);
const stubDirectory = config[configKey] && config[configKey].directory;
const { stub } = argv;
if (!stub) {
return null;
} else if (stub.includes('/')) {
// relative path to stub
return stub;
}
// using stub <name> must have config[configKey].directory defined
if (!stubDirectory) {
console.log(color.red('Failed to load stub'), color.magenta(stub));
exit(`config.${configKey}.directory in knexfile must be defined`);
}
return path.join(stubDirectory, stub);
}
function findUpModulePath(cwd, packageName) {
const modulePackagePath = escalade(cwd, (dir, names) => {
if (names.includes('package.json')) {
return 'package.json';
}
return false;
});
try {
const modulePackage = require(modulePackagePath);
if (modulePackage.name === packageName) {
return path.join(
path.dirname(modulePackagePath),
modulePackage.main || 'index.js'
);
}
} catch (e) {
/* empty */
}
}
function findUpConfig(cwd, name, extensions) {
return escalade(cwd, (dir, names) => {
for (const ext of extensions) {
const filename = `${name}.${ext}`;
if (names.includes(filename)) {
return filename;
}
}
return false;
});
}
module.exports = {
parseConfigObj,
mkConfigObj,
resolveEnvironmentConfig,
exit,
success,
checkLocalModule,
checkConfigurationOptions,
getSeedExtension,
getMigrationExtension,
getStubPath,
findUpModulePath,
findUpConfig,
};

backend/apis/nodejs/node_modules/knex/bin/utils/constants.js generated vendored Normal file

@@ -0,0 +1,7 @@
const DEFAULT_EXT = 'js';
const DEFAULT_TABLE_NAME = 'knex_migrations';
module.exports = {
DEFAULT_EXT,
DEFAULT_TABLE_NAME,
};

backend/apis/nodejs/node_modules/knex/bin/utils/migrationsLister.js generated vendored Normal file

@@ -0,0 +1,37 @@
const color = require('colorette');
const { success } = require('./cli-config-utils');
function listMigrations(completed, newMigrations) {
let message = '';
if (completed.length === 0) {
message += color.red('No Completed Migration files Found.\n');
} else {
message = color.green(
`Found ${completed.length} Completed Migration file/files.\n`
);
for (let i = 0; i < completed.length; i++) {
const file = completed[i];
message += color.cyan(`${file.name}\n`);
}
}
if (newMigrations.length === 0) {
message += color.red('No Pending Migration files Found.\n');
} else {
message += color.green(
`Found ${newMigrations.length} Pending Migration file/files.\n`
);
for (let i = 0; i < newMigrations.length; i++) {
const file = newMigrations[i];
message += color.cyan(`${file.file}\n`);
}
}
success(message);
}
module.exports = { listMigrations };

23
backend/apis/nodejs/node_modules/knex/knex.js generated vendored Normal file

@@ -0,0 +1,23 @@
// Knex.js
// --------------
// (c) 2013-present Tim Griesser
// Knex may be freely distributed under the MIT license.
// For details and documentation:
// http://knexjs.org
const knex = require('./lib/index');
/**
* These export configurations enable JS and TS developers
* to consume knex in whatever way best suits their needs.
* Some examples of supported import syntax include:
* - `const knex = require('knex')`
* - `const { knex } = require('knex')`
* - `import * as knex from 'knex'`
* - `import { knex } from 'knex'`
* - `import knex from 'knex'`
*/
knex.knex = knex;
knex.default = knex;
module.exports = knex;

11
backend/apis/nodejs/node_modules/knex/knex.mjs generated vendored Normal file

@@ -0,0 +1,11 @@
// Knex.js
// --------------
// (c) 2013-present Tim Griesser
// Knex may be freely distributed under the MIT license.
// For details and documentation:
// http://knexjs.org
import knex from './lib/index.js';
export { knex };
export default knex;

backend/apis/nodejs/node_modules/knex/lib/builder-interface-augmenter.js generated vendored Normal file

@@ -0,0 +1,120 @@
const clone = require('lodash/clone');
const isEmpty = require('lodash/isEmpty');
const { callbackify } = require('util');
const finallyMixin = require('./util/finally-mixin');
const { formatQuery } = require('./execution/internal/query-executioner');
function augmentWithBuilderInterface(Target) {
Target.prototype.toQuery = function (tz) {
let data = this.toSQL(this._method, tz);
if (!Array.isArray(data)) data = [data];
if (!data.length) {
return '';
}
return data
.map((statement) => {
return formatQuery(statement.sql, statement.bindings, tz, this.client);
})
.reduce((a, c) => a.concat(a.endsWith(';') ? '\n' : ';\n', c));
};
// Create a new instance of the `Runner`, passing in the current object.
Target.prototype.then = function (/* onFulfilled, onRejected */) {
let result = this.client.runner(this).run();
if (this.client.config.asyncStackTraces) {
result = result.catch((err) => {
err.originalStack = err.stack;
const firstLine = err.stack.split('\n')[0];
// a hack to get a callstack into the client code despite this
// node.js bug https://github.com/nodejs/node/issues/11865
// see lib/util/save-async-stack.js for more details
const { error, lines } = this._asyncStack;
const stackByLines = error.stack.split('\n');
const asyncStack = stackByLines.slice(lines);
asyncStack.unshift(firstLine);
// put the fake more helpful "async" stack on the thrown error
err.stack = asyncStack.join('\n');
throw err;
});
}
return result.then.apply(result, arguments);
};
// Add additional "options" to the builder. Typically used for client specific
// items, like the `mysql` and `sqlite3` drivers.
Target.prototype.options = function (opts) {
this._options = this._options || [];
this._options.push(clone(opts) || {});
return this;
};
// Sets an explicit "connection" we wish to use for this query.
Target.prototype.connection = function (connection) {
this._connection = connection;
this.client.processPassedConnection(connection);
return this;
};
// Set a debug flag for the current schema query stack.
Target.prototype.debug = function (enabled) {
this._debug = arguments.length ? enabled : true;
return this;
};
// Set the transaction object for this query.
Target.prototype.transacting = function (transaction) {
if (transaction && transaction.client) {
if (!transaction.client.transacting) {
transaction.client.logger.warn(
`Invalid transaction value: ${transaction.client}`
);
} else {
this.client = transaction.client;
}
}
if (isEmpty(transaction)) {
this.client.logger.error(
'Invalid value on transacting call, potential bug'
);
throw Error(
'Invalid transacting value (null, undefined or empty object)'
);
}
return this;
};
// Initializes a stream.
Target.prototype.stream = function (options) {
return this.client.runner(this).stream(options);
};
// Initialize a stream & pipe automatically.
Target.prototype.pipe = function (writable, options) {
return this.client.runner(this).pipe(writable, options);
};
Target.prototype.asCallback = function (cb) {
const promise = this.then();
callbackify(() => promise)(cb);
return promise;
};
Target.prototype.catch = function (onReject) {
return this.then().catch(onReject);
};
Object.defineProperty(Target.prototype, Symbol.toStringTag, {
get: () => 'object',
});
finallyMixin(Target.prototype);
}
module.exports = {
augmentWithBuilderInterface,
};

495
backend/apis/nodejs/node_modules/knex/lib/client.js generated vendored Normal file

@@ -0,0 +1,495 @@
const { Pool, TimeoutError } = require('tarn');
const { EventEmitter } = require('events');
const { promisify } = require('util');
const { makeEscape } = require('./util/string');
const cloneDeep = require('lodash/cloneDeep');
const defaults = require('lodash/defaults');
const uniqueId = require('lodash/uniqueId');
const Runner = require('./execution/runner');
const Transaction = require('./execution/transaction');
const {
executeQuery,
enrichQueryObject,
} = require('./execution/internal/query-executioner');
const QueryBuilder = require('./query/querybuilder');
const QueryCompiler = require('./query/querycompiler');
const SchemaBuilder = require('./schema/builder');
const SchemaCompiler = require('./schema/compiler');
const TableBuilder = require('./schema/tablebuilder');
const TableCompiler = require('./schema/tablecompiler');
const ColumnBuilder = require('./schema/columnbuilder');
const ColumnCompiler = require('./schema/columncompiler');
const { KnexTimeoutError } = require('./util/timeout');
const { outputQuery, unwrapRaw } = require('./formatter/wrappingFormatter');
const { compileCallback } = require('./formatter/formatterUtils');
const Raw = require('./raw');
const Ref = require('./ref');
const Formatter = require('./formatter');
const Logger = require('./logger');
const { POOL_CONFIG_OPTIONS } = require('./constants');
const ViewBuilder = require('./schema/viewbuilder.js');
const ViewCompiler = require('./schema/viewcompiler.js');
const isPlainObject = require('lodash/isPlainObject');
const { setHiddenProperty } = require('./util/security.js');
const debug = require('debug')('knex:client');
// The base client provides the general structure
// for a dialect specific client object.
class Client extends EventEmitter {
constructor(config = {}) {
super();
this.config = config;
this.logger = new Logger(config);
if (this.config.connection && this.config.connection.password) {
setHiddenProperty(this.config.connection);
}
//Client is a required field, so throw error if it's not supplied.
//If 'this.dialect' is set, then this is a 'super()' call, in which case
//'client' does not have to be set as it's already assigned on the client prototype.
if (this.dialect && !this.config.client) {
this.logger.warn(
`Using 'this.dialect' to identify the client is deprecated and support for it will be removed in the future. Please use configuration option 'client' instead.`
);
}
const dbClient = this.config.client || this.dialect;
if (!dbClient) {
throw new Error(
`knex: Required configuration option 'client' is missing.`
);
}
if (config.version) {
this.version = config.version;
}
if (config.connection && config.connection instanceof Function) {
this.connectionConfigProvider = config.connection;
this.connectionConfigExpirationChecker = () => true; // causes the provider to be called on first use
} else {
this.connectionSettings = cloneDeep(config.connection || {});
if (config.connection && config.connection.password) {
setHiddenProperty(this.connectionSettings, config.connection);
}
this.connectionConfigExpirationChecker = null;
}
if (this.driverName && config.connection) {
this.initializeDriver();
if (!config.pool || (config.pool && config.pool.max !== 0)) {
this.initializePool(config);
}
}
this.valueForUndefined = this.raw('DEFAULT');
if (config.useNullAsDefault) {
this.valueForUndefined = null;
}
}
formatter(builder) {
return new Formatter(this, builder);
}
queryBuilder() {
return new QueryBuilder(this);
}
queryCompiler(builder, formatter) {
return new QueryCompiler(this, builder, formatter);
}
schemaBuilder() {
return new SchemaBuilder(this);
}
schemaCompiler(builder) {
return new SchemaCompiler(this, builder);
}
tableBuilder(type, tableName, tableNameLike, fn) {
return new TableBuilder(this, type, tableName, tableNameLike, fn);
}
viewBuilder(type, viewBuilder, fn) {
return new ViewBuilder(this, type, viewBuilder, fn);
}
tableCompiler(tableBuilder) {
return new TableCompiler(this, tableBuilder);
}
viewCompiler(viewCompiler) {
return new ViewCompiler(this, viewCompiler);
}
columnBuilder(tableBuilder, type, args) {
return new ColumnBuilder(this, tableBuilder, type, args);
}
columnCompiler(tableBuilder, columnBuilder) {
return new ColumnCompiler(this, tableBuilder, columnBuilder);
}
runner(builder) {
return new Runner(this, builder);
}
transaction(container, config, outerTx) {
return new Transaction(this, container, config, outerTx);
}
raw() {
return new Raw(this).set(...arguments);
}
ref() {
return new Ref(this, ...arguments);
}
query(connection, queryParam) {
const queryObject = enrichQueryObject(connection, queryParam, this);
return executeQuery(connection, queryObject, this);
}
stream(connection, queryParam, stream, options) {
const queryObject = enrichQueryObject(connection, queryParam, this);
return this._stream(connection, queryObject, stream, options);
}
prepBindings(bindings) {
return bindings;
}
positionBindings(sql) {
return sql;
}
postProcessResponse(resp, queryContext) {
if (this.config.postProcessResponse) {
return this.config.postProcessResponse(resp, queryContext);
}
return resp;
}
wrapIdentifier(value, queryContext) {
return this.customWrapIdentifier(
value,
this.wrapIdentifierImpl,
queryContext
);
}
customWrapIdentifier(value, origImpl, queryContext) {
if (this.config.wrapIdentifier) {
return this.config.wrapIdentifier(value, origImpl, queryContext);
}
return origImpl(value);
}
wrapIdentifierImpl(value) {
return value !== '*' ? `"${value.replace(/"/g, '""')}"` : '*';
}
initializeDriver() {
try {
this.driver = this._driver();
} catch (e) {
const message = `Knex: run\n$ npm install ${this.driverName} --save`;
this.logger.error(`${message}\n${e.message}\n${e.stack}`);
throw new Error(`${message}\n${e.message}`);
}
}
poolDefaults() {
return { min: 2, max: 10, propagateCreateError: true };
}
getPoolSettings(poolConfig) {
poolConfig = defaults({}, poolConfig, this.poolDefaults());
POOL_CONFIG_OPTIONS.forEach((option) => {
if (option in poolConfig) {
this.logger.warn(
[
`Pool config option "${option}" is no longer supported.`,
`See https://github.com/Vincit/tarn.js for possible pool config options.`,
].join(' ')
);
}
});
const DEFAULT_ACQUIRE_TIMEOUT = 60000;
const timeouts = [
this.config.acquireConnectionTimeout,
poolConfig.acquireTimeoutMillis,
].filter((timeout) => timeout !== undefined);
if (!timeouts.length) {
timeouts.push(DEFAULT_ACQUIRE_TIMEOUT);
}
// acquire connection timeout can be set on config or config.pool
// choose the smallest, positive timeout setting and set on poolConfig
poolConfig.acquireTimeoutMillis = Math.min(...timeouts);
const updatePoolConnectionSettingsFromProvider = async () => {
if (!this.connectionConfigProvider) {
return; // static configuration, nothing to update
}
if (
!this.connectionConfigExpirationChecker ||
!this.connectionConfigExpirationChecker()
) {
return; // not expired, reuse existing connection
}
const providerResult = await this.connectionConfigProvider();
if (providerResult.expirationChecker) {
this.connectionConfigExpirationChecker =
providerResult.expirationChecker;
delete providerResult.expirationChecker; // MySQL2 driver warns on receiving extra properties
} else {
this.connectionConfigExpirationChecker = null;
}
this.connectionSettings = providerResult;
};
return Object.assign(poolConfig, {
create: async () => {
await updatePoolConnectionSettingsFromProvider();
const connection = await this.acquireRawConnection();
connection.__knexUid = uniqueId('__knexUid');
if (poolConfig.afterCreate) {
await promisify(poolConfig.afterCreate)(connection);
}
return connection;
},
destroy: (connection) => {
if (connection !== void 0) {
return this.destroyRawConnection(connection);
}
},
validate: (connection) => {
if (connection.__knex__disposed) {
this.logger.warn(`Connection Error: ${connection.__knex__disposed}`);
return false;
}
return this.validateConnection(connection);
},
});
}
initializePool(config = this.config) {
if (this.pool) {
this.logger.warn('The pool has already been initialized');
return;
}
const tarnPoolConfig = {
...this.getPoolSettings(config.pool),
};
// afterCreate is an internal knex param, tarn.js does not support it
if (tarnPoolConfig.afterCreate) {
delete tarnPoolConfig.afterCreate;
}
this.pool = new Pool(tarnPoolConfig);
}
validateConnection(connection) {
return true;
}
// Acquire a connection from the pool.
async acquireConnection() {
if (!this.pool) {
throw new Error('Unable to acquire a connection');
}
try {
const connection = await this.pool.acquire().promise;
debug('acquired connection from pool: %s', connection.__knexUid);
if (connection.config) {
if (connection.config.password) {
setHiddenProperty(connection.config);
}
if (
connection.config.authentication &&
connection.config.authentication.options &&
connection.config.authentication.options.password
) {
setHiddenProperty(connection.config.authentication.options);
}
}
return connection;
} catch (error) {
let convertedError = error;
if (error instanceof TimeoutError) {
convertedError = new KnexTimeoutError(
'Knex: Timeout acquiring a connection. The pool is probably full. ' +
'Are you missing a .transacting(trx) call?'
);
}
throw convertedError;
}
}
// Releases a connection back to the connection pool,
// returning a promise resolved when the connection is released.
releaseConnection(connection) {
debug('releasing connection to pool: %s', connection.__knexUid);
const didRelease = this.pool.release(connection);
if (!didRelease) {
debug('pool refused connection: %s', connection.__knexUid);
}
return Promise.resolve();
}
// Destroy the current connection pool for the client.
async destroy(callback) {
try {
if (this.pool && this.pool.destroy) {
await this.pool.destroy();
}
this.pool = undefined;
if (typeof callback === 'function') {
callback();
}
} catch (err) {
if (typeof callback === 'function') {
return callback(err);
}
throw err;
}
}
// Return the database being used by this client.
database() {
return this.connectionSettings.database;
}
toString() {
return '[object KnexClient]';
}
assertCanCancelQuery() {
if (!this.canCancelQuery) {
throw new Error('Query cancelling not supported for this dialect');
}
}
cancelQuery() {
throw new Error('Query cancelling not supported for this dialect');
}
// Formatter part
alias(first, second) {
return first + ' as ' + second;
}
// Checks whether a value is a function... if it is, we compile it
// otherwise we check whether it's a raw
parameter(value, builder, bindingsHolder) {
if (typeof value === 'function') {
return outputQuery(
compileCallback(value, undefined, this, bindingsHolder),
true,
builder,
this
);
}
return unwrapRaw(value, true, builder, this, bindingsHolder) || '?';
}
// Turns a list of values into a list of ?'s, joining them with commas unless
// a "joining" value is specified (e.g. ' and ')
parameterize(values, notSetValue, builder, bindingsHolder) {
if (typeof values === 'function')
return this.parameter(values, builder, bindingsHolder);
values = Array.isArray(values) ? values : [values];
let str = '',
i = -1;
while (++i < values.length) {
if (i > 0) str += ', ';
let value = values[i];
// JSON columns can have objects as values.
if (isPlainObject(value)) {
value = JSON.stringify(value);
}
str += this.parameter(
value === undefined ? notSetValue : value,
builder,
bindingsHolder
);
}
return str;
}
// Formats `values` into a parenthesized list of parameters for a `VALUES`
// clause.
//
// [1, 2] -> '(?, ?)'
// [[1, 2], [3, 4]] -> '((?, ?), (?, ?))'
// knex('table') -> '(select * from "table")'
// knex.raw('select ?', 1) -> '(select ?)'
//
values(values, builder, bindingsHolder) {
if (Array.isArray(values)) {
if (Array.isArray(values[0])) {
return `(${values
.map(
(value) =>
`(${this.parameterize(
value,
undefined,
builder,
bindingsHolder
)})`
)
.join(', ')})`;
}
return `(${this.parameterize(
values,
undefined,
builder,
bindingsHolder
)})`;
}
if (values && values.isRawInstance) {
return `(${this.parameter(values, builder, bindingsHolder)})`;
}
return this.parameter(values, builder, bindingsHolder);
}
processPassedConnection(connection) {
// Default implementation is noop
}
toPathForJson(jsonPath) {
// By default, we want a json path, so if this function is not overridden,
// we return the path.
return jsonPath;
}
}
Object.assign(Client.prototype, {
_escapeBinding: makeEscape({
escapeString(str) {
return `'${str.replace(/'/g, "''")}'`;
},
}),
canCancelQuery: false,
});
module.exports = Client;
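Downstream of the pool wiring above, `afterCreate` is a knex-level option rather than a tarn one: `initializePool` strips it from the tarn config, and the acquire hook runs it through `promisify`, so it uses a done-callback signature. A minimal sketch of what that looks like from the config side (the `pg` client and the `SET timezone` query are placeholder choices, not anything this file prescribes):

```js
const knex = require('knex')({
  client: 'pg',
  connection: { host: 'localhost', database: 'app' },
  pool: {
    min: 2,
    max: 10,
    // Runs once per freshly created connection; signal completion through
    // the callback, which the client consumes via promisify() above.
    afterCreate: (conn, done) => {
      conn.query("SET timezone = 'UTC'", (err) => done(err, conn));
    },
  },
});
```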

61
backend/apis/nodejs/node_modules/knex/lib/constants.js generated vendored Normal file
@@ -0,0 +1,61 @@
// The client names we'll allow in the `{name: lib}` pairing.
const CLIENT_ALIASES = Object.freeze({
pg: 'postgres',
postgresql: 'postgres',
sqlite: 'sqlite3',
});
const SUPPORTED_CLIENTS = Object.freeze(
[
'mssql',
'mysql',
'mysql2',
'oracledb',
'postgres',
'pgnative',
'redshift',
'sqlite3',
'cockroachdb',
'better-sqlite3',
].concat(Object.keys(CLIENT_ALIASES))
);
const DRIVER_NAMES = Object.freeze({
MsSQL: 'mssql',
MySQL: 'mysql',
MySQL2: 'mysql2',
Oracle: 'oracledb',
PostgreSQL: 'pg',
PgNative: 'pgnative',
Redshift: 'pg-redshift',
SQLite: 'sqlite3',
CockroachDB: 'cockroachdb',
BetterSQLite3: 'better-sqlite3',
});
const POOL_CONFIG_OPTIONS = Object.freeze([
'maxWaitingClients',
'testOnBorrow',
'fifo',
'priorityRange',
'autostart',
'evictionRunIntervalMillis',
'numTestsPerRun',
'softIdleTimeoutMillis',
'Promise',
]);
/**
* Regex that only matches commas that aren't wrapped in parentheses. Can be used to
* safely split strings like `id int, name string, body text, primary key (id, name)` into
* definition rows.
*/
const COMMA_NO_PAREN_REGEX = /,[\s](?![^(]*\))/g;
module.exports = {
CLIENT_ALIASES,
SUPPORTED_CLIENTS,
POOL_CONFIG_OPTIONS,
COMMA_NO_PAREN_REGEX,
DRIVER_NAMES,
};
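A quick illustration of that regex in use, as a minimal sketch (the require path assumes knex's published `lib/` layout shown above):

```js
const { COMMA_NO_PAREN_REGEX } = require('knex/lib/constants');

// The negative lookahead protects commas inside parentheses, so the
// composite primary key definition stays in one piece.
const defs = 'id int, name string, body text, primary key (id, name)';
console.log(defs.split(COMMA_NO_PAREN_REGEX));
// -> [ 'id int', 'name string', 'body text', 'primary key (id, name)' ]
```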

77
backend/apis/nodejs/node_modules/knex/lib/dialects/better-sqlite3/index.js generated vendored Normal file
@@ -0,0 +1,77 @@
// better-sqlite3 Client
// -------
const Client_SQLite3 = require('../sqlite3');
class Client_BetterSQLite3 extends Client_SQLite3 {
_driver() {
return require('better-sqlite3');
}
// Get a raw connection from the database, returning a promise with the connection object.
async acquireRawConnection() {
const options = this.connectionSettings.options || {};
return new this.driver(this.connectionSettings.filename, {
nativeBinding: options.nativeBinding,
readonly: !!options.readonly,
});
}
// Used to explicitly close a connection, called internally by the pool when
// a connection times out or the pool is shut down.
async destroyRawConnection(connection) {
return connection.close();
}
// Runs the query on the specified connection, providing the bindings and any
// other necessary prep work.
async _query(connection, obj) {
if (!obj.sql) throw new Error('The query is empty');
if (!connection) {
throw new Error('No connection provided');
}
const statement = connection.prepare(obj.sql);
const bindings = this._formatBindings(obj.bindings);
if (statement.reader) {
const response = await statement.all(bindings);
obj.response = response;
return obj;
}
const response = await statement.run(bindings);
obj.response = response;
obj.context = {
lastID: response.lastInsertRowid,
changes: response.changes,
};
return obj;
}
_formatBindings(bindings) {
if (!bindings) {
return [];
}
return bindings.map((binding) => {
if (binding instanceof Date) {
return binding.valueOf();
}
if (typeof binding === 'boolean') {
return Number(binding);
}
return binding;
});
}
}
Object.assign(Client_BetterSQLite3.prototype, {
// The "dialect", for reference .
driverName: 'better-sqlite3',
});
module.exports = Client_BetterSQLite3;
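For reference, a minimal sketch of a knex config that exercises `acquireRawConnection` above; `filename` and the `options` block map directly onto it, and `useNullAsDefault` is the usual recommendation for the sqlite family:

```js
const knex = require('knex')({
  client: 'better-sqlite3',
  connection: {
    filename: ':memory:',
    // Forwarded to the better-sqlite3 constructor by acquireRawConnection.
    options: { readonly: false },
  },
  useNullAsDefault: true,
});
```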

14
backend/apis/nodejs/node_modules/knex/lib/dialects/cockroachdb/crdb-columncompiler.js generated vendored Normal file
@@ -0,0 +1,14 @@
const ColumnCompiler_PG = require('../postgres/schema/pg-columncompiler.js');
class ColumnCompiler_CRDB extends ColumnCompiler_PG {
uuid(options = { primaryKey: false }) {
return (
'uuid' +
(this.tableCompiler._canBeAddPrimaryKey(options)
? ' primary key default gen_random_uuid()'
: '')
);
}
}
module.exports = ColumnCompiler_CRDB;

11
backend/apis/nodejs/node_modules/knex/lib/dialects/cockroachdb/crdb-querybuilder.js generated vendored Normal file
@@ -0,0 +1,11 @@
const QueryBuilder = require('../../query/querybuilder');
const isEmpty = require('lodash/isEmpty');
module.exports = class QueryBuilder_CockroachDB extends QueryBuilder {
upsert(values, returning, options) {
this._method = 'upsert';
if (!isEmpty(returning)) this.returning(returning, options);
this._single.upsert = values;
return this;
}
};
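A brief, hedged sketch of this builder method in use, assuming a knex instance configured with `client: 'cockroachdb'`; the output shown is approximate and is produced by the CRDB query compiler below:

```js
// Sketch only: `knex` is assumed to be initialized with client: 'cockroachdb'.
const sql = knex('accounts')
  .upsert({ id: 1, balance: 100 }, ['id'])
  .toString();
// Roughly: upsert into "accounts" ("id", "balance") values (1, 100) returning "id"
```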

122
backend/apis/nodejs/node_modules/knex/lib/dialects/cockroachdb/crdb-querycompiler.js generated vendored Normal file
@@ -0,0 +1,122 @@
const QueryCompiler_PG = require('../postgres/query/pg-querycompiler');
const {
columnize: columnize_,
wrap: wrap_,
operator: operator_,
} = require('../../formatter/wrappingFormatter');
class QueryCompiler_CRDB extends QueryCompiler_PG {
truncate() {
return `truncate ${this.tableName}`;
}
upsert() {
let sql = this._upsert();
if (sql === '') return sql;
const { returning } = this.single;
if (returning) sql += this._returning(returning);
return {
sql: sql,
returning,
};
}
_upsert() {
const upsertValues = this.single.upsert || [];
const sql = this.with() + `upsert into ${this.tableName} `;
const body = this._insertBody(upsertValues);
return body === '' ? '' : sql + body;
}
_groupOrder(item, type) {
// CockroachDB doesn't support PostgreSQL's order nulls first/last syntax, so we take the generic one.
return this._basicGroupOrder(item, type);
}
whereJsonPath(statement) {
let castValue = '';
if (!isNaN(statement.value) && parseInt(statement.value)) {
castValue = '::int';
} else if (!isNaN(statement.value) && parseFloat(statement.value)) {
castValue = '::float';
} else {
castValue = " #>> '{}'";
}
return `json_extract_path(${this._columnClause(
statement
)}, ${this.client.toArrayPathFromJsonPath(
statement.jsonPath,
this.builder,
this.bindingsHolder
)})${castValue} ${operator_(
statement.operator,
this.builder,
this.client,
this.bindingsHolder
)} ${this._jsonValueClause(statement)}`;
}
// Json common functions
_jsonExtract(nameFunction, params) {
let extractions;
if (Array.isArray(params.column)) {
extractions = params.column;
} else {
extractions = [params];
}
return extractions
.map((extraction) => {
const jsonCol = `json_extract_path(${columnize_(
extraction.column || extraction[0],
this.builder,
this.client,
this.bindingsHolder
)}, ${this.client.toArrayPathFromJsonPath(
extraction.path || extraction[1],
this.builder,
this.bindingsHolder
)})`;
const alias = extraction.alias || extraction[2];
return alias
? this.client.alias(jsonCol, this.formatter.wrap(alias))
: jsonCol;
})
.join(', ');
}
_onJsonPathEquals(nameJoinFunction, clause) {
return (
'json_extract_path(' +
wrap_(
clause.columnFirst,
undefined,
this.builder,
this.client,
this.bindingsHolder
) +
', ' +
this.client.toArrayPathFromJsonPath(
clause.jsonPathFirst,
this.builder,
this.bindingsHolder
) +
') = json_extract_path(' +
wrap_(
clause.columnSecond,
undefined,
this.builder,
this.client,
this.bindingsHolder
) +
', ' +
this.client.toArrayPathFromJsonPath(
clause.jsonPathSecond,
this.builder,
this.bindingsHolder
) +
')'
);
}
}
module.exports = QueryCompiler_CRDB;

37
backend/apis/nodejs/node_modules/knex/lib/dialects/cockroachdb/crdb-tablecompiler.js generated vendored Normal file
@@ -0,0 +1,37 @@
/* eslint max-len: 0 */
const TableCompiler = require('../postgres/schema/pg-tablecompiler');
class TableCompiler_CRDB extends TableCompiler {
constructor(client, tableBuilder) {
super(client, tableBuilder);
}
addColumns(columns, prefix, colCompilers) {
if (prefix === this.alterColumnsPrefix) {
// alter columns
for (const col of colCompilers) {
this.client.logger.warn(
'Experimental alter column in use, see issue: https://github.com/cockroachdb/cockroach/issues/49329'
);
this.pushQuery({
sql: 'SET enable_experimental_alter_column_type_general = true',
bindings: [],
});
super._addColumn(col);
}
} else {
// base class implementation for normal add
super.addColumns(columns, prefix);
}
}
dropUnique(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, columns);
this.pushQuery(`drop index ${this.tableName()}@${indexName} cascade `);
}
}
module.exports = TableCompiler_CRDB;

15
backend/apis/nodejs/node_modules/knex/lib/dialects/cockroachdb/crdb-viewcompiler.js generated vendored Normal file
@@ -0,0 +1,15 @@
const ViewCompiler_PG = require('../postgres/schema/pg-viewcompiler.js');
class ViewCompiler_CRDB extends ViewCompiler_PG {
renameColumn(from, to) {
throw new Error('renaming a column of a view is not supported by this dialect.');
}
defaultTo(column, defaultValue) {
throw new Error(
'changing default values of views is not supported by this dialect.'
);
}
}
module.exports = ViewCompiler_CRDB;

86
backend/apis/nodejs/node_modules/knex/lib/dialects/cockroachdb/index.js generated vendored Normal file
@@ -0,0 +1,86 @@
// CockroachDB Client
// -------
const Client_PostgreSQL = require('../postgres');
const Transaction = require('../postgres/execution/pg-transaction');
const QueryCompiler = require('./crdb-querycompiler');
const ColumnCompiler = require('./crdb-columncompiler');
const TableCompiler = require('./crdb-tablecompiler');
const ViewCompiler = require('./crdb-viewcompiler');
const QueryBuilder = require('./crdb-querybuilder');
// Always initialize with the "QueryBuilder" and "QueryCompiler"
// objects, which extend the base 'lib/query/builder' and
// 'lib/query/compiler', respectively.
class Client_CockroachDB extends Client_PostgreSQL {
transaction() {
return new Transaction(this, ...arguments);
}
queryCompiler(builder, formatter) {
return new QueryCompiler(this, builder, formatter);
}
columnCompiler() {
return new ColumnCompiler(this, ...arguments);
}
tableCompiler() {
return new TableCompiler(this, ...arguments);
}
viewCompiler() {
return new ViewCompiler(this, ...arguments);
}
queryBuilder() {
return new QueryBuilder(this);
}
_parseVersion(versionString) {
return versionString.split(' ')[2];
}
async cancelQuery(connectionToKill) {
try {
return await this._wrappedCancelQueryCall(null, connectionToKill);
} catch (err) {
this.logger.warn(`Connection Error: ${err}`);
throw err;
}
}
_wrappedCancelQueryCall(emptyConnection, connectionToKill) {
// FixMe https://github.com/cockroachdb/cockroach/issues/41335
if (
connectionToKill.activeQuery.processID === 0 &&
connectionToKill.activeQuery.secretKey === 0
) {
return;
}
return connectionToKill.cancel(
connectionToKill,
connectionToKill.activeQuery
);
}
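// e.g. (illustrative): toArrayPathFromJsonPath('$.a.b[1]', ...) returns '?, ?, ?'
// with 'a', 'b' and '1' pushed as bindings.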
toArrayPathFromJsonPath(jsonPath, builder, bindingsHolder) {
return jsonPath
.replace(/^(\$\.)/, '') // remove the first dollar
.replace(/\[([0-9]+)]/, '.$1')
.split('.')
.map(
function (v) {
return this.parameter(v, builder, bindingsHolder);
}.bind(this)
)
.join(', ');
}
}
Object.assign(Client_CockroachDB.prototype, {
// The "dialect", for reference elsewhere.
driverName: 'cockroachdb',
});
module.exports = Client_CockroachDB;

34
backend/apis/nodejs/node_modules/knex/lib/dialects/index.js generated vendored Normal file
@@ -0,0 +1,34 @@
'use strict';
Object.defineProperty(exports, '__esModule', { value: true });
exports.getDialectByNameOrAlias = void 0;
const { resolveClientNameWithAliases } = require('../util/helpers');
const dbNameToDialectLoader = Object.freeze({
'better-sqlite3': () => require('./better-sqlite3'),
cockroachdb: () => require('./cockroachdb'),
mssql: () => require('./mssql'),
mysql: () => require('./mysql'),
mysql2: () => require('./mysql2'),
oracle: () => require('./oracle'),
oracledb: () => require('./oracledb'),
pgnative: () => require('./pgnative'),
postgres: () => require('./postgres'),
redshift: () => require('./redshift'),
sqlite3: () => require('./sqlite3'),
});
/**
* Gets the Dialect object with the given client name or throws an
* error if not found.
*
* NOTE: This replaces the prior practice of dynamically constructing
* import strings for Dialect objects.
*/
function getDialectByNameOrAlias(clientName) {
const resolvedClientName = resolveClientNameWithAliases(clientName);
const dialectLoader = dbNameToDialectLoader[resolvedClientName];
if (!dialectLoader) {
throw new Error(`Invalid clientName given: ${clientName}`);
}
return dialectLoader();
}
exports.getDialectByNameOrAlias = getDialectByNameOrAlias;
//# sourceMappingURL=index.js.map
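A short sketch of the loader in use (the require path assumes knex's published `lib/` layout):

```js
const { getDialectByNameOrAlias } = require('knex/lib/dialects');

// 'pg' is an alias, resolved to 'postgres' before the lookup.
const Client_PG = getDialectByNameOrAlias('pg');

getDialectByNameOrAlias('nosuchdb'); // throws: Invalid clientName given: nosuchdb
```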

500
backend/apis/nodejs/node_modules/knex/lib/dialects/mssql/index.js generated vendored Normal file
@@ -0,0 +1,500 @@
// MSSQL Client
// -------
const map = require('lodash/map');
const isNil = require('lodash/isNil');
const Client = require('../../client');
const MSSQL_Formatter = require('./mssql-formatter');
const Transaction = require('./transaction');
const QueryCompiler = require('./query/mssql-querycompiler');
const SchemaCompiler = require('./schema/mssql-compiler');
const TableCompiler = require('./schema/mssql-tablecompiler');
const ViewCompiler = require('./schema/mssql-viewcompiler');
const ColumnCompiler = require('./schema/mssql-columncompiler');
const QueryBuilder = require('../../query/querybuilder');
const { setHiddenProperty } = require('../../util/security');
const debug = require('debug')('knex:mssql');
const SQL_INT4 = { MIN: -2147483648, MAX: 2147483647 };
const SQL_BIGINT_SAFE = { MIN: -9007199254740991, MAX: 9007199254740991 };
// Always initialize with the "QueryBuilder" and "QueryCompiler" objects, which
// extend the base 'lib/query/builder' and 'lib/query/compiler', respectively.
class Client_MSSQL extends Client {
constructor(config = {}) {
super(config);
}
/**
* @param {import('knex').Config} options
*/
_generateConnection() {
const settings = this.connectionSettings;
settings.options = settings.options || {};
/** @type {import('tedious').ConnectionConfig} */
const cfg = {
authentication: {
type: settings.type || 'default',
options: {
userName: settings.userName || settings.user,
password: settings.password,
domain: settings.domain,
token: settings.token,
clientId: settings.clientId,
clientSecret: settings.clientSecret,
tenantId: settings.tenantId,
msiEndpoint: settings.msiEndpoint,
},
},
server: settings.server || settings.host,
options: {
database: settings.database,
encrypt: settings.encrypt || false,
port: settings.port || 1433,
connectTimeout: settings.connectionTimeout || settings.timeout || 15000,
requestTimeout: !isNil(settings.requestTimeout)
? settings.requestTimeout
: 15000,
rowCollectionOnDone: false,
rowCollectionOnRequestCompletion: false,
useColumnNames: false,
tdsVersion: settings.options.tdsVersion || '7_4',
appName: settings.options.appName || 'knex',
trustServerCertificate: false,
...settings.options,
},
};
if (cfg.authentication.options.password) {
setHiddenProperty(cfg.authentication.options);
}
// tedious always connects via TCP when a port is specified
if (cfg.options.instanceName) delete cfg.options.port;
if (isNaN(cfg.options.requestTimeout)) cfg.options.requestTimeout = 15000;
if (cfg.options.requestTimeout === Infinity) cfg.options.requestTimeout = 0;
if (cfg.options.requestTimeout < 0) cfg.options.requestTimeout = 0;
if (settings.debug) {
cfg.options.debug = {
packet: true,
token: true,
data: true,
payload: true,
};
}
return cfg;
}
_driver() {
const tds = require('tedious');
return tds;
}
formatter() {
return new MSSQL_Formatter(this, ...arguments);
}
transaction() {
return new Transaction(this, ...arguments);
}
queryCompiler() {
return new QueryCompiler(this, ...arguments);
}
schemaCompiler() {
return new SchemaCompiler(this, ...arguments);
}
tableCompiler() {
return new TableCompiler(this, ...arguments);
}
viewCompiler() {
return new ViewCompiler(this, ...arguments);
}
queryBuilder() {
const b = new QueryBuilder(this);
return b;
}
columnCompiler() {
return new ColumnCompiler(this, ...arguments);
}
wrapIdentifierImpl(value) {
if (value === '*') {
return '*';
}
return `[${value.replace(/[[\]]+/g, '')}]`;
}
// Get a raw connection, called by the `pool` whenever a new
// connection needs to be added to the pool.
acquireRawConnection() {
return new Promise((resolver, rejecter) => {
debug('connection::connection new connection requested');
const Driver = this._driver();
const settings = Object.assign({}, this._generateConnection());
const connection = new Driver.Connection(settings);
connection.connect((err) => {
if (err) {
debug('connection::connect error: %s', err.message);
return rejecter(err);
}
debug('connection::connect connected to server');
connection.connected = true;
connection.on('error', (e) => {
debug('connection::error message=%s', e.message);
connection.__knex__disposed = e;
connection.connected = false;
});
connection.once('end', () => {
connection.connected = false;
connection.__knex__disposed = 'Connection to server was terminated.';
debug('connection::end connection ended.');
});
return resolver(connection);
});
});
}
validateConnection(connection) {
return connection && connection.connected;
}
// Used to explicitly close a connection, called internally by the pool
// when a connection times out or the pool is shut down.
destroyRawConnection(connection) {
debug('connection::destroy');
return new Promise((resolve) => {
connection.once('end', () => {
resolve();
});
connection.close();
});
}
// Position the bindings for the query.
positionBindings(sql) {
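// e.g. (illustrative): 'where id = ? and tag = \\?' becomes
// 'where id = @p0 and tag = ?'; the escaped placeholder passes through as-is.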
let questionCount = -1;
return sql.replace(/\\?\?/g, (match) => {
if (match === '\\?') {
return '?';
}
questionCount += 1;
return `@p${questionCount}`;
});
}
_chomp(connection) {
if (connection.state.name === 'LoggedIn') {
const nextRequest = this.requestQueue.pop();
if (nextRequest) {
debug(
'connection::query executing query, %d more in queue',
this.requestQueue.length
);
connection.execSql(nextRequest);
}
}
}
_enqueueRequest(request, connection) {
this.requestQueue.push(request);
this._chomp(connection);
}
_makeRequest(query, callback) {
const Driver = this._driver();
const sql = typeof query === 'string' ? query : query.sql;
let rowCount = 0;
if (!sql) throw new Error('The query is empty');
debug('request::request sql=%s', sql);
const request = new Driver.Request(sql, (err, remoteRowCount) => {
if (err) {
debug('request::error message=%s', err.message);
return callback(err);
}
rowCount = remoteRowCount;
debug('request::callback rowCount=%d', rowCount);
});
request.on('prepared', () => {
debug('request %s::request prepared', this.id);
});
request.on('done', (rowCount, more) => {
debug('request::done rowCount=%d more=%s', rowCount, more);
});
request.on('doneProc', (rowCount, more) => {
debug(
'request::doneProc id=%s rowCount=%d more=%s',
request.id,
rowCount,
more
);
});
request.on('doneInProc', (rowCount, more) => {
debug(
'request::doneInProc id=%s rowCount=%d more=%s',
request.id,
rowCount,
more
);
});
request.once('requestCompleted', () => {
debug('request::completed id=%s', request.id);
return callback(null, rowCount);
});
request.on('error', (err) => {
debug('request::error id=%s message=%s', request.id, err.message);
return callback(err);
});
return request;
}
// Grab a connection, run the query via the MSSQL streaming interface,
// and pass that through to the stream we've sent back to the client.
_stream(connection, query, /** @type {NodeJS.ReadWriteStream} */ stream) {
return new Promise((resolve, reject) => {
const request = this._makeRequest(query, (err) => {
if (err) {
stream.emit('error', err);
return reject(err);
}
resolve();
});
request.on('row', (row) => {
stream.write(
row.reduce(
(prev, curr) => ({
...prev,
[curr.metadata.colName]: curr.value,
}),
{}
)
);
});
request.on('error', (err) => {
stream.emit('error', err);
reject(err);
});
request.once('requestCompleted', () => {
stream.end();
resolve();
});
this._assignBindings(request, query.bindings);
this._enqueueRequest(request, connection);
});
}
_assignBindings(request, bindings) {
if (Array.isArray(bindings)) {
for (let i = 0; i < bindings.length; i++) {
const binding = bindings[i];
this._setReqInput(request, i, binding);
}
}
}
_scaleForBinding(binding) {
if (binding % 1 === 0) {
throw new Error(`The binding value ${binding} must be a decimal number.`);
}
return { scale: 10 };
}
_typeForBinding(binding) {
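// e.g. (illustrative): 'abc' -> NVarChar, true -> Bit, 2.5 -> Float,
// 3000000000 -> BigInt, new Date() -> DateTime, Buffer -> VarBinary.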
const Driver = this._driver();
if (
this.connectionSettings.options &&
this.connectionSettings.options.mapBinding
) {
const result = this.connectionSettings.options.mapBinding(binding);
if (result) {
return [result.value, result.type];
}
}
switch (typeof binding) {
case 'string':
return [binding, Driver.TYPES.NVarChar];
case 'boolean':
return [binding, Driver.TYPES.Bit];
case 'number': {
if (binding % 1 !== 0) {
return [binding, Driver.TYPES.Float];
}
if (binding < SQL_INT4.MIN || binding > SQL_INT4.MAX) {
if (binding < SQL_BIGINT_SAFE.MIN || binding > SQL_BIGINT_SAFE.MAX) {
throw new Error(
`Bigint must be safe integer or must be passed as string, saw ${binding}`
);
}
return [binding, Driver.TYPES.BigInt];
}
return [binding, Driver.TYPES.Int];
}
default: {
if (binding instanceof Date) {
return [binding, Driver.TYPES.DateTime];
}
if (binding instanceof Buffer) {
return [binding, Driver.TYPES.VarBinary];
}
return [binding, Driver.TYPES.NVarChar];
}
}
}
// Runs the query on the specified connection, providing the bindings
// and any other necessary prep work.
_query(connection, query) {
return new Promise((resolve, reject) => {
const rows = [];
const request = this._makeRequest(query, (err, count) => {
if (err) {
return reject(err);
}
query.response = rows;
process.nextTick(() => this._chomp(connection));
resolve(query);
});
request.on('row', (row) => {
debug('request::row');
rows.push(row);
});
this._assignBindings(request, query.bindings);
this._enqueueRequest(request, connection);
});
}
// Sets a request input parameter. Detects bigints and decimals and sets the type appropriately.
_setReqInput(req, i, inputBinding) {
const [binding, tediousType] = this._typeForBinding(inputBinding);
const bindingName = 'p'.concat(i);
let options;
if (typeof binding === 'number' && binding % 1 !== 0) {
options = this._scaleForBinding(binding);
}
debug(
'request::binding pos=%d type=%s value=%s',
i,
tediousType.name,
binding
);
if (Buffer.isBuffer(binding)) {
options = {
length: 'max',
};
}
req.addParameter(bindingName, tediousType, binding, options);
}
// Process the response as returned from the query.
processResponse(query, runner) {
if (query == null) return;
let { response } = query;
const { method } = query;
if (query.output) {
return query.output.call(runner, response);
}
response = response.map((row) =>
row.reduce((columns, r) => {
const colName = r.metadata.colName;
if (columns[colName]) {
if (!Array.isArray(columns[colName])) {
columns[colName] = [columns[colName]];
}
columns[colName].push(r.value);
} else {
columns[colName] = r.value;
}
return columns;
}, {})
);
if (query.output) return query.output.call(runner, response);
switch (method) {
case 'select':
return response;
case 'first':
return response[0];
case 'pluck':
return map(response, query.pluck);
case 'insert':
case 'del':
case 'update':
case 'counter':
if (query.returning) {
if (query.returning === '@@rowcount') {
return response[0][''];
}
}
return response;
default:
return response;
}
}
}
Object.assign(Client_MSSQL.prototype, {
requestQueue: [],
dialect: 'mssql',
driverName: 'mssql',
});
module.exports = Client_MSSQL;
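For orientation, a hedged sketch of a knex config that `_generateConnection` above understands; the `server`/`user`/`password`/`options` names follow the settings read in that method, while the concrete values are placeholders:

```js
const knex = require('knex')({
  client: 'mssql',
  connection: {
    server: 'localhost',
    port: 1433,
    user: 'sa',
    password: 'secret',
    database: 'app',
    // Spread into the tedious options block above.
    options: { encrypt: true, appName: 'my-service' },
  },
});
```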

34
backend/apis/nodejs/node_modules/knex/lib/dialects/mssql/mssql-formatter.js generated vendored Normal file
@@ -0,0 +1,34 @@
const Formatter = require('../../formatter');
class MSSQL_Formatter extends Formatter {
// Accepts a string or array of columns to wrap as appropriate.
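// e.g. (illustrative): columnizeWithPrefix('inserted.', ['id', 'name'])
// -> "inserted.[id], inserted.[name]" under the MSSQL identifier wrapper.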
columnizeWithPrefix(prefix, target) {
const columns = typeof target === 'string' ? [target] : target;
let str = '',
i = -1;
while (++i < columns.length) {
if (i > 0) str += ', ';
str += prefix + this.wrap(columns[i]);
}
return str;
}
/**
* Returns its argument with single quotes escaped, so it can be included in a single-quoted string.
*
* For example, it converts "has'quote" to "has''quote".
*
* This assumes QUOTED_IDENTIFIER ON, so it is only ' that needs escaping,
* never ", because " cannot be used to quote a string when that's on;
* otherwise we'd need to be aware of whether the string is quoted with " or '.
*
* This assumption is consistent with the SQL Knex generates.
* @param {string} string
* @returns {string}
*/
escapingStringDelimiters(string) {
return (string || '').replace(/'/g, "''");
}
}
module.exports = MSSQL_Formatter;

601
backend/apis/nodejs/node_modules/knex/lib/dialects/mssql/query/mssql-querycompiler.js generated vendored Normal file
@@ -0,0 +1,601 @@
// MSSQL Query Compiler
// ------
const QueryCompiler = require('../../../query/querycompiler');
const compact = require('lodash/compact');
const identity = require('lodash/identity');
const isEmpty = require('lodash/isEmpty');
const Raw = require('../../../raw.js');
const {
columnize: columnize_,
} = require('../../../formatter/wrappingFormatter');
const components = [
'comments',
'columns',
'join',
'lock',
'where',
'union',
'group',
'having',
'order',
'limit',
'offset',
];
class QueryCompiler_MSSQL extends QueryCompiler {
constructor(client, builder, formatter) {
super(client, builder, formatter);
const { onConflict } = this.single;
if (onConflict) {
throw new Error('.onConflict() is not supported for mssql.');
}
this._emptyInsertValue = 'default values';
}
with() {
// WITH RECURSIVE is a syntax error:
// SQL Server does not syntactically distinguish recursive and non-recursive CTEs.
// So mark all statements as non-recursive, generate the SQL, then restore.
// This approach ensures any changes in base class with() get propagated here.
const undoList = [];
if (this.grouped.with) {
for (const stmt of this.grouped.with) {
if (stmt.recursive) {
undoList.push(stmt);
stmt.recursive = false;
}
}
}
const result = super.with();
// Restore the recursive markings, in case this same query gets cloned and passed to other drivers.
for (const stmt of undoList) {
stmt.recursive = true;
}
return result;
}
select() {
const sql = this.with();
const statements = components.map((component) => this[component](this));
return sql + compact(statements).join(' ');
}
//#region Insert
// Compiles an "insert" query, allowing for multiple
// inserts using a single query statement.
insert() {
if (
this.single.options &&
this.single.options.includeTriggerModifications
) {
return this.insertWithTriggers();
} else {
return this.standardInsert();
}
}
insertWithTriggers() {
const insertValues = this.single.insert || [];
const { returning } = this.single;
let sql =
this.with() +
`${this._buildTempTable(returning)}insert into ${this.tableName} `;
const returningSql = returning
? this._returning('insert', returning, true) + ' '
: '';
if (Array.isArray(insertValues)) {
if (insertValues.length === 0) {
return '';
}
} else if (typeof insertValues === 'object' && isEmpty(insertValues)) {
return {
sql:
sql +
returningSql +
this._emptyInsertValue +
this._buildReturningSelect(returning),
returning,
};
}
sql += this._buildInsertData(insertValues, returningSql);
if (returning) {
sql += this._buildReturningSelect(returning);
}
return {
sql,
returning,
};
}
_buildInsertData(insertValues, returningSql) {
let sql = '';
const insertData = this._prepInsert(insertValues);
if (typeof insertData === 'string') {
sql += insertData;
} else {
if (insertData.columns.length) {
sql += `(${this.formatter.columnize(insertData.columns)}`;
sql +=
`) ${returningSql}values (` +
this._buildInsertValues(insertData) +
')';
} else if (insertValues.length === 1 && insertValues[0]) {
sql += returningSql + this._emptyInsertValue;
} else {
return '';
}
}
return sql;
}
standardInsert() {
const insertValues = this.single.insert || [];
let sql = this.with() + `insert into ${this.tableName} `;
const { returning } = this.single;
const returningSql = returning
? this._returning('insert', returning) + ' '
: '';
if (Array.isArray(insertValues)) {
if (insertValues.length === 0) {
return '';
}
} else if (typeof insertValues === 'object' && isEmpty(insertValues)) {
return {
sql: sql + returningSql + this._emptyInsertValue,
returning,
};
}
sql += this._buildInsertData(insertValues, returningSql);
return {
sql,
returning,
};
}
//#endregion
//#region Update
// Compiles an `update` query, allowing for a return value.
update() {
if (
this.single.options &&
this.single.options.includeTriggerModifications
) {
return this.updateWithTriggers();
} else {
return this.standardUpdate();
}
}
updateWithTriggers() {
const top = this.top();
const withSQL = this.with();
const updates = this._prepUpdate(this.single.update);
const join = this.join();
const where = this.where();
const order = this.order();
const { returning } = this.single;
const declaredTemp = this._buildTempTable(returning);
return {
sql:
withSQL +
declaredTemp +
`update ${top ? top + ' ' : ''}${this.tableName}` +
' set ' +
updates.join(', ') +
(returning ? ` ${this._returning('update', returning, true)}` : '') +
(join ? ` from ${this.tableName} ${join}` : '') +
(where ? ` ${where}` : '') +
(order ? ` ${order}` : '') +
(!returning
? this._returning('rowcount', '@@rowcount')
: this._buildReturningSelect(returning)),
returning: returning || '@@rowcount',
};
}
_formatGroupsItemValue(value, nulls) {
const column = super._formatGroupsItemValue(value);
// MSSQL doesn't support the 'is null' syntax in order by,
// so we override this function and add MSSQL-specific syntax.
if (nulls && !(value instanceof Raw)) {
const collNulls = `IIF(${column} is null,`;
if (nulls === 'first') {
return `${collNulls}0,1)`;
} else if (nulls === 'last') {
return `${collNulls}1,0)`;
}
}
return column;
}
standardUpdate() {
const top = this.top();
const withSQL = this.with();
const updates = this._prepUpdate(this.single.update);
const join = this.join();
const where = this.where();
const order = this.order();
const { returning } = this.single;
return {
sql:
withSQL +
`update ${top ? top + ' ' : ''}${this.tableName}` +
' set ' +
updates.join(', ') +
(returning ? ` ${this._returning('update', returning)}` : '') +
(join ? ` from ${this.tableName} ${join}` : '') +
(where ? ` ${where}` : '') +
(order ? ` ${order}` : '') +
(!returning ? this._returning('rowcount', '@@rowcount') : ''),
returning: returning || '@@rowcount',
};
}
//#endregion
//#region Delete
// Compiles a `delete` query.
del() {
if (
this.single.options &&
this.single.options.includeTriggerModifications
) {
return this.deleteWithTriggers();
} else {
return this.standardDelete();
}
}
deleteWithTriggers() {
// Make sure tableName is processed by the formatter first.
const withSQL = this.with();
const { tableName } = this;
const wheres = this.where();
const joins = this.join();
const { returning } = this.single;
const returningStr = returning
? ` ${this._returning('del', returning, true)}`
: '';
const deleteSelector = joins ? `${tableName}${returningStr} ` : '';
return {
sql:
withSQL +
`${this._buildTempTable(
returning
)}delete ${deleteSelector}from ${tableName}` +
(!joins ? returningStr : '') +
(joins ? ` ${joins}` : '') +
(wheres ? ` ${wheres}` : '') +
(!returning
? this._returning('rowcount', '@@rowcount')
: this._buildReturningSelect(returning)),
returning: returning || '@@rowcount',
};
}
standardDelete() {
// Make sure tableName is processed by the formatter first.
const withSQL = this.with();
const { tableName } = this;
const wheres = this.where();
const joins = this.join();
const { returning } = this.single;
const returningStr = returning
? ` ${this._returning('del', returning)}`
: '';
// returning needs to be before "from" when using join
const deleteSelector = joins ? `${tableName}${returningStr} ` : '';
return {
sql:
withSQL +
`delete ${deleteSelector}from ${tableName}` +
(!joins ? returningStr : '') +
(joins ? ` ${joins}` : '') +
(wheres ? ` ${wheres}` : '') +
(!returning ? this._returning('rowcount', '@@rowcount') : ''),
returning: returning || '@@rowcount',
};
}
//#endregion
// Compiles the columns in the query, specifying if an item was distinct.
columns() {
let distinctClause = '';
if (this.onlyUnions()) return '';
const top = this.top();
const hints = this._hintComments();
const columns = this.grouped.columns || [];
let i = -1,
sql = [];
if (columns) {
while (++i < columns.length) {
const stmt = columns[i];
if (stmt.distinct) distinctClause = 'distinct ';
if (stmt.distinctOn) {
distinctClause = this.distinctOn(stmt.value);
continue;
}
if (stmt.type === 'aggregate') {
sql.push(...this.aggregate(stmt));
} else if (stmt.type === 'aggregateRaw') {
sql.push(this.aggregateRaw(stmt));
} else if (stmt.type === 'analytic') {
sql.push(this.analytic(stmt));
} else if (stmt.type === 'json') {
sql.push(this.json(stmt));
} else if (stmt.value && stmt.value.length > 0) {
sql.push(this.formatter.columnize(stmt.value));
}
}
}
if (sql.length === 0) sql = ['*'];
const select = this.onlyJson() ? '' : 'select ';
return (
`${select}${hints}${distinctClause}` +
(top ? top + ' ' : '') +
sql.join(', ') +
(this.tableName ? ` from ${this.tableName}` : '')
);
}
_returning(method, value, withTrigger) {
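// e.g. (illustrative): _returning('insert', ['id'], true)
// -> "output inserted.[id] into #out"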
switch (method) {
case 'update':
case 'insert':
return value
? `output ${this.formatter.columnizeWithPrefix('inserted.', value)}${
withTrigger ? ' into #out' : ''
}`
: '';
case 'del':
return value
? `output ${this.formatter.columnizeWithPrefix('deleted.', value)}${
withTrigger ? ' into #out' : ''
}`
: '';
case 'rowcount':
return value ? ';select @@rowcount' : '';
}
}
_buildTempTable(values) {
// If values is empty then return an empty string
if (values && values.length > 0) {
let selections = '';
// Build values that will be returned from this procedure
if (Array.isArray(values)) {
selections = values
.map((value) => `[t].${this.formatter.columnize(value)}`)
.join(',');
} else {
selections = `[t].${this.formatter.columnize(values)}`;
}
// Force #out to be populated with the correct column structure.
let sql = `select top(0) ${selections} into #out `;
sql += `from ${this.tableName} as t `;
sql += `left join ${this.tableName} on 0=1;`;
return sql;
}
return '';
}
_buildReturningSelect(values) {
// If values is empty then return an empty string
if (values && values.length > 0) {
let selections = '';
// Build columns to return
if (Array.isArray(values)) {
selections = values
.map((value) => `${this.formatter.columnize(value)}`)
.join(',');
} else {
selections = this.formatter.columnize(values);
}
// Get the returned values
let sql = `; select ${selections} from #out; `;
// Drop the temp table to prevent memory leaks
sql += `drop table #out;`;
return sql;
}
return '';
}
// Compiles a `truncate` query.
truncate() {
return `truncate table ${this.tableName}`;
}
forUpdate() {
// this doesn't work exactly as it should; one should also specify the index when locking
// https://stackoverflow.com/a/9818448/360060
return 'with (UPDLOCK)';
}
forShare() {
// http://www.sqlteam.com/article/introduction-to-locking-in-sql-server
return 'with (HOLDLOCK)';
}
// Compiles a `columnInfo` query.
columnInfo() {
const column = this.single.columnInfo;
let schema = this.single.schema;
// The user may have specified a custom wrapIdentifier function in the config. We
// need to run the identifiers through that function, but not format them as
// identifiers otherwise.
const table = this.client.customWrapIdentifier(this.single.table, identity);
if (schema) {
schema = this.client.customWrapIdentifier(schema, identity);
}
// GOTCHA: INFORMATION_SCHEMA.COLUMNS must be capitalized to work when the database has a case-sensitive collation. [#4573]
let sql = `select [COLUMN_NAME], [COLUMN_DEFAULT], [DATA_TYPE], [CHARACTER_MAXIMUM_LENGTH], [IS_NULLABLE] from INFORMATION_SCHEMA.COLUMNS where table_name = ? and table_catalog = ?`;
const bindings = [table, this.client.database()];
if (schema) {
sql += ' and table_schema = ?';
bindings.push(schema);
} else {
sql += ` and table_schema = 'dbo'`;
}
return {
sql,
bindings: bindings,
output(resp) {
const out = resp.reduce((columns, val) => {
columns[val[0].value] = {
defaultValue: val[1].value,
type: val[2].value,
maxLength: val[3].value,
nullable: val[4].value === 'YES',
};
return columns;
}, {});
return (column && out[column]) || out;
},
};
}
top() {
const noLimit = !this.single.limit && this.single.limit !== 0;
const noOffset = !this.single.offset;
if (noLimit || !noOffset) return '';
return `top (${this._getValueOrParameterFromAttribute('limit')})`;
}
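// e.g. (illustrative): .limit(10) alone compiles via top() to 'top (10)';
// with .offset(20) as well, top() returns '' and offset() below emits
// 'offset 20 rows fetch next 10 rows only'.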
limit() {
return '';
}
offset() {
const noLimit = !this.single.limit && this.single.limit !== 0;
const noOffset = !this.single.offset;
if (noOffset) return '';
let offset = `offset ${
noOffset ? '0' : this._getValueOrParameterFromAttribute('offset')
} rows`;
if (!noLimit) {
offset += ` fetch next ${this._getValueOrParameterFromAttribute(
'limit'
)} rows only`;
}
return offset;
}
whereLike(statement) {
return `${this._columnClause(
statement
)} collate SQL_Latin1_General_CP1_CS_AS ${this._not(
statement,
'like '
)}${this._valueClause(statement)}`;
}
whereILike(statement) {
return `${this._columnClause(
statement
)} collate SQL_Latin1_General_CP1_CI_AS ${this._not(
statement,
'like '
)}${this._valueClause(statement)}`;
}
jsonExtract(params) {
// JSON_VALUE returns NULL if we query an object or array
// JSON_QUERY returns NULL if we query a literal/single value
return this._jsonExtract(
params.singleValue ? 'JSON_VALUE' : 'JSON_QUERY',
params
);
}
jsonSet(params) {
return this._jsonSet('JSON_MODIFY', params);
}
jsonInsert(params) {
return this._jsonSet('JSON_MODIFY', params);
}
jsonRemove(params) {
const jsonCol = `JSON_MODIFY(${columnize_(
params.column,
this.builder,
this.client,
this.bindingsHolder
)},${this.client.parameter(
params.path,
this.builder,
this.bindingsHolder
)}, NULL)`;
return params.alias
? this.client.alias(jsonCol, this.formatter.wrap(params.alias))
: jsonCol;
}
whereJsonPath(statement) {
return this._whereJsonPath('JSON_VALUE', statement);
}
whereJsonSupersetOf(statement) {
throw new Error(
'Json superset where clause not actually supported by MSSQL'
);
}
whereJsonSubsetOf(statement) {
throw new Error('Json subset where clause not actually supported by MSSQL');
}
_getExtracts(statement, operator) {
const column = columnize_(
statement.column,
this.builder,
this.client,
this.bindingsHolder
);
return (
Array.isArray(statement.values) ? statement.values : [statement.values]
)
.map(function (value) {
return (
'JSON_VALUE(' +
column +
',' +
this.client.parameter(value, this.builder, this.bindingsHolder) +
')'
);
}, this)
.join(operator);
}
onJsonPathEquals(clause) {
return this._onJsonPathEquals('JSON_VALUE', clause);
}
}
// Set the QueryBuilder & QueryCompiler on the client object,
// in case anyone wants to modify things to suit their own purposes.
module.exports = QueryCompiler_MSSQL;

185
backend/apis/nodejs/node_modules/knex/lib/dialects/mssql/schema/mssql-columncompiler.js generated vendored Normal file
@@ -0,0 +1,185 @@
// MSSQL Column Compiler
// -------
const ColumnCompiler = require('../../../schema/columncompiler');
const { toNumber } = require('../../../util/helpers');
const { formatDefault } = require('../../../formatter/formatterUtils');
const { operator: operator_ } = require('../../../formatter/wrappingFormatter');
class ColumnCompiler_MSSQL extends ColumnCompiler {
constructor(client, tableCompiler, columnBuilder) {
super(client, tableCompiler, columnBuilder);
this.modifiers = ['nullable', 'defaultTo', 'first', 'after', 'comment'];
this._addCheckModifiers();
}
// Types
// ------
double(precision, scale) {
return 'float';
}
floating(precision, scale) {
// ignore precision / scale, which are mysql-specific
return `float`;
}
integer() {
// mssql does not support length
return 'int';
}
tinyint() {
// mssql does not support length
return 'tinyint';
}
varchar(length) {
return `nvarchar(${toNumber(length, 255)})`;
}
timestamp({ useTz = false } = {}) {
return useTz ? 'datetimeoffset' : 'datetime2';
}
bit(length) {
if (length > 1) {
this.client.logger.warn('Bit fields are exactly 1 bit in length for MSSQL');
}
return 'bit';
}
binary(length) {
return length ? `varbinary(${toNumber(length)})` : 'varbinary(max)';
}
// Modifiers
// ------
first() {
this.client.logger.warn('Column first modifier not available for MSSQL');
return '';
}
after(column) {
this.client.logger.warn('Column after modifier not available for MSSQL');
return '';
}
defaultTo(value, { constraintName } = {}) {
const formattedValue = formatDefault(value, this.type, this.client);
constraintName =
typeof constraintName !== 'undefined'
? constraintName
: `${
this.tableCompiler.tableNameRaw
}_${this.getColumnName()}_default`.toLowerCase();
if (this.columnBuilder._method === 'alter') {
this.pushAdditional(function () {
this.pushQuery(
`ALTER TABLE ${this.tableCompiler.tableName()} ADD CONSTRAINT ${this.formatter.wrap(
constraintName
)} DEFAULT ${formattedValue} FOR ${this.formatter.wrap(
this.getColumnName()
)}`
);
});
return '';
}
if (!constraintName) {
return `DEFAULT ${formattedValue}`;
}
return `CONSTRAINT ${this.formatter.wrap(
constraintName
)} DEFAULT ${formattedValue}`;
}
comment(/** @type {string} */ comment) {
if (!comment) {
return;
}
// XXX: This is a byte limit, not character, so we cannot definitively say they'll exceed the limit without database collation info.
// (Yes, even if the column has its own collation, the sqlvariant still uses the database collation.)
// I'm not sure we even need to raise a warning, as MSSQL will return an error when the limit is exceeded itself.
if (comment && comment.length > 7500 / 2) {
this.client.logger.warn(
'Your comment might be longer than the max comment length for MSSQL of 7,500 bytes.'
);
}
// See: https://docs.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-addextendedproperty-transact-sql?view=sql-server-ver15#b-adding-an-extended-property-to-a-column-in-a-table
const value = this.formatter.escapingStringDelimiters(comment);
const level0name = this.tableCompiler.schemaNameRaw || 'dbo';
const level1name = this.formatter.escapingStringDelimiters(
this.tableCompiler.tableNameRaw
);
const level2name = this.formatter.escapingStringDelimiters(
this.args[0] || this.defaults('columnName')
);
const args = `N'MS_Description', N'${value}', N'Schema', N'${level0name}', N'Table', N'${level1name}', N'Column', N'${level2name}'`;
this.pushAdditional(function () {
const isAlreadyDefined = `EXISTS(SELECT * FROM sys.fn_listextendedproperty(N'MS_Description', N'Schema', N'${level0name}', N'Table', N'${level1name}', N'Column', N'${level2name}'))`;
this.pushQuery(
`IF ${isAlreadyDefined}\n EXEC sys.sp_updateextendedproperty ${args}\nELSE\n EXEC sys.sp_addextendedproperty ${args}`
);
});
return '';
}
checkLength(operator, length, constraintName) {
return this._check(
`LEN(${this.formatter.wrap(this.getColumnName())}) ${operator_(
operator,
this.columnBuilder,
this.bindingsHolder
)} ${toNumber(length)}`,
constraintName
);
}
checkRegex(regex, constraintName) {
return this._check(
`${this.formatter.wrap(
this.getColumnName()
)} LIKE ${this.client._escapeBinding('%' + regex + '%')}`,
constraintName
);
}
increments(options = { primaryKey: true }) {
return (
'int identity(1,1) not null' +
(this.tableCompiler._canBeAddPrimaryKey(options) ? ' primary key' : '')
);
}
bigincrements(options = { primaryKey: true }) {
return (
'bigint identity(1,1) not null' +
(this.tableCompiler._canBeAddPrimaryKey(options) ? ' primary key' : '')
);
}
}
ColumnCompiler_MSSQL.prototype.bigint = 'bigint';
ColumnCompiler_MSSQL.prototype.mediumint = 'int';
ColumnCompiler_MSSQL.prototype.smallint = 'smallint';
ColumnCompiler_MSSQL.prototype.text = 'nvarchar(max)';
ColumnCompiler_MSSQL.prototype.mediumtext = 'nvarchar(max)';
ColumnCompiler_MSSQL.prototype.longtext = 'nvarchar(max)';
ColumnCompiler_MSSQL.prototype.json = ColumnCompiler_MSSQL.prototype.jsonb =
'nvarchar(max)';
// TODO: mssql supports check constraints as of SQL Server 2008
// so make enu here more like postgres
ColumnCompiler_MSSQL.prototype.enu = 'nvarchar(100)';
ColumnCompiler_MSSQL.prototype.uuid = ({ useBinaryUuid = false } = {}) =>
useBinaryUuid ? 'binary(16)' : 'uniqueidentifier';
ColumnCompiler_MSSQL.prototype.datetime = 'datetime2';
ColumnCompiler_MSSQL.prototype.bool = 'bit';
module.exports = ColumnCompiler_MSSQL;

91
backend/apis/nodejs/node_modules/knex/lib/dialects/mssql/schema/mssql-compiler.js generated vendored Normal file
@@ -0,0 +1,91 @@
// MSSQL Schema Compiler
// -------
const SchemaCompiler = require('../../../schema/compiler');
class SchemaCompiler_MSSQL extends SchemaCompiler {
constructor(client, builder) {
super(client, builder);
}
dropTableIfExists(tableName) {
const name = this.formatter.wrap(prefixedTableName(this.schema, tableName));
this.pushQuery(
`if object_id('${name}', 'U') is not null DROP TABLE ${name}`
);
}
dropViewIfExists(viewName) {
const name = this.formatter.wrap(prefixedTableName(this.schema, viewName));
this.pushQuery(
`if object_id('${name}', 'V') is not null DROP VIEW ${name}`
);
}
// Rename a table on the schema.
renameTable(tableName, to) {
this.pushQuery(
`exec sp_rename ${this.client.parameter(
prefixedTableName(this.schema, tableName),
this.builder,
this.bindingsHolder
)}, ${this.client.parameter(to, this.builder, this.bindingsHolder)}`
);
}
renameView(viewTable, to) {
this.pushQuery(
`exec sp_rename ${this.client.parameter(
prefixedTableName(this.schema, viewTable),
this.builder,
this.bindingsHolder
)}, ${this.client.parameter(to, this.builder, this.bindingsHolder)}`
);
}
// Check whether a table exists on the query.
hasTable(tableName) {
const formattedTable = this.client.parameter(
prefixedTableName(this.schema, tableName),
this.builder,
this.bindingsHolder
);
const bindings = [tableName];
let sql =
`SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES ` +
`WHERE TABLE_NAME = ${formattedTable}`;
if (this.schema) {
sql += ' AND TABLE_SCHEMA = ?';
bindings.push(this.schema);
}
this.pushQuery({ sql, bindings, output: (resp) => resp.length > 0 });
}
// Check whether a column exists on the schema.
hasColumn(tableName, column) {
const formattedColumn = this.client.parameter(
column,
this.builder,
this.bindingsHolder
);
const formattedTable = this.client.parameter(
this.formatter.wrap(prefixedTableName(this.schema, tableName)),
this.builder,
this.bindingsHolder
);
const sql =
`select object_id from sys.columns ` +
`where name = ${formattedColumn} ` +
`and object_id = object_id(${formattedTable})`;
this.pushQuery({ sql, output: (resp) => resp.length > 0 });
}
}
SchemaCompiler_MSSQL.prototype.dropTablePrefix = 'DROP TABLE ';
function prefixedTableName(prefix, table) {
return prefix ? `${prefix}.${table}` : table;
}
module.exports = SchemaCompiler_MSSQL;

378
backend/apis/nodejs/node_modules/knex/lib/dialects/mssql/schema/mssql-tablecompiler.js generated vendored Normal file
@@ -0,0 +1,378 @@
/* eslint max-len:0 */
// MSSQL Table Builder & Compiler
// -------
const TableCompiler = require('../../../schema/tablecompiler');
const helpers = require('../../../util/helpers');
const { isObject } = require('../../../util/is');
// Table Compiler
// ------
class TableCompiler_MSSQL extends TableCompiler {
constructor(client, tableBuilder) {
super(client, tableBuilder);
}
createQuery(columns, ifNot, like) {
let createStatement = ifNot
? `if object_id('${this.tableName()}', 'U') is null `
: '';
if (like) {
// This query copy only columns and not all indexes and keys like other databases.
createStatement += `SELECT * INTO ${this.tableName()} FROM ${this.tableNameLike()} WHERE 0=1`;
} else {
createStatement +=
'CREATE TABLE ' +
this.tableName() +
(this._formatting ? ' (\n ' : ' (') +
columns.sql.join(this._formatting ? ',\n ' : ', ') +
this._addChecks() +
')';
}
this.pushQuery(createStatement);
if (this.single.comment) {
this.comment(this.single.comment);
}
if (like) {
this.addColumns(columns, this.addColumnsPrefix);
}
}
comment(/** @type {string} */ comment) {
if (!comment) {
return;
}
// XXX: This is a byte limit, not character, so we cannot definitively say they'll exceed the limit without server collation info.
// When I checked in SQL Server 2019, the ctext column in sys.syscomments is defined as a varbinary(8000), so it doesn't even have its own defined collation.
if (comment.length > 7500 / 2) {
this.client.logger.warn(
'Your comment might be longer than the max comment length for MSSQL of 7,500 bytes.'
);
}
// See: https://docs.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-addextendedproperty-transact-sql?view=sql-server-ver15#f-adding-an-extended-property-to-a-table
const value = this.formatter.escapingStringDelimiters(comment);
const level0name = this.formatter.escapingStringDelimiters(
this.schemaNameRaw || 'dbo'
);
const level1name = this.formatter.escapingStringDelimiters(
this.tableNameRaw
);
const args = `N'MS_Description', N'${value}', N'Schema', N'${level0name}', N'Table', N'${level1name}'`;
const isAlreadyDefined = `EXISTS(SELECT * FROM sys.fn_listextendedproperty(N'MS_Description', N'Schema', N'${level0name}', N'Table', N'${level1name}', NULL, NULL))`;
this.pushQuery(
`IF ${isAlreadyDefined}\n EXEC sys.sp_updateextendedproperty ${args}\nELSE\n EXEC sys.sp_addextendedproperty ${args}`
);
}
// Compiles column add. Multiple columns need only one ADD clause (not one ADD per column) so core addColumns doesn't work. #1348
addColumns(columns, prefix) {
prefix = prefix || this.addColumnsPrefix;
if (columns.sql.length > 0) {
this.pushQuery({
sql:
(this.lowerCase ? 'alter table ' : 'ALTER TABLE ') +
this.tableName() +
' ' +
prefix +
columns.sql.join(', '),
bindings: columns.bindings,
});
}
}
alterColumns(columns, colBuilder) {
for (let i = 0, l = colBuilder.length; i < l; i++) {
const builder = colBuilder[i];
if (builder.modified.defaultTo) {
const schema = this.schemaNameRaw || 'dbo';
const baseQuery = `
DECLARE @constraint varchar(100) = (SELECT default_constraints.name
FROM sys.all_columns
INNER JOIN sys.tables
ON all_columns.object_id = tables.object_id
INNER JOIN sys.schemas
ON tables.schema_id = schemas.schema_id
INNER JOIN sys.default_constraints
ON all_columns.default_object_id = default_constraints.object_id
WHERE schemas.name = '${schema}'
AND tables.name = '${
this.tableNameRaw
}'
AND all_columns.name = '${builder.getColumnName()}')
IF @constraint IS NOT NULL EXEC('ALTER TABLE ${
this.tableNameRaw
} DROP CONSTRAINT ' + @constraint)`;
this.pushQuery(baseQuery);
}
}
// in SQL Server only one column can be altered at a time
columns.sql.forEach((sql) => {
this.pushQuery({
sql:
(this.lowerCase ? 'alter table ' : 'ALTER TABLE ') +
this.tableName() +
' ' +
(this.lowerCase
? this.alterColumnPrefix.toLowerCase()
: this.alterColumnPrefix) +
sql,
bindings: columns.bindings,
});
});
}
// Compiles column drop. Multiple columns need only one DROP clause (not one DROP per column) so core dropColumn doesn't work. #1348
dropColumn() {
const _this2 = this;
const columns = helpers.normalizeArr.apply(null, arguments);
const columnsArray = Array.isArray(columns) ? columns : [columns];
const drops = columnsArray.map((column) => _this2.formatter.wrap(column));
const schema = this.schemaNameRaw || 'dbo';
for (const column of columns) {
const baseQuery = `
DECLARE @constraint varchar(100) = (SELECT default_constraints.name
FROM sys.all_columns
INNER JOIN sys.tables
ON all_columns.object_id = tables.object_id
INNER JOIN sys.schemas
ON tables.schema_id = schemas.schema_id
INNER JOIN sys.default_constraints
ON all_columns.default_object_id = default_constraints.object_id
WHERE schemas.name = '${schema}'
AND tables.name = '${this.tableNameRaw}'
AND all_columns.name = '${column}')
IF @constraint IS NOT NULL EXEC('ALTER TABLE ${this.tableNameRaw} DROP CONSTRAINT ' + @constraint)`;
this.pushQuery(baseQuery);
}
this.pushQuery(
(this.lowerCase ? 'alter table ' : 'ALTER TABLE ') +
this.tableName() +
' ' +
this.dropColumnPrefix +
drops.join(', ')
);
}
changeType() {}
// Renames a column on the table.
renameColumn(from, to) {
this.pushQuery(
`exec sp_rename ${this.client.parameter(
this.tableName() + '.' + from,
this.tableBuilder,
this.bindingsHolder
)}, ${this.client.parameter(
to,
this.tableBuilder,
this.bindingsHolder
)}, 'COLUMN'`
);
}
dropFKRefs(runner, refs) {
const formatter = this.client.formatter(this.tableBuilder);
return Promise.all(
refs.map(function (ref) {
const constraintName = formatter.wrap(ref.CONSTRAINT_NAME);
const tableName = formatter.wrap(ref.TABLE_NAME);
return runner.query({
sql: `ALTER TABLE ${tableName} DROP CONSTRAINT ${constraintName}`,
});
})
);
}
createFKRefs(runner, refs) {
const formatter = this.client.formatter(this.tableBuilder);
return Promise.all(
refs.map(function (ref) {
const tableName = formatter.wrap(ref.TABLE_NAME);
const keyName = formatter.wrap(ref.CONSTRAINT_NAME);
const column = formatter.columnize(ref.COLUMN_NAME);
const references = formatter.columnize(ref.REFERENCED_COLUMN_NAME);
const inTable = formatter.wrap(ref.REFERENCED_TABLE_NAME);
const onUpdate = ` ON UPDATE ${ref.UPDATE_RULE}`;
const onDelete = ` ON DELETE ${ref.DELETE_RULE}`;
return runner.query({
sql:
`ALTER TABLE ${tableName} ADD CONSTRAINT ${keyName}` +
' FOREIGN KEY (' +
column +
') REFERENCES ' +
inTable +
' (' +
references +
')' +
onUpdate +
onDelete,
});
})
);
}
index(columns, indexName, options) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('index', this.tableNameRaw, columns);
let predicate;
if (isObject(options)) {
({ predicate } = options);
}
const predicateQuery = predicate
? ' ' + this.client.queryCompiler(predicate).where()
: '';
this.pushQuery(
`CREATE INDEX ${indexName} ON ${this.tableName()} (${this.formatter.columnize(
columns
)})${predicateQuery}`
);
}
/**
* Create a primary key.
*
* @param {undefined | string | string[]} columns
* @param {string | {constraintName: string, deferrable?: 'not deferrable'|'deferred'|'immediate' }} constraintName
*/
primary(columns, constraintName) {
let deferrable;
if (isObject(constraintName)) {
({ constraintName, deferrable } = constraintName);
}
if (deferrable && deferrable !== 'not deferrable') {
this.client.logger.warn(
`mssql: primary key constraint [${constraintName}] will not be deferrable ${deferrable} because mssql does not support deferred constraints.`
);
}
constraintName = constraintName
? this.formatter.wrap(constraintName)
: this.formatter.wrap(`${this.tableNameRaw}_pkey`);
if (!this.forCreate) {
this.pushQuery(
`ALTER TABLE ${this.tableName()} ADD CONSTRAINT ${constraintName} PRIMARY KEY (${this.formatter.columnize(
columns
)})`
);
} else {
this.pushQuery(
`CONSTRAINT ${constraintName} PRIMARY KEY (${this.formatter.columnize(
columns
)})`
);
}
}
/**
* Create a unique index.
*
* @param {string | string[]} columns
* @param {string | {indexName: undefined | string, deferrable?: 'not deferrable'|'deferred'|'immediate', useConstraint?: true|false, predicate?: QueryBuilder }} indexName
*/
unique(columns, indexName) {
/** @type {string | undefined} */
let deferrable;
let useConstraint = false;
let predicate;
if (isObject(indexName)) {
({ indexName, deferrable, useConstraint, predicate } = indexName);
}
if (deferrable && deferrable !== 'not deferrable') {
this.client.logger.warn(
`mssql: unique index [${indexName}] will not be deferrable ${deferrable} because mssql does not support deferred constraints.`
);
}
if (useConstraint && predicate) {
throw new Error('mssql cannot create constraint with predicate');
}
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, columns);
if (!Array.isArray(columns)) {
columns = [columns];
}
if (useConstraint) {
// mssql supports both unique indexes and unique constraints.
// unique indexes cannot be used with foreign key relationships, hence unique constraints are used instead.
this.pushQuery(
`ALTER TABLE ${this.tableName()} ADD CONSTRAINT ${indexName} UNIQUE (${this.formatter.columnize(
columns
)})`
);
} else {
// default to creating a unique index that allows nulls https://stackoverflow.com/a/767702/360060
// to be more or less compatible with other DBs (if any of the columns is NULL, "duplicates" are allowed)
const predicateQuery = predicate
? ' ' + this.client.queryCompiler(predicate).where()
: ' WHERE ' +
columns
.map((column) => this.formatter.columnize(column) + ' IS NOT NULL')
.join(' AND ');
this.pushQuery(
`CREATE UNIQUE INDEX ${indexName} ON ${this.tableName()} (${this.formatter.columnize(
columns
)})${predicateQuery}`
);
}
}
// Compile a drop index command.
dropIndex(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('index', this.tableNameRaw, columns);
this.pushQuery(`DROP INDEX ${indexName} ON ${this.tableName()}`);
}
// Compile a drop foreign key command.
dropForeign(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('foreign', this.tableNameRaw, columns);
this.pushQuery(
`ALTER TABLE ${this.tableName()} DROP CONSTRAINT ${indexName}`
);
}
// Compile a drop primary key command.
dropPrimary(constraintName) {
constraintName = constraintName
? this.formatter.wrap(constraintName)
: this.formatter.wrap(`${this.tableNameRaw}_pkey`);
this.pushQuery(
`ALTER TABLE ${this.tableName()} DROP CONSTRAINT ${constraintName}`
);
}
// Compile a drop unique key command.
dropUnique(column, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, column);
this.pushQuery(`DROP INDEX ${indexName} ON ${this.tableName()}`);
}
}
TableCompiler_MSSQL.prototype.createAlterTableMethods = ['foreign', 'primary'];
TableCompiler_MSSQL.prototype.lowerCase = false;
TableCompiler_MSSQL.prototype.addColumnsPrefix = 'ADD ';
TableCompiler_MSSQL.prototype.dropColumnPrefix = 'DROP COLUMN ';
TableCompiler_MSSQL.prototype.alterColumnPrefix = 'ALTER COLUMN ';
module.exports = TableCompiler_MSSQL;

View File

@ -0,0 +1,55 @@
/* eslint max-len: 0 */
const ViewCompiler = require('../../../schema/viewcompiler.js');
const {
columnize: columnize_,
} = require('../../../formatter/wrappingFormatter');
class ViewCompiler_MSSQL extends ViewCompiler {
constructor(client, viewCompiler) {
super(client, viewCompiler);
}
createQuery(columns, selectQuery, materialized, replace) {
const createStatement = 'CREATE ' + (replace ? 'OR ALTER ' : '') + 'VIEW ';
let sql = createStatement + this.viewName();
const columnList = columns
? ' (' +
columnize_(
columns,
this.viewBuilder,
this.client,
this.bindingsHolder
) +
')'
: '';
sql += columnList;
sql += ' AS ';
sql += selectQuery.toString();
this.pushQuery({
sql,
});
}
renameColumn(from, to) {
this.pushQuery(
`exec sp_rename ${this.client.parameter(
this.viewName() + '.' + from,
this.viewBuilder,
this.bindingsHolder
)}, ${this.client.parameter(
to,
this.viewBuilder,
this.bindingsHolder
)}, 'COLUMN'`
);
}
createOrReplace() {
this.createQuery(this.columns, this.selectQuery, false, true);
}
}
module.exports = ViewCompiler_MSSQL;

View File

@ -0,0 +1,176 @@
const Transaction = require('../../execution/transaction');
const debug = require('debug')('knex:tx');
class Transaction_MSSQL extends Transaction {
begin(/** @type {import('tedious').Connection} */ conn) {
debug('transaction::begin id=%s', this.txid);
return new Promise((resolve, reject) => {
conn.beginTransaction(
(err) => {
if (err) {
debug(
'transaction::begin error id=%s message=%s',
this.txid,
err.message
);
return reject(err);
}
resolve();
},
this.outerTx ? this.txid : undefined,
nameToIsolationLevelEnum(this.isolationLevel)
);
}).then(this._resolver, this._rejecter);
}
savepoint(conn) {
debug('transaction::savepoint id=%s', this.txid);
return new Promise((resolve, reject) => {
conn.saveTransaction(
(err) => {
if (err) {
debug(
'transaction::savepoint id=%s message=%s',
this.txid,
err.message
);
return reject(err);
}
this.trxClient.emit('query', {
__knexUid: this.trxClient.__knexUid,
__knexTxId: this.trxClient.__knexTxId,
autogenerated: true,
sql: this.outerTx
? `SAVE TRANSACTION [${this.txid}]`
: `SAVE TRANSACTION`,
});
resolve();
},
this.outerTx ? this.txid : undefined
);
});
}
commit(conn, value) {
debug('transaction::commit id=%s', this.txid);
return new Promise((resolve, reject) => {
conn.commitTransaction(
(err) => {
if (err) {
debug(
'transaction::commit error id=%s message=%s',
this.txid,
err.message
);
return reject(err);
}
this._completed = true;
resolve(value);
},
this.outerTx ? this.txid : undefined
);
}).then(() => this._resolver(value), this._rejecter);
}
release(conn, value) {
return this._resolver(value);
}
rollback(conn, error) {
this._completed = true;
debug('transaction::rollback id=%s', this.txid);
return new Promise((_resolve, reject) => {
if (!conn.inTransaction) {
return reject(
error || new Error('Transaction rejected with non-error: undefined')
);
}
if (conn.state.name !== 'LoggedIn') {
return reject(
new Error(
"Can't rollback transaction. There is a request in progress"
)
);
}
conn.rollbackTransaction(
(err) => {
if (err) {
debug(
'transaction::rollback error id=%s message=%s',
this.txid,
err.message
);
}
reject(
err ||
error ||
new Error('Transaction rejected with non-error: undefined')
);
},
this.outerTx ? this.txid : undefined
);
}).catch((err) => {
if (!error && this.doNotRejectOnRollback) {
this._resolver();
return;
}
if (error) {
try {
err.originalError = error;
} catch (_err) {
// This is to handle https://github.com/knex/knex/issues/4128
}
}
this._rejecter(err);
});
}
rollbackTo(conn, error) {
return this.rollback(conn, error).then(
() =>
void this.trxClient.emit('query', {
__knexUid: this.trxClient.__knexUid,
__knexTxId: this.trxClient.__knexTxId,
autogenerated: true,
sql: `ROLLBACK TRANSACTION`,
})
);
}
}
module.exports = Transaction_MSSQL;
function nameToIsolationLevelEnum(level) {
if (!level) return;
level = level.toUpperCase().replace(' ', '_');
const knownEnum = isolationEnum[level];
if (!knownEnum) {
throw new Error(
`Unknown Isolation level, was expecting one of: ${JSON.stringify(
humanReadableKeys
)}`
);
}
return knownEnum;
}
// Based on: https://github.com/tediousjs/node-mssql/blob/master/lib/isolationlevel.js
const isolationEnum = {
READ_UNCOMMITTED: 0x01,
READ_COMMITTED: 0x02,
REPEATABLE_READ: 0x03,
SERIALIZABLE: 0x04,
SNAPSHOT: 0x05,
};
const humanReadableKeys = Object.keys(isolationEnum).map((key) =>
key.toLowerCase().replace('_', ' ')
);
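// Example: nameToIsolationLevelEnum('read committed') normalizes the name to
// 'READ_COMMITTED' and returns 0x02; unknown levels throw, listing the
// human-readable keys above.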

View File

@ -0,0 +1,206 @@
// MySQL Client
// -------
const defer = require('lodash/defer');
const map = require('lodash/map');
const { promisify } = require('util');
const Client = require('../../client');
const Transaction = require('./transaction');
const QueryBuilder = require('./query/mysql-querybuilder');
const QueryCompiler = require('./query/mysql-querycompiler');
const SchemaCompiler = require('./schema/mysql-compiler');
const TableCompiler = require('./schema/mysql-tablecompiler');
const ColumnCompiler = require('./schema/mysql-columncompiler');
const { makeEscape } = require('../../util/string');
const ViewCompiler = require('./schema/mysql-viewcompiler');
const ViewBuilder = require('./schema/mysql-viewbuilder');
// Always initialize with the "QueryBuilder" and "QueryCompiler"
// objects, which extend the base 'lib/query/builder' and
// 'lib/query/compiler', respectively.
class Client_MySQL extends Client {
_driver() {
return require('mysql');
}
queryBuilder() {
return new QueryBuilder(this);
}
queryCompiler(builder, formatter) {
return new QueryCompiler(this, builder, formatter);
}
schemaCompiler() {
return new SchemaCompiler(this, ...arguments);
}
tableCompiler() {
return new TableCompiler(this, ...arguments);
}
viewCompiler() {
return new ViewCompiler(this, ...arguments);
}
viewBuilder() {
return new ViewBuilder(this, ...arguments);
}
columnCompiler() {
return new ColumnCompiler(this, ...arguments);
}
transaction() {
return new Transaction(this, ...arguments);
}
wrapIdentifierImpl(value) {
return value !== '*' ? `\`${value.replace(/`/g, '``')}\`` : '*';
}
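// e.g. wrapIdentifierImpl('order') -> `order`, wrapIdentifierImpl('a`b') -> `a``b`,
// and '*' is passed through unquoted.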
// Get a raw connection, called by the `pool` whenever a new
// connection needs to be added to the pool.
acquireRawConnection() {
return new Promise((resolver, rejecter) => {
const connection = this.driver.createConnection(this.connectionSettings);
connection.on('error', (err) => {
connection.__knex__disposed = err;
});
connection.connect((err) => {
if (err) {
// if connection is rejected, remove listener that was registered above...
connection.removeAllListeners();
return rejecter(err);
}
resolver(connection);
});
});
}
// Used to explicitly close a connection, called internally by the pool
// when a connection times out or the pool is shut down.
async destroyRawConnection(connection) {
try {
const end = promisify((cb) => connection.end(cb));
return await end();
} catch (err) {
connection.__knex__disposed = err;
} finally {
// see discussion https://github.com/knex/knex/pull/3483
defer(() => connection.removeAllListeners());
}
}
validateConnection(connection) {
return (
connection.state === 'connected' || connection.state === 'authenticated'
);
}
// Grab a connection, run the query via the MySQL streaming interface,
// and pass that through to the stream we've sent back to the client.
_stream(connection, obj, stream, options) {
if (!obj.sql) throw new Error('The query is empty');
options = options || {};
const queryOptions = Object.assign({ sql: obj.sql }, obj.options);
return new Promise((resolver, rejecter) => {
stream.on('error', rejecter);
stream.on('end', resolver);
const queryStream = connection
.query(queryOptions, obj.bindings)
.stream(options);
queryStream.on('error', (err) => {
rejecter(err);
stream.emit('error', err);
});
queryStream.pipe(stream);
});
}
// Runs the query on the specified connection, providing the bindings
// and any other necessary prep work.
_query(connection, obj) {
if (!obj || typeof obj === 'string') obj = { sql: obj };
if (!obj.sql) throw new Error('The query is empty');
return new Promise(function (resolver, rejecter) {
if (!obj.sql) {
resolver();
return;
}
const queryOptions = Object.assign({ sql: obj.sql }, obj.options);
connection.query(
queryOptions,
obj.bindings,
function (err, rows, fields) {
if (err) return rejecter(err);
obj.response = [rows, fields];
resolver(obj);
}
);
});
}
// Process the response as returned from the query.
processResponse(obj, runner) {
if (obj == null) return;
const { response } = obj;
const { method } = obj;
const rows = response[0];
const fields = response[1];
if (obj.output) return obj.output.call(runner, rows, fields);
switch (method) {
case 'select':
return rows;
case 'first':
return rows[0];
case 'pluck':
return map(rows, obj.pluck);
case 'insert':
return [rows.insertId];
case 'del':
case 'update':
case 'counter':
return rows.affectedRows;
default:
return response;
}
}
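// e.g. a "select" resolves to the row array, "first" to the first row,
// "insert" to [insertId], and "update"/"del" to affectedRows.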
async cancelQuery(connectionToKill) {
const conn = await this.acquireRawConnection();
try {
return await this._wrappedCancelQueryCall(conn, connectionToKill);
} finally {
await this.destroyRawConnection(conn);
if (conn.__knex__disposed) {
this.logger.warn(`Connection Error: ${conn.__knex__disposed}`);
}
}
}
_wrappedCancelQueryCall(conn, connectionToKill) {
return this._query(conn, {
sql: 'KILL QUERY ?',
bindings: [connectionToKill.threadId],
options: {},
});
}
}
Object.assign(Client_MySQL.prototype, {
dialect: 'mysql',
driverName: 'mysql',
_escapeBinding: makeEscape(),
canCancelQuery: true,
});
module.exports = Client_MySQL;

View File

@ -0,0 +1,14 @@
const QueryBuilder = require('../../../query/querybuilder');
const isEmpty = require('lodash/isEmpty');
module.exports = class QueryBuilder_MySQL extends QueryBuilder {
upsert(values, returning, options) {
this._method = 'upsert';
if (!isEmpty(returning)) {
this.returning(returning, options);
}
this._single.upsert = values;
return this;
}
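// Usage sketch (hypothetical table): knex('users').upsert({ id: 1, name: 'ann' })
// marks the query as an upsert; the MySQL query compiler then renders it as a
// "replace into `users` (`id`, `name`) values (?, ?)" statement.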
};

View File

@ -0,0 +1,292 @@
// MySQL Query Compiler
// ------
const assert = require('assert');
const identity = require('lodash/identity');
const isPlainObject = require('lodash/isPlainObject');
const isEmpty = require('lodash/isEmpty');
const QueryCompiler = require('../../../query/querycompiler');
const { wrapAsIdentifier } = require('../../../formatter/formatterUtils');
const {
columnize: columnize_,
wrap: wrap_,
} = require('../../../formatter/wrappingFormatter');
const isPlainObjectOrArray = (value) =>
isPlainObject(value) || Array.isArray(value);
class QueryCompiler_MySQL extends QueryCompiler {
constructor(client, builder, formatter) {
super(client, builder, formatter);
const { returning } = this.single;
if (returning) {
this.client.logger.warn(
'.returning() is not supported by mysql and will not have any effect.'
);
}
this._emptyInsertValue = '() values ()';
}
// Compiles a `delete` query, allowing comments
del() {
const sql = super.del();
if (sql === '') return sql;
const comments = this.comments();
return (comments === '' ? '' : comments + ' ') + sql;
}
// Compiles an `insert` query, allowing for multiple
// inserts using a single query statement.
insert() {
let sql = super.insert();
if (sql === '') return sql;
const comments = this.comments();
sql = (comments === '' ? '' : comments + ' ') + sql;
const { ignore, merge, insert } = this.single;
if (ignore) sql = sql.replace('insert into', 'insert ignore into');
if (merge) {
sql += this._merge(merge.updates, insert);
const wheres = this.where();
if (wheres) {
throw new Error(
'.onConflict().merge().where() is not supported for mysql'
);
}
}
return sql;
}
upsert() {
const upsertValues = this.single.upsert || [];
const sql = this.with() + `replace into ${this.tableName} `;
const body = this._insertBody(upsertValues);
return body === '' ? '' : sql + body;
}
// Compiles merge for onConflict, allowing for different merge strategies
_merge(updates, insert) {
const sql = ' on duplicate key update ';
if (updates && Array.isArray(updates)) {
// update subset of columns
return (
sql +
updates
.map((column) =>
wrapAsIdentifier(column, this.formatter.builder, this.client)
)
.map((column) => `${column} = values(${column})`)
.join(', ')
);
} else if (updates && typeof updates === 'object') {
const updateData = this._prepUpdate(updates);
return sql + updateData.join(',');
} else {
const insertData = this._prepInsert(insert);
if (typeof insertData === 'string') {
throw new Error(
'If using merge with a raw insert query, then updates must be provided'
);
}
return (
sql +
insertData.columns
.map((column) => wrapAsIdentifier(column, this.builder, this.client))
.map((column) => `${column} = values(${column})`)
.join(', ')
);
}
}
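// Illustration (hypothetical table): knex('users').insert(row).onConflict().merge(['name'])
// appends " on duplicate key update `name` = values(`name`)" to the compiled insert.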
// Update method, including joins, wheres, order & limits.
update() {
const comments = this.comments();
const withSQL = this.with();
const join = this.join();
const updates = this._prepUpdate(this.single.update);
const where = this.where();
const order = this.order();
const limit = this.limit();
return (
(comments === '' ? '' : comments + ' ') +
withSQL +
`update ${this.tableName}` +
(join ? ` ${join}` : '') +
' set ' +
updates.join(', ') +
(where ? ` ${where}` : '') +
(order ? ` ${order}` : '') +
(limit ? ` ${limit}` : '')
);
}
forUpdate() {
return 'for update';
}
forShare() {
return 'lock in share mode';
}
// Only supported on MySQL 8.0+
skipLocked() {
return 'skip locked';
}
// Supported on MySQL 8.0+ and MariaDB 10.3.0+
noWait() {
return 'nowait';
}
// Compiles a `columnInfo` query.
columnInfo() {
const column = this.single.columnInfo;
// The user may have specified a custom wrapIdentifier function in the config. We
// need to run the identifiers through that function, but not otherwise format
// them as identifiers.
const table = this.client.customWrapIdentifier(this.single.table, identity);
return {
sql: 'select * from information_schema.columns where table_name = ? and table_schema = ?',
bindings: [table, this.client.database()],
output(resp) {
const out = resp.reduce(function (columns, val) {
columns[val.COLUMN_NAME] = {
defaultValue:
val.COLUMN_DEFAULT === 'NULL' ? null : val.COLUMN_DEFAULT,
type: val.DATA_TYPE,
maxLength: val.CHARACTER_MAXIMUM_LENGTH,
nullable: val.IS_NULLABLE === 'YES',
};
return columns;
}, {});
return (column && out[column]) || out;
},
};
}
limit() {
const noLimit = !this.single.limit && this.single.limit !== 0;
if (noLimit && !this.single.offset) return '';
// Workaround for offset only.
// see: http://stackoverflow.com/questions/255517/mysql-offset-infinite-rows
const limit =
this.single.offset && noLimit
? '18446744073709551615'
: this._getValueOrParameterFromAttribute('limit');
return `limit ${limit}`;
}
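// e.g. offset(20) with no limit renders "limit 18446744073709551615" here and the
// base compiler appends the offset clause, since MySQL has no offset-only syntax.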
whereBasic(statement) {
assert(
!isPlainObjectOrArray(statement.value),
'The values in where clause must not be object or array.'
);
return super.whereBasic(statement);
}
whereRaw(statement) {
assert(
isEmpty(statement.value.bindings) ||
!Object.values(statement.value.bindings).some(isPlainObjectOrArray),
'The values in where clause must not be object or array.'
);
return super.whereRaw(statement);
}
whereLike(statement) {
return `${this._columnClause(statement)} ${this._not(
statement,
'like '
)}${this._valueClause(statement)} COLLATE utf8_bin`;
}
whereILike(statement) {
return `${this._columnClause(statement)} ${this._not(
statement,
'like '
)}${this._valueClause(statement)}`;
}
// Json functions
jsonExtract(params) {
return this._jsonExtract(['json_extract', 'json_unquote'], params);
}
jsonSet(params) {
return this._jsonSet('json_set', params);
}
jsonInsert(params) {
return this._jsonSet('json_insert', params);
}
jsonRemove(params) {
const jsonCol = `json_remove(${columnize_(
params.column,
this.builder,
this.client,
this.bindingsHolder
)},${this.client.parameter(
params.path,
this.builder,
this.bindingsHolder
)})`;
return params.alias
? this.client.alias(jsonCol, this.formatter.wrap(params.alias))
: jsonCol;
}
whereJsonObject(statement) {
return this._not(
statement,
`json_contains(${this._columnClause(statement)}, ${this._jsonValueClause(
statement
)})`
);
}
whereJsonPath(statement) {
return this._whereJsonPath('json_extract', statement);
}
whereJsonSupersetOf(statement) {
return this._not(
statement,
`json_contains(${wrap_(
statement.column,
undefined,
this.builder,
this.client,
this.bindingsHolder
)},${this._jsonValueClause(statement)})`
);
}
whereJsonSubsetOf(statement) {
return this._not(
statement,
`json_contains(${this._jsonValueClause(statement)},${wrap_(
statement.column,
undefined,
this.builder,
this.client,
this.bindingsHolder
)})`
);
}
onJsonPathEquals(clause) {
return this._onJsonPathEquals('json_extract', clause);
}
}
// Set the QueryBuilder & QueryCompiler on the client object,
// in case anyone wants to modify things to suit their own purposes.
module.exports = QueryCompiler_MySQL;

View File

@ -0,0 +1,193 @@
// MySQL Column Compiler
// -------
const ColumnCompiler = require('../../../schema/columncompiler');
const { isObject } = require('../../../util/is');
const { toNumber } = require('../../../util/helpers');
const commentEscapeRegex = /(?<!\\)'/g;
class ColumnCompiler_MySQL extends ColumnCompiler {
constructor(client, tableCompiler, columnBuilder) {
super(client, tableCompiler, columnBuilder);
this.modifiers = [
'unsigned',
'nullable',
'defaultTo',
'comment',
'collate',
'first',
'after',
];
this._addCheckModifiers();
}
// Types
// ------
double(precision, scale) {
if (!precision) return 'double';
return `double(${toNumber(precision, 8)}, ${toNumber(scale, 2)})`;
}
integer(length) {
length = length ? `(${toNumber(length, 11)})` : '';
return `int${length}`;
}
tinyint(length) {
length = length ? `(${toNumber(length, 1)})` : '';
return `tinyint${length}`;
}
text(column) {
switch (column) {
case 'medium':
case 'mediumtext':
return 'mediumtext';
case 'long':
case 'longtext':
return 'longtext';
default:
return 'text';
}
}
mediumtext() {
return this.text('medium');
}
longtext() {
return this.text('long');
}
enu(allowed) {
return `enum('${allowed.join("', '")}')`;
}
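// e.g. enu(['open', 'closed']) -> "enum('open', 'closed')"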
datetime(precision) {
if (isObject(precision)) {
({ precision } = precision);
}
return typeof precision === 'number'
? `datetime(${precision})`
: 'datetime';
}
timestamp(precision) {
if (isObject(precision)) {
({ precision } = precision);
}
return typeof precision === 'number'
? `timestamp(${precision})`
: 'timestamp';
}
time(precision) {
if (isObject(precision)) {
({ precision } = precision);
}
return typeof precision === 'number' ? `time(${precision})` : 'time';
}
bit(length) {
return length ? `bit(${toNumber(length)})` : 'bit';
}
binary(length) {
return length ? `varbinary(${toNumber(length)})` : 'blob';
}
json() {
return 'json';
}
jsonb() {
return 'json';
}
// Modifiers
// ------
defaultTo(value) {
// MySQL already defaults columns to null, but breaks down if you pass null explicitly
// Note that in MySQL versions up to 5.7, logic related to updating
// timestamps when no explicit value is passed is quite insane - https://dev.mysql.com/doc/refman/5.7/en/server-system-variables.html#sysvar_explicit_defaults_for_timestamp
if (value === null || value === undefined) {
return;
}
if ((this.type === 'json' || this.type === 'jsonb') && isObject(value)) {
// A default value for a json column works only if it is an expression
return `default ('${JSON.stringify(value)}')`;
}
const defaultVal = super.defaultTo.apply(this, arguments);
if (this.type !== 'blob' && this.type.indexOf('text') === -1) {
return defaultVal;
}
return '';
}
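// e.g. defaultTo({ flag: true }) on a json column renders "default ('{"flag":true}')",
// since MySQL accepts defaults for json columns only as expressions.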
unsigned() {
return 'unsigned';
}
comment(comment) {
if (comment && comment.length > 255) {
this.client.logger.warn(
'Your comment is longer than the max comment length for MySQL'
);
}
return comment && `comment '${comment.replace(commentEscapeRegex, "\\'")}'`;
}
first() {
return 'first';
}
after(column) {
return `after ${this.formatter.wrap(column)}`;
}
collate(collation) {
return collation && `collate '${collation}'`;
}
checkRegex(regex, constraintName) {
return this._check(
`${this.formatter.wrap(
this.getColumnName()
)} REGEXP ${this.client._escapeBinding(regex)}`,
constraintName
);
}
increments(options = { primaryKey: true }) {
return (
'int unsigned not null' +
// In MySQL an auto-increment column must always be (part of) the primary key. If you already have a primary key, we
// initialize this column as a classic int column, then modify it later in the table compiler
(this.tableCompiler._canBeAddPrimaryKey(options)
? ' auto_increment primary key'
: '')
);
}
bigincrements(options = { primaryKey: true }) {
return (
'bigint unsigned not null' +
// In MySQL an auto-increment column must always be (part of) the primary key. If you already have a primary key, we
// initialize this column as a classic int column, then modify it later in the table compiler
(this.tableCompiler._canBeAddPrimaryKey(options)
? ' auto_increment primary key'
: '')
);
}
}
ColumnCompiler_MySQL.prototype.bigint = 'bigint';
ColumnCompiler_MySQL.prototype.mediumint = 'mediumint';
ColumnCompiler_MySQL.prototype.smallint = 'smallint';
module.exports = ColumnCompiler_MySQL;

View File

@ -0,0 +1,60 @@
// MySQL Schema Compiler
// -------
const SchemaCompiler = require('../../../schema/compiler');
class SchemaCompiler_MySQL extends SchemaCompiler {
constructor(client, builder) {
super(client, builder);
}
// Rename a table on the schema.
renameTable(tableName, to) {
this.pushQuery(
`rename table ${this.formatter.wrap(tableName)} to ${this.formatter.wrap(
to
)}`
);
}
renameView(from, to) {
this.renameTable(from, to);
}
// Check whether a table exists on the query.
hasTable(tableName) {
let sql = 'select * from information_schema.tables where table_name = ?';
const bindings = [tableName];
if (this.schema) {
sql += ' and table_schema = ?';
bindings.push(this.schema);
} else {
sql += ' and table_schema = database()';
}
this.pushQuery({
sql,
bindings,
output: function output(resp) {
return resp.length > 0;
},
});
}
// Check whether a column exists on the schema.
hasColumn(tableName, column) {
this.pushQuery({
sql: `show columns from ${this.formatter.wrap(tableName)}`,
output(resp) {
return resp.some((row) => {
return (
this.client.wrapIdentifier(row.Field.toLowerCase()) ===
this.client.wrapIdentifier(column.toLowerCase())
);
});
},
});
}
}
module.exports = SchemaCompiler_MySQL;

View File

@ -0,0 +1,405 @@
/* eslint max-len:0*/
// MySQL Table Builder & Compiler
// -------
const TableCompiler = require('../../../schema/tablecompiler');
const { isObject, isString } = require('../../../util/is');
// Table Compiler
// ------
class TableCompiler_MySQL extends TableCompiler {
constructor(client, tableBuilder) {
super(client, tableBuilder);
}
createQuery(columns, ifNot, like) {
const createStatement = ifNot
? 'create table if not exists '
: 'create table ';
const { client } = this;
let conn = {};
let columnsSql = ' (' + columns.sql.join(', ');
columnsSql += this.primaryKeys() || '';
columnsSql += this._addChecks();
columnsSql += ')';
let sql =
createStatement +
this.tableName() +
(like && this.tableNameLike()
? ' like ' + this.tableNameLike()
: columnsSql);
// Check if the connection settings are set.
if (client.connectionSettings) {
conn = client.connectionSettings;
}
const charset = this.single.charset || conn.charset || '';
const collation = this.single.collate || conn.collate || '';
const engine = this.single.engine || '';
if (charset && !like) sql += ` default character set ${charset}`;
if (collation) sql += ` collate ${collation}`;
if (engine) sql += ` engine = ${engine}`;
if (this.single.comment) {
const comment = this.single.comment || '';
const MAX_COMMENT_LENGTH = 1024;
if (comment.length > MAX_COMMENT_LENGTH)
this.client.logger.warn(
`The max length for a table comment is ${MAX_COMMENT_LENGTH} characters`
);
sql += ` comment = '${comment}'`;
}
this.pushQuery(sql);
if (like) {
this.addColumns(columns, this.addColumnsPrefix);
}
}
// Compiles the comment on the table.
comment(comment) {
this.pushQuery(`alter table ${this.tableName()} comment = '${comment}'`);
}
changeType() {
// alter table + table + ' modify ' + wrapped + '// type';
}
// Renames a column on the table.
renameColumn(from, to) {
const compiler = this;
const table = this.tableName();
const wrapped = this.formatter.wrap(from) + ' ' + this.formatter.wrap(to);
this.pushQuery({
sql:
`show full fields from ${table} where field = ` +
this.client.parameter(from, this.tableBuilder, this.bindingsHolder),
output(resp) {
const column = resp[0];
const runner = this;
return compiler.getFKRefs(runner).then(([refs]) =>
new Promise((resolve, reject) => {
try {
if (!refs.length) {
resolve();
}
resolve(compiler.dropFKRefs(runner, refs));
} catch (e) {
reject(e);
}
})
.then(function () {
let sql = `alter table ${table} change ${wrapped} ${column.Type}`;
if (String(column.Null).toUpperCase() !== 'YES') {
sql += ` NOT NULL`;
} else {
// This doesn't matter in most cases, except for TIMESTAMP columns, where it is important
sql += ` NULL`;
}
if (column.Default !== void 0 && column.Default !== null) {
sql += ` DEFAULT '${column.Default}'`;
}
if (column.Collation !== void 0 && column.Collation !== null) {
sql += ` COLLATE '${column.Collation}'`;
}
// Add back the auto increment if the column had it (fixes issue #2767)
if (column.Extra == 'auto_increment') {
sql += ` AUTO_INCREMENT`;
}
return runner.query({
sql,
});
})
.then(function () {
if (!refs.length) {
return;
}
return compiler.createFKRefs(
runner,
refs.map(function (ref) {
if (ref.REFERENCED_COLUMN_NAME === from) {
ref.REFERENCED_COLUMN_NAME = to;
}
if (ref.COLUMN_NAME === from) {
ref.COLUMN_NAME = to;
}
return ref;
})
);
})
);
},
});
}
primaryKeys() {
const pks = (this.grouped.alterTable || []).filter(
(k) => k.method === 'primary'
);
if (pks.length > 0 && pks[0].args.length > 0) {
const columns = pks[0].args[0];
let constraintName = pks[0].args[1] || '';
if (constraintName) {
constraintName = ' constraint ' + this.formatter.wrap(constraintName);
}
if (this.grouped.columns) {
const incrementsCols = this._getIncrementsColumnNames();
if (incrementsCols.length) {
incrementsCols.forEach((c) => {
if (!columns.includes(c)) {
columns.unshift(c);
}
});
}
const bigIncrementsCols = this._getBigIncrementsColumnNames();
if (bigIncrementsCols.length) {
bigIncrementsCols.forEach((c) => {
if (!columns.includes(c)) {
columns.unshift(c);
}
});
}
}
return `,${constraintName} primary key (${this.formatter.columnize(
columns
)})`;
}
}
getFKRefs(runner) {
const bindingsHolder = {
bindings: [],
};
const sql =
'SELECT KCU.CONSTRAINT_NAME, KCU.TABLE_NAME, KCU.COLUMN_NAME, ' +
' KCU.REFERENCED_TABLE_NAME, KCU.REFERENCED_COLUMN_NAME, ' +
' RC.UPDATE_RULE, RC.DELETE_RULE ' +
'FROM INFORMATION_SCHEMA.KEY_COLUMN_USAGE AS KCU ' +
'JOIN INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS AS RC ' +
' USING(CONSTRAINT_NAME)' +
'WHERE KCU.REFERENCED_TABLE_NAME = ' +
this.client.parameter(
this.tableNameRaw,
this.tableBuilder,
bindingsHolder
) +
' ' +
' AND KCU.CONSTRAINT_SCHEMA = ' +
this.client.parameter(
this.client.database(),
this.tableBuilder,
bindingsHolder
) +
' ' +
' AND RC.CONSTRAINT_SCHEMA = ' +
this.client.parameter(
this.client.database(),
this.tableBuilder,
bindingsHolder
);
return runner.query({
sql,
bindings: bindingsHolder.bindings,
});
}
dropFKRefs(runner, refs) {
const formatter = this.client.formatter(this.tableBuilder);
return Promise.all(
refs.map(function (ref) {
const constraintName = formatter.wrap(ref.CONSTRAINT_NAME);
const tableName = formatter.wrap(ref.TABLE_NAME);
return runner.query({
sql: `alter table ${tableName} drop foreign key ${constraintName}`,
});
})
);
}
createFKRefs(runner, refs) {
const formatter = this.client.formatter(this.tableBuilder);
return Promise.all(
refs.map(function (ref) {
const tableName = formatter.wrap(ref.TABLE_NAME);
const keyName = formatter.wrap(ref.CONSTRAINT_NAME);
const column = formatter.columnize(ref.COLUMN_NAME);
const references = formatter.columnize(ref.REFERENCED_COLUMN_NAME);
const inTable = formatter.wrap(ref.REFERENCED_TABLE_NAME);
const onUpdate = ` ON UPDATE ${ref.UPDATE_RULE}`;
const onDelete = ` ON DELETE ${ref.DELETE_RULE}`;
return runner.query({
sql:
`alter table ${tableName} add constraint ${keyName} ` +
'foreign key (' +
column +
') references ' +
inTable +
' (' +
references +
')' +
onUpdate +
onDelete,
});
})
);
}
index(columns, indexName, options) {
let storageEngineIndexType;
let indexType;
if (isString(options)) {
indexType = options;
} else if (isObject(options)) {
({ indexType, storageEngineIndexType } = options);
}
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('index', this.tableNameRaw, columns);
storageEngineIndexType = storageEngineIndexType
? ` using ${storageEngineIndexType}`
: '';
this.pushQuery(
`alter table ${this.tableName()} add${
indexType ? ` ${indexType}` : ''
} index ${indexName}(${this.formatter.columnize(
columns
)})${storageEngineIndexType}`
);
}
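// e.g. table.index(['email'], 'users_email_idx', { storageEngineIndexType: 'btree' })
// compiles to: alter table `users` add index `users_email_idx`(`email`) using btree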
primary(columns, constraintName) {
let deferrable;
if (isObject(constraintName)) {
({ constraintName, deferrable } = constraintName);
}
if (deferrable && deferrable !== 'not deferrable') {
this.client.logger.warn(
`mysql: primary key constraint \`${constraintName}\` will not be deferrable ${deferrable} because mysql does not support deferred constraints.`
);
}
constraintName = constraintName
? this.formatter.wrap(constraintName)
: this.formatter.wrap(`${this.tableNameRaw}_pkey`);
const primaryCols = columns;
let incrementsCols = [];
let bigIncrementsCols = [];
if (this.grouped.columns) {
incrementsCols = this._getIncrementsColumnNames();
if (incrementsCols) {
incrementsCols.forEach((c) => {
if (!primaryCols.includes(c)) {
primaryCols.unshift(c);
}
});
}
bigIncrementsCols = this._getBigIncrementsColumnNames();
if (bigIncrementsCols) {
bigIncrementsCols.forEach((c) => {
if (!primaryCols.includes(c)) {
primaryCols.unshift(c);
}
});
}
}
if (this.method !== 'create' && this.method !== 'createIfNot') {
this.pushQuery(
`alter table ${this.tableName()} add primary key ${constraintName}(${this.formatter.columnize(
primaryCols
)})`
);
}
if (incrementsCols.length) {
this.pushQuery(
`alter table ${this.tableName()} modify column ${this.formatter.columnize(
incrementsCols
)} int unsigned not null auto_increment`
);
}
if (bigIncrementsCols.length) {
this.pushQuery(
`alter table ${this.tableName()} modify column ${this.formatter.columnize(
bigIncrementsCols
)} bigint unsigned not null auto_increment`
);
}
}
unique(columns, indexName) {
let storageEngineIndexType;
let deferrable;
if (isObject(indexName)) {
({ indexName, deferrable, storageEngineIndexType } = indexName);
}
if (deferrable && deferrable !== 'not deferrable') {
this.client.logger.warn(
`mysql: unique index \`${indexName}\` will not be deferrable ${deferrable} because mysql does not support deferred constraints.`
);
}
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, columns);
storageEngineIndexType = storageEngineIndexType
? ` using ${storageEngineIndexType}`
: '';
this.pushQuery(
`alter table ${this.tableName()} add unique ${indexName}(${this.formatter.columnize(
columns
)})${storageEngineIndexType}`
);
}
// Compile a drop index command.
dropIndex(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('index', this.tableNameRaw, columns);
this.pushQuery(`alter table ${this.tableName()} drop index ${indexName}`);
}
// Compile a drop foreign key command.
dropForeign(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('foreign', this.tableNameRaw, columns);
this.pushQuery(
`alter table ${this.tableName()} drop foreign key ${indexName}`
);
}
// Compile a drop primary key command.
dropPrimary() {
this.pushQuery(`alter table ${this.tableName()} drop primary key`);
}
// Compile a drop unique key command.
dropUnique(column, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, column);
this.pushQuery(`alter table ${this.tableName()} drop index ${indexName}`);
}
}
TableCompiler_MySQL.prototype.addColumnsPrefix = 'add ';
TableCompiler_MySQL.prototype.alterColumnsPrefix = 'modify ';
TableCompiler_MySQL.prototype.dropColumnPrefix = 'drop ';
module.exports = TableCompiler_MySQL;

View File

@ -0,0 +1,21 @@
const ViewBuilder = require('../../../schema/viewbuilder.js');
class ViewBuilder_MySQL extends ViewBuilder {
constructor() {
super(...arguments);
}
checkOption() {
this._single.checkOption = 'default_option';
}
localCheckOption() {
this._single.checkOption = 'local';
}
cascadedCheckOption() {
this._single.checkOption = 'cascaded';
}
}
module.exports = ViewBuilder_MySQL;

View File

@ -0,0 +1,15 @@
/* eslint max-len: 0 */
const ViewCompiler = require('../../../schema/viewcompiler.js');
class ViewCompiler_MySQL extends ViewCompiler {
constructor(client, viewCompiler) {
super(client, viewCompiler);
}
createOrReplace() {
this.createQuery(this.columns, this.selectQuery, false, true);
}
}
module.exports = ViewCompiler_MySQL;

View File

@ -0,0 +1,46 @@
const Transaction = require('../../execution/transaction');
const Debug = require('debug');
const debug = Debug('knex:tx');
class Transaction_MySQL extends Transaction {
query(conn, sql, status, value) {
const t = this;
const q = this.trxClient
.query(conn, sql)
.catch((err) => {
if (err.errno === 1305) {
this.trxClient.logger.warn(
'Transaction was implicitly committed, do not mix transactions and ' +
'DDL with MySQL (#805)'
);
return;
}
status = 2;
value = err;
t._completed = true;
debug('%s error running transaction query', t.txid);
})
.then(function (res) {
if (status === 1) t._resolver(value);
if (status === 2) {
if (value === undefined) {
if (t.doNotRejectOnRollback && /^ROLLBACK\b/i.test(sql)) {
t._resolver();
return;
}
value = new Error(`Transaction rejected with non-error: ${value}`);
}
t._rejecter(value);
}
return res;
});
if (status === 1 || status === 2) {
t._completed = true;
}
return q;
}
}
module.exports = Transaction_MySQL;

View File

@ -0,0 +1,53 @@
// MySQL2 Client
// -------
const Client_MySQL = require('../mysql');
const Transaction = require('./transaction');
// Always initialize with the "QueryBuilder" and "QueryCompiler"
// objects, which extend the base 'lib/query/builder' and
// 'lib/query/compiler', respectively.
class Client_MySQL2 extends Client_MySQL {
transaction() {
return new Transaction(this, ...arguments);
}
_driver() {
return require('mysql2');
}
initializeDriver() {
try {
this.driver = this._driver();
} catch (e) {
let message = `Knex: run\n$ npm install ${this.driverName}`;
const nodeMajorVersion = process.version.replace(/^v/, '').split('.')[0];
if (nodeMajorVersion <= 12) {
message += `@3.2.0`;
this.logger.error(
'Mysql2 version 3.2.0 is the latest version to support Node.js 12 or lower.'
);
}
message += ` --save`;
this.logger.error(`${message}\n${e.message}\n${e.stack}`);
throw new Error(`${message}\n${e.message}`);
}
}
validateConnection(connection) {
return (
connection &&
!connection._fatalError &&
!connection._protocolError &&
!connection._closing &&
!connection.stream.destroyed
);
}
}
Object.assign(Client_MySQL2.prototype, {
// The "dialect", for reference elsewhere.
driverName: 'mysql2',
});
module.exports = Client_MySQL2;

View File

@ -0,0 +1,44 @@
const Transaction = require('../../execution/transaction');
const debug = require('debug')('knex:tx');
class Transaction_MySQL2 extends Transaction {
query(conn, sql, status, value) {
const t = this;
const q = this.trxClient
.query(conn, sql)
.catch((err) => {
if (err.code === 'ER_SP_DOES_NOT_EXIST') {
this.trxClient.logger.warn(
'Transaction was implicitly committed, do not mix transactions and ' +
'DDL with MySQL (#805)'
);
return;
}
status = 2;
value = err;
t._completed = true;
debug('%s error running transaction query', t.txid);
})
.then(function (res) {
if (status === 1) t._resolver(value);
if (status === 2) {
if (value === undefined) {
if (t.doNotRejectOnRollback && /^ROLLBACK\b/i.test(sql)) {
t._resolver();
return;
}
value = new Error(`Transaction rejected with non-error: ${value}`);
}
t._rejecter(value);
return res;
}
});
if (status === 1 || status === 2) {
t._completed = true;
}
return q;
}
}
module.exports = Transaction_MySQL2;

View File

@ -0,0 +1,5 @@
# Warning: Dead Code
The `oracle` dialect is mostly dead code at this point. However, a handful of its methods are still referenced by the `oracledb` dialect. So, we are in the process of migrating those methods over to the `oracledb` dialect where they belong. Once that task is completed, we will officially remove the `oracle` dialect.
In short: do not use the `oracle` dialect. Use the `oracledb` dialect instead.
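For reference, a minimal sketch of pointing knex at the `oracledb` dialect (the connection values below are placeholders):

```js
const knex = require('knex')({
  client: 'oracledb',
  connection: {
    user: 'scott',
    password: 'tiger',
    connectString: 'localhost/XEPDB1',
  },
});
```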

View File

@ -0,0 +1,92 @@
// Oracle Client
// -------
const { ReturningHelper } = require('./utils');
const { isConnectionError } = require('./utils');
const Client = require('../../client');
const SchemaCompiler = require('./schema/oracle-compiler');
const ColumnBuilder = require('./schema/oracle-columnbuilder');
const ColumnCompiler = require('./schema/oracle-columncompiler');
const TableCompiler = require('./schema/oracle-tablecompiler');
// Always initialize with the "QueryBuilder" and "QueryCompiler"
// objects, which extend the base 'lib/query/builder' and
// 'lib/query/compiler', respectively.
class Client_Oracle extends Client {
schemaCompiler() {
return new SchemaCompiler(this, ...arguments);
}
columnBuilder() {
return new ColumnBuilder(this, ...arguments);
}
columnCompiler() {
return new ColumnCompiler(this, ...arguments);
}
tableCompiler() {
return new TableCompiler(this, ...arguments);
}
// Return the database for the Oracle client.
database() {
return this.connectionSettings.database;
}
// Position the bindings for the query.
positionBindings(sql) {
let questionCount = 0;
return sql.replace(/\?/g, function () {
questionCount += 1;
return `:${questionCount}`;
});
}
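// e.g. positionBindings('select * from t where a = ? and b = ?')
// -> 'select * from t where a = :1 and b = :2'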
_stream(connection, obj, stream, options) {
if (!obj.sql) throw new Error('The query is empty');
return new Promise(function (resolver, rejecter) {
stream.on('error', (err) => {
if (isConnectionError(err)) {
connection.__knex__disposed = err;
}
rejecter(err);
});
stream.on('end', resolver);
const queryStream = connection.queryStream(
obj.sql,
obj.bindings,
options
);
queryStream.pipe(stream);
queryStream.on('error', function (error) {
rejecter(error);
stream.emit('error', error);
});
});
}
// Formatter part
alias(first, second) {
return first + ' ' + second;
}
parameter(value, builder, formatter) {
// Returning helper uses always ROWID as string
if (value instanceof ReturningHelper && this.driver) {
value = new this.driver.OutParam(this.driver.OCCISTRING);
} else if (typeof value === 'boolean') {
value = value ? 1 : 0;
}
return super.parameter(value, builder, formatter);
}
}
Object.assign(Client_Oracle.prototype, {
dialect: 'oracle',
driverName: 'oracle',
});
module.exports = Client_Oracle;

View File

@ -0,0 +1,343 @@
/* eslint max-len:0 */
// Oracle Query Builder & Compiler
// ------
const compact = require('lodash/compact');
const identity = require('lodash/identity');
const isEmpty = require('lodash/isEmpty');
const isPlainObject = require('lodash/isPlainObject');
const reduce = require('lodash/reduce');
const QueryCompiler = require('../../../query/querycompiler');
const { ReturningHelper } = require('../utils');
const { isString } = require('../../../util/is');
const components = [
'comments',
'columns',
'join',
'where',
'union',
'group',
'having',
'order',
'lock',
];
// Query Compiler
// -------
// Set the "Formatter" to use for the queries,
// ensuring that all parameterized values (even across sub-queries)
// are properly built into the same query.
class QueryCompiler_Oracle extends QueryCompiler {
constructor(client, builder, formatter) {
super(client, builder, formatter);
const { onConflict } = this.single;
if (onConflict) {
throw new Error('.onConflict() is not supported for oracledb.');
}
// Compiles the `select` statement, or nested sub-selects
// by calling each of the component compilers, trimming out
// the empties, and returning a generated query string.
this.first = this.select;
}
// Compiles an "insert" query, allowing for multiple
// inserts using a single query statement.
insert() {
let insertValues = this.single.insert || [];
let { returning } = this.single;
if (!Array.isArray(insertValues) && isPlainObject(this.single.insert)) {
insertValues = [this.single.insert];
}
// always wrap returning argument in array
if (returning && !Array.isArray(returning)) {
returning = [returning];
}
if (
Array.isArray(insertValues) &&
insertValues.length === 1 &&
isEmpty(insertValues[0])
) {
return this._addReturningToSqlAndConvert(
`insert into ${this.tableName} (${this.formatter.wrap(
this.single.returning
)}) values (default)`,
returning,
this.tableName
);
}
if (
isEmpty(this.single.insert) &&
typeof this.single.insert !== 'function'
) {
return '';
}
const insertData = this._prepInsert(insertValues);
const sql = {};
if (isString(insertData)) {
return this._addReturningToSqlAndConvert(
`insert into ${this.tableName} ${insertData}`,
returning
);
}
if (insertData.values.length === 1) {
return this._addReturningToSqlAndConvert(
`insert into ${this.tableName} (${this.formatter.columnize(
insertData.columns
)}) values (${this.client.parameterize(
insertData.values[0],
undefined,
this.builder,
this.bindingsHolder
)})`,
returning,
this.tableName
);
}
const insertDefaultsOnly = insertData.columns.length === 0;
sql.sql =
'begin ' +
insertData.values
.map((value) => {
let returningHelper;
const parameterizedValues = !insertDefaultsOnly
? this.client.parameterize(
value,
this.client.valueForUndefined,
this.builder,
this.bindingsHolder
)
: '';
const returningValues = Array.isArray(returning)
? returning
: [returning];
let subSql = `insert into ${this.tableName} `;
if (returning) {
returningHelper = new ReturningHelper(returningValues.join(':'));
sql.outParams = (sql.outParams || []).concat(returningHelper);
}
if (insertDefaultsOnly) {
// no columns given so only the default value
subSql += `(${this.formatter.wrap(
this.single.returning
)}) values (default)`;
} else {
subSql += `(${this.formatter.columnize(
insertData.columns
)}) values (${parameterizedValues})`;
}
subSql += returning
? ` returning ROWID into ${this.client.parameter(
returningHelper,
this.builder,
this.bindingsHolder
)}`
: '';
// pre bind position because subSql is an execute immediate parameter
// later position binding will only convert the ? params
subSql = this.formatter.client.positionBindings(subSql);
const parameterizedValuesWithoutDefault = parameterizedValues
.replace('DEFAULT, ', '')
.replace(', DEFAULT', '');
return (
`execute immediate '${subSql.replace(/'/g, "''")}` +
(parameterizedValuesWithoutDefault || returning ? "' using " : '') +
parameterizedValuesWithoutDefault +
(parameterizedValuesWithoutDefault && returning ? ', ' : '') +
(returning ? 'out ?' : '') +
';'
);
})
.join(' ') +
'end;';
if (returning) {
sql.returning = returning;
// generate select statement with special order by to keep the order because 'in (..)' may change the order
sql.returningSql =
`select ${this.formatter.columnize(returning)}` +
' from ' +
this.tableName +
' where ROWID in (' +
sql.outParams.map((v, i) => `:${i + 1}`).join(', ') +
')' +
' order by case ROWID ' +
sql.outParams
.map((v, i) => `when CHARTOROWID(:${i + 1}) then ${i}`)
.join(' ') +
' end';
}
return sql;
}
// Update method, including joins, wheres, order & limits.
update() {
const updates = this._prepUpdate(this.single.update);
const where = this.where();
let { returning } = this.single;
const sql =
`update ${this.tableName}` +
' set ' +
updates.join(', ') +
(where ? ` ${where}` : '');
if (!returning) {
return sql;
}
// always wrap returning argument in array
if (!Array.isArray(returning)) {
returning = [returning];
}
return this._addReturningToSqlAndConvert(sql, returning, this.tableName);
}
// Compiles a `truncate` query.
truncate() {
return `truncate table ${this.tableName}`;
}
forUpdate() {
return 'for update';
}
forShare() {
// lock for share is not directly supported by oracle
// use LOCK TABLE .. IN SHARE MODE; instead
this.client.logger.warn(
'lock for share is not supported by oracle dialect'
);
return '';
}
// Compiles a `columnInfo` query.
columnInfo() {
const column = this.single.columnInfo;
// The user may have specified a custom wrapIdentifier function in the config. We
// need to run the identifiers through that function, but not otherwise format
// them as identifiers.
const table = this.client.customWrapIdentifier(this.single.table, identity);
// The Node Oracle driver doesn't support the LONG type (which is the type of data_default)
const sql = `select * from xmltable( '/ROWSET/ROW'
passing dbms_xmlgen.getXMLType('
select char_col_decl_length, column_name, data_type, data_default, nullable
from all_tab_columns where table_name = ''${table}'' ')
columns
CHAR_COL_DECL_LENGTH number, COLUMN_NAME varchar2(200), DATA_TYPE varchar2(106),
DATA_DEFAULT clob, NULLABLE varchar2(1))`;
return {
sql: sql,
output(resp) {
const out = reduce(
resp,
function (columns, val) {
columns[val.COLUMN_NAME] = {
type: val.DATA_TYPE,
defaultValue: val.DATA_DEFAULT,
maxLength: val.CHAR_COL_DECL_LENGTH,
nullable: val.NULLABLE === 'Y',
};
return columns;
},
{}
);
return (column && out[column]) || out;
},
};
}
select() {
let query = this.with();
const statements = components.map((component) => {
return this[component]();
});
query += compact(statements).join(' ');
return this._surroundQueryWithLimitAndOffset(query);
}
aggregate(stmt) {
return this._aggregate(stmt, { aliasSeparator: ' ' });
}
// for single commands only
_addReturningToSqlAndConvert(sql, returning, tableName) {
const res = {
sql,
};
if (!returning) {
return res;
}
const returningValues = Array.isArray(returning) ? returning : [returning];
const returningHelper = new ReturningHelper(returningValues.join(':'));
res.sql =
sql +
' returning ROWID into ' +
this.client.parameter(returningHelper, this.builder, this.bindingsHolder);
res.returningSql = `select ${this.formatter.columnize(
returning
)} from ${tableName} where ROWID = :1`;
res.outParams = [returningHelper];
res.returning = returning;
return res;
}
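// Illustration (hypothetical values): with limit 10 and offset 20 the method below
// conceptually wraps the query as
//   select * from (select row_.*, ROWNUM rownum_ from (<query>) row_
//   where rownum <= 30) where rownum_ > 20
// (with the boundary values bound as parameters by default), since this dialect
// predates Oracle's native OFFSET/FETCH syntax.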
_surroundQueryWithLimitAndOffset(query) {
let { limit } = this.single;
const { offset } = this.single;
const hasLimit = limit || limit === 0 || limit === '0';
limit = +limit;
if (!hasLimit && !offset) return query;
query = query || '';
if (hasLimit && !offset) {
return `select * from (${query}) where rownum <= ${this._getValueOrParameterFromAttribute(
'limit',
limit
)}`;
}
const endRow = +offset + (hasLimit ? limit : 10000000000000);
return (
'select * from ' +
'(select row_.*, ROWNUM rownum_ from (' +
query +
') row_ ' +
'where rownum <= ' +
(this.single.skipBinding['offset']
? endRow
: this.client.parameter(endRow, this.builder, this.bindingsHolder)) +
') ' +
'where rownum_ > ' +
this._getValueOrParameterFromAttribute('offset', offset)
);
}
}
}
module.exports = QueryCompiler_Oracle;

View File

@ -0,0 +1,22 @@
const Trigger = require('./trigger');
// helper function for pushAdditional in increments() and bigincrements()
function createAutoIncrementTriggerAndSequence(columnCompiler) {
const trigger = new Trigger(columnCompiler.client.version);
// TODO Add warning that sequence etc is created
columnCompiler.pushAdditional(function () {
const tableName = this.tableCompiler.tableNameRaw;
const schemaName = this.tableCompiler.schemaNameRaw;
const createTriggerSQL = trigger.createAutoIncrementTrigger(
this.client.logger,
tableName,
schemaName
);
this.pushQuery(createTriggerSQL);
});
}
module.exports = {
createAutoIncrementTriggerAndSequence,
};

View File

@ -0,0 +1,155 @@
const { NameHelper } = require('../../utils');
class Trigger {
constructor(oracleVersion) {
this.nameHelper = new NameHelper(oracleVersion);
}
renameColumnTrigger(logger, tableName, columnName, to) {
const triggerName = this.nameHelper.generateCombinedName(
logger,
'autoinc_trg',
tableName
);
const sequenceName = this.nameHelper.generateCombinedName(
logger,
'seq',
tableName
);
return (
`DECLARE ` +
`PK_NAME VARCHAR(200); ` +
`IS_AUTOINC NUMBER := 0; ` +
`BEGIN` +
` EXECUTE IMMEDIATE ('ALTER TABLE "${tableName}" RENAME COLUMN "${columnName}" TO "${to}"');` +
` SELECT COUNT(*) INTO IS_AUTOINC from "USER_TRIGGERS" where trigger_name = '${triggerName}';` +
` IF (IS_AUTOINC > 0) THEN` +
` SELECT cols.column_name INTO PK_NAME` +
` FROM all_constraints cons, all_cons_columns cols` +
` WHERE cons.constraint_type = 'P'` +
` AND cons.constraint_name = cols.constraint_name` +
` AND cons.owner = cols.owner` +
` AND cols.table_name = '${tableName}';` +
` IF ('${to}' = PK_NAME) THEN` +
` EXECUTE IMMEDIATE ('DROP TRIGGER "${triggerName}"');` +
` EXECUTE IMMEDIATE ('create or replace trigger "${triggerName}"` +
` BEFORE INSERT on "${tableName}" for each row` +
` declare` +
` checking number := 1;` +
` begin` +
` if (:new."${to}" is null) then` +
` while checking >= 1 loop` +
` select "${sequenceName}".nextval into :new."${to}" from dual;` +
` select count("${to}") into checking from "${tableName}"` +
` where "${to}" = :new."${to}";` +
` end loop;` +
` end if;` +
` end;');` +
` end if;` +
` end if;` +
`END;`
);
}
createAutoIncrementTrigger(logger, tableName, schemaName) {
const tableQuoted = `"${tableName}"`;
const tableUnquoted = tableName;
const schemaQuoted = schemaName ? `"${schemaName}".` : '';
const constraintOwner = schemaName ? `'${schemaName}'` : 'cols.owner';
const triggerName = this.nameHelper.generateCombinedName(
logger,
'autoinc_trg',
tableName
);
const sequenceNameUnquoted = this.nameHelper.generateCombinedName(
logger,
'seq',
tableName
);
const sequenceNameQuoted = `"${sequenceNameUnquoted}"`;
return (
`DECLARE ` +
`PK_NAME VARCHAR(200); ` +
`BEGIN` +
` EXECUTE IMMEDIATE ('CREATE SEQUENCE ${schemaQuoted}${sequenceNameQuoted}');` +
` SELECT cols.column_name INTO PK_NAME` + // TODO : support autoincrement on table with multiple primary keys
` FROM all_constraints cons, all_cons_columns cols` +
` WHERE cons.constraint_type = 'P'` +
` AND cons.constraint_name = cols.constraint_name` +
` AND cons.owner = ${constraintOwner}` +
` AND cols.table_name = '${tableUnquoted}';` +
` execute immediate ('create or replace trigger ${schemaQuoted}"${triggerName}"` +
` BEFORE INSERT on ${schemaQuoted}${tableQuoted}` +
` for each row` +
` declare` +
` checking number := 1;` +
` begin` +
` if (:new."' || PK_NAME || '" is null) then` +
` while checking >= 1 loop` +
` select ${schemaQuoted}${sequenceNameQuoted}.nextval into :new."' || PK_NAME || '" from dual;` +
` select count("' || PK_NAME || '") into checking from ${schemaQuoted}${tableQuoted}` +
` where "' || PK_NAME || '" = :new."' || PK_NAME || '";` +
` end loop;` +
` end if;` +
` end;'); ` +
`END;`
);
}
renameTableAndAutoIncrementTrigger(logger, tableName, to) {
const triggerName = this.nameHelper.generateCombinedName(
logger,
'autoinc_trg',
tableName
);
const sequenceName = this.nameHelper.generateCombinedName(
logger,
'seq',
tableName
);
const toTriggerName = this.nameHelper.generateCombinedName(
logger,
'autoinc_trg',
to
);
const toSequenceName = this.nameHelper.generateCombinedName(
logger,
'seq',
to
);
return (
`DECLARE ` +
`PK_NAME VARCHAR(200); ` +
`IS_AUTOINC NUMBER := 0; ` +
`BEGIN` +
` EXECUTE IMMEDIATE ('RENAME "${tableName}" TO "${to}"');` +
` SELECT COUNT(*) INTO IS_AUTOINC from "USER_TRIGGERS" where trigger_name = '${triggerName}';` +
` IF (IS_AUTOINC > 0) THEN` +
` EXECUTE IMMEDIATE ('DROP TRIGGER "${triggerName}"');` +
` EXECUTE IMMEDIATE ('RENAME "${sequenceName}" TO "${toSequenceName}"');` +
` SELECT cols.column_name INTO PK_NAME` +
` FROM all_constraints cons, all_cons_columns cols` +
` WHERE cons.constraint_type = 'P'` +
` AND cons.constraint_name = cols.constraint_name` +
` AND cons.owner = cols.owner` +
` AND cols.table_name = '${to}';` +
` EXECUTE IMMEDIATE ('create or replace trigger "${toTriggerName}"` +
` BEFORE INSERT on "${to}" for each row` +
` declare` +
` checking number := 1;` +
` begin` +
` if (:new."' || PK_NAME || '" is null) then` +
` while checking >= 1 loop` +
` select "${toSequenceName}".nextval into :new."' || PK_NAME || '" from dual;` +
` select count("' || PK_NAME || '") into checking from "${to}"` +
` where "' || PK_NAME || '" = :new."' || PK_NAME || '";` +
` end loop;` +
` end if;` +
` end;');` +
` end if;` +
`END;`
);
}
}
module.exports = Trigger;

View File

@ -0,0 +1,17 @@
const ColumnBuilder = require('../../../schema/columnbuilder');
const toArray = require('lodash/toArray');
class ColumnBuilder_Oracle extends ColumnBuilder {
constructor() {
super(...arguments);
}
// checkIn is added to the builder to allow the column compiler to change the
// order via the modifiers ("check" must come after "default")
checkIn() {
this._modifiers.checkIn = toArray(arguments);
return this;
}
}
module.exports = ColumnBuilder_Oracle;

View File

@ -0,0 +1,126 @@
const uniq = require('lodash/uniq');
const Raw = require('../../../raw');
const ColumnCompiler = require('../../../schema/columncompiler');
const {
createAutoIncrementTriggerAndSequence,
} = require('./internal/incrementUtils');
const { toNumber } = require('../../../util/helpers');
// Column Compiler
// -------
class ColumnCompiler_Oracle extends ColumnCompiler {
constructor() {
super(...arguments);
this.modifiers = ['defaultTo', 'checkIn', 'nullable', 'comment'];
}
increments(options = { primaryKey: true }) {
createAutoIncrementTriggerAndSequence(this);
return (
'integer not null' +
(this.tableCompiler._canBeAddPrimaryKey(options) ? ' primary key' : '')
);
}
bigincrements(options = { primaryKey: true }) {
createAutoIncrementTriggerAndSequence(this);
return (
'number(20, 0) not null' +
(this.tableCompiler._canBeAddPrimaryKey(options) ? ' primary key' : '')
);
}
floating(precision) {
const parsedPrecision = toNumber(precision, 0);
return `float${parsedPrecision ? `(${parsedPrecision})` : ''}`;
}
double(precision, scale) {
// if (!precision) return 'number'; // TODO: Check If default is ok
return `number(${toNumber(precision, 8)}, ${toNumber(scale, 2)})`;
}
decimal(precision, scale) {
if (precision === null) return 'decimal';
return `decimal(${toNumber(precision, 8)}, ${toNumber(scale, 2)})`;
}
integer(length) {
return length ? `number(${toNumber(length, 11)})` : 'integer';
}
enu(allowed) {
allowed = uniq(allowed);
const maxLength = (allowed || []).reduce(
(maxLength, name) => Math.max(maxLength, String(name).length),
1
);
// implicitly add the enum values as checked values
this.columnBuilder._modifiers.checkIn = [allowed];
return `varchar2(${maxLength})`;
}
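// e.g. enu(['open', 'closed']) yields varchar2(6), and the checkIn modifier adds
// check (... in ('open', 'closed'))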
datetime(without) {
return without ? 'timestamp' : 'timestamp with time zone';
}
timestamp(without) {
return without ? 'timestamp' : 'timestamp with time zone';
}
bool() {
// implicitly add the check for 0 and 1
this.columnBuilder._modifiers.checkIn = [[0, 1]];
return 'number(1, 0)';
}
varchar(length) {
return `varchar2(${toNumber(length, 255)})`;
}
// Modifiers
// ------
comment(comment) {
const columnName = this.args[0] || this.defaults('columnName');
this.pushAdditional(function () {
this.pushQuery(
`comment on column ${this.tableCompiler.tableName()}.` +
this.formatter.wrap(columnName) +
" is '" +
(comment || '') +
"'"
);
}, comment);
}
checkIn(value) {
// TODO: Maybe accept arguments also as array
// TODO: value(s) should be escaped properly
if (value === undefined) {
return '';
} else if (value instanceof Raw) {
value = value.toQuery();
} else if (Array.isArray(value)) {
value = value.map((v) => `'${v}'`).join(', ');
} else {
value = `'${value}'`;
}
return `check (${this.formatter.wrap(this.args[0])} in (${value}))`;
}
}
ColumnCompiler_Oracle.prototype.tinyint = 'smallint';
ColumnCompiler_Oracle.prototype.smallint = 'smallint';
ColumnCompiler_Oracle.prototype.mediumint = 'integer';
ColumnCompiler_Oracle.prototype.biginteger = 'number(20, 0)';
ColumnCompiler_Oracle.prototype.text = 'clob';
ColumnCompiler_Oracle.prototype.time = 'timestamp with time zone';
ColumnCompiler_Oracle.prototype.bit = 'clob';
ColumnCompiler_Oracle.prototype.json = 'clob';
module.exports = ColumnCompiler_Oracle;

View File

@ -0,0 +1,124 @@
// Oracle Schema Compiler
// -------
const SchemaCompiler = require('../../../schema/compiler');
const utils = require('../utils');
const Trigger = require('./internal/trigger');
class SchemaCompiler_Oracle extends SchemaCompiler {
constructor() {
super(...arguments);
}
// Rename a table on the schema.
renameTable(tableName, to) {
const trigger = new Trigger(this.client.version);
const renameTable = trigger.renameTableAndAutoIncrementTrigger(
this.client.logger,
tableName,
to
);
this.pushQuery(renameTable);
}
// Check whether a table exists on the query.
hasTable(tableName) {
this.pushQuery({
sql:
'select TABLE_NAME from USER_TABLES where TABLE_NAME = ' +
this.client.parameter(tableName, this.builder, this.bindingsHolder),
output(resp) {
return resp.length > 0;
},
});
}
// Check whether a column exists on the schema.
hasColumn(tableName, column) {
const sql =
`select COLUMN_NAME from ALL_TAB_COLUMNS ` +
`where TABLE_NAME = ${this.client.parameter(
tableName,
this.builder,
this.bindingsHolder
)} ` +
`and COLUMN_NAME = ${this.client.parameter(
column,
this.builder,
this.bindingsHolder
)}`;
this.pushQuery({ sql, output: (resp) => resp.length > 0 });
}
dropSequenceIfExists(sequenceName) {
const prefix = this.schema ? `"${this.schema}".` : '';
this.pushQuery(
utils.wrapSqlWithCatch(
`drop sequence ${prefix}${this.formatter.wrap(sequenceName)}`,
-2289
)
);
}
_dropRelatedSequenceIfExists(tableName) {
// removing the sequence that was possibly generated by increments() column
const nameHelper = new utils.NameHelper(this.client.version);
const sequenceName = nameHelper.generateCombinedName(
this.client.logger,
'seq',
tableName
);
this.dropSequenceIfExists(sequenceName);
}
dropTable(tableName) {
const prefix = this.schema ? `"${this.schema}".` : '';
this.pushQuery(`drop table ${prefix}${this.formatter.wrap(tableName)}`);
// removing the sequence that was possibly generated by increments() column
this._dropRelatedSequenceIfExists(tableName);
}
dropTableIfExists(tableName) {
this.dropObject(tableName, 'table');
}
dropViewIfExists(viewName) {
this.dropObject(viewName, 'view');
}
dropObject(objectName, type) {
const prefix = this.schema ? `"${this.schema}".` : '';
let errorCode = -942;
if (type === 'materialized view') {
// https://stackoverflow.com/a/1801453
errorCode = -12003;
}
this.pushQuery(
utils.wrapSqlWithCatch(
`drop ${type} ${prefix}${this.formatter.wrap(objectName)}`,
errorCode
)
);
// removing the sequence that was possibly generated by increments() column
this._dropRelatedSequenceIfExists(objectName);
}
refreshMaterializedView(viewName) {
return this.pushQuery({
sql: `BEGIN DBMS_MVIEW.REFRESH('${
this.schemaNameRaw ? this.schemaNameRaw + '.' : ''
}${viewName}'); END;`,
});
}
dropMaterializedView(viewName) {
this._dropView(viewName, false, true);
}
dropMaterializedViewIfExists(viewName) {
this.dropObject(viewName, 'materialized view');
}
}
module.exports = SchemaCompiler_Oracle;
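
Oracle has no native `drop table if exists`, so the compiler leans on `utils.wrapSqlWithCatch` to swallow only the expected error code. A minimal sketch of the output, with a hypothetical table name:

```js
const knex = require('knex')({ client: 'oracledb', version: '12.2' });

// 'accounts' is hypothetical; toSQL() compiles without connecting.
const [dropTable, dropSequence] = knex.schema.dropTableIfExists('accounts').toSQL();
console.log(dropTable.sql);
// roughly: begin execute immediate 'drop table "accounts"';
//          exception when others then if sqlcode != -942 then raise; end if; end;
console.log(dropSequence.sql);
// the same wrapper around: drop sequence "accounts_seq" (error code -2289)
```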

View File

@ -0,0 +1,197 @@
/* eslint max-len:0 */
const utils = require('../utils');
const TableCompiler = require('../../../schema/tablecompiler');
const helpers = require('../../../util/helpers');
const Trigger = require('./internal/trigger');
const { isObject } = require('../../../util/is');
// Table Compiler
// ------
class TableCompiler_Oracle extends TableCompiler {
constructor() {
super(...arguments);
}
addColumns(columns, prefix) {
if (columns.sql.length > 0) {
prefix = prefix || this.addColumnsPrefix;
const columnSql = columns.sql;
const alter = this.lowerCase ? 'alter table ' : 'ALTER TABLE ';
let sql = `${alter}${this.tableName()} ${prefix}`;
if (columns.sql.length > 1) {
sql += `(${columnSql.join(', ')})`;
} else {
sql += columnSql.join(', ');
}
this.pushQuery({
sql,
bindings: columns.bindings,
});
}
}
// Compile a rename column command.
renameColumn(from, to) {
// Remove quotes around tableName
const tableName = this.tableName().slice(1, -1);
const trigger = new Trigger(this.client.version);
return this.pushQuery(
trigger.renameColumnTrigger(this.client.logger, tableName, from, to)
);
}
compileAdd(builder) {
const table = this.formatter.wrap(builder);
const columns = this.prefixArray('add column', this.getColumns(builder));
return this.pushQuery({
sql: `alter table ${table} ${columns.join(', ')}`,
});
}
// Adds the "create" query to the query sequence.
createQuery(columns, ifNot, like) {
const columnsSql =
like && this.tableNameLike()
? ' as (select * from ' + this.tableNameLike() + ' where 0=1)'
: ' (' + columns.sql.join(', ') + this._addChecks() + ')';
const sql = `create table ${this.tableName()}${columnsSql}`;
this.pushQuery({
// catch "name is already used by an existing object" for workaround for "if not exists"
sql: ifNot ? utils.wrapSqlWithCatch(sql, -955) : sql,
bindings: columns.bindings,
});
if (this.single.comment) this.comment(this.single.comment);
if (like) {
this.addColumns(columns, this.addColumnsPrefix);
}
}
// Compiles the comment on the table.
comment(comment) {
this.pushQuery(`comment on table ${this.tableName()} is '${comment}'`);
}
dropColumn() {
const columns = helpers.normalizeArr.apply(null, arguments);
this.pushQuery(
`alter table ${this.tableName()} drop (${this.formatter.columnize(
columns
)})`
);
}
_indexCommand(type, tableName, columns) {
const nameHelper = new utils.NameHelper(this.client.version);
return this.formatter.wrap(
nameHelper.generateCombinedName(
this.client.logger,
type,
tableName,
columns
)
);
}
primary(columns, constraintName) {
let deferrable;
if (isObject(constraintName)) {
({ constraintName, deferrable } = constraintName);
}
deferrable = deferrable ? ` deferrable initially ${deferrable}` : '';
constraintName = constraintName
? this.formatter.wrap(constraintName)
: this.formatter.wrap(`${this.tableNameRaw}_pkey`);
const primaryCols = columns;
let incrementsCols = [];
if (this.grouped.columns) {
incrementsCols = this._getIncrementsColumnNames();
if (incrementsCols) {
incrementsCols.forEach((c) => {
if (!primaryCols.includes(c)) {
primaryCols.unshift(c);
}
});
}
}
this.pushQuery(
`alter table ${this.tableName()} add constraint ${constraintName} primary key (${this.formatter.columnize(
primaryCols
)})${deferrable}`
);
}
dropPrimary(constraintName) {
constraintName = constraintName
? this.formatter.wrap(constraintName)
: this.formatter.wrap(this.tableNameRaw + '_pkey');
this.pushQuery(
`alter table ${this.tableName()} drop constraint ${constraintName}`
);
}
index(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('index', this.tableNameRaw, columns);
this.pushQuery(
`create index ${indexName} on ${this.tableName()}` +
' (' +
this.formatter.columnize(columns) +
')'
);
}
dropIndex(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('index', this.tableNameRaw, columns);
this.pushQuery(`drop index ${indexName}`);
}
unique(columns, indexName) {
let deferrable;
if (isObject(indexName)) {
({ indexName, deferrable } = indexName);
}
deferrable = deferrable ? ` deferrable initially ${deferrable}` : '';
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, columns);
this.pushQuery(
`alter table ${this.tableName()} add constraint ${indexName}` +
' unique (' +
this.formatter.columnize(columns) +
')' +
deferrable
);
}
dropUnique(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, columns);
this.pushQuery(
`alter table ${this.tableName()} drop constraint ${indexName}`
);
}
dropForeign(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('foreign', this.tableNameRaw, columns);
this.pushQuery(
`alter table ${this.tableName()} drop constraint ${indexName}`
);
}
}
TableCompiler_Oracle.prototype.addColumnsPrefix = 'add ';
TableCompiler_Oracle.prototype.alterColumnsPrefix = 'modify ';
module.exports = TableCompiler_Oracle;
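
The `deferrable` option on `unique()` (and `primary()`) flows through to the constraint clause. A minimal sketch with hypothetical names; passing an explicit `indexName` sidesteps the auto-generated one:

```js
const knex = require('knex')({ client: 'oracledb', version: '12.2' });

const [q] = knex.schema
  .alterTable('users', (t) => {
    t.unique(['email'], { indexName: 'users_email_uk', deferrable: 'deferred' });
  })
  .toSQL();
console.log(q.sql);
// roughly: alter table "users" add constraint "users_email_uk"
//          unique ("email") deferrable initially deferred
```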

View File

@ -0,0 +1,106 @@
class NameHelper {
constructor(oracleVersion) {
this.oracleVersion = oracleVersion;
// In oracle versions prior to 12.2, the maximum length for a database
// object name was 30 characters. 12.2 extended this to 128.
const versionParts = oracleVersion
.split('.')
.map((versionPart) => parseInt(versionPart));
if (
versionParts[0] > 12 ||
(versionParts[0] === 12 && versionParts[1] >= 2)
) {
this.limit = 128;
} else {
this.limit = 30;
}
}
generateCombinedName(logger, postfix, name, subNames) {
const crypto = require('crypto');
if (!Array.isArray(subNames)) subNames = subNames ? [subNames] : [];
const table = name.replace(/\.|-/g, '_');
const subNamesPart = subNames.join('_');
let result = `${table}_${
subNamesPart.length ? subNamesPart + '_' : ''
}${postfix}`.toLowerCase();
if (result.length > this.limit) {
logger.warn(
`Automatically generated name "${result}" exceeds ${this.limit} character ` +
`limit for Oracle Database ${this.oracleVersion}. Using base64 encoded sha1 of that name instead.`
);
// generates the sha1 of the name and encode it with base64
result = crypto
.createHash('sha1')
.update(result)
.digest('base64')
.replace('=', '');
}
return result;
}
}
function wrapSqlWithCatch(sql, errorNumberToCatch) {
return (
`begin execute immediate '${sql.replace(/'/g, "''")}'; ` +
`exception when others then if sqlcode != ${errorNumberToCatch} then raise; ` +
`end if; ` +
`end;`
);
}
function ReturningHelper(columnName) {
this.columnName = columnName;
}
ReturningHelper.prototype.toString = function () {
return `[object ReturningHelper:${this.columnName}]`;
};
// If the error is any of these, we'll assume we need to
// mark the connection as failed
function isConnectionError(err) {
return [
'DPI-1010', // not connected
'DPI-1080', // connection was closed by ORA-%d
'ORA-03114', // not connected to ORACLE
'ORA-03113', // end-of-file on communication channel
'ORA-03135', // connection lost contact
'ORA-12514', // listener does not currently know of service requested in connect descriptor
'ORA-00022', // invalid session ID; access denied
'ORA-00028', // your session has been killed
'ORA-00031', // your session has been marked for kill
'ORA-00045', // your session has been terminated with no replay
'ORA-00378', // buffer pools cannot be created as specified
'ORA-00602', // internal programming exception
'ORA-00603', // ORACLE server session terminated by fatal error
'ORA-00609', // could not attach to incoming connection
'ORA-01012', // not logged on
'ORA-01041', // internal error. hostdef extension doesn't exist
'ORA-01043', // user side memory corruption
'ORA-01089', // immediate shutdown or close in progress
'ORA-01092', // ORACLE instance terminated. Disconnection forced
'ORA-02396', // exceeded maximum idle time, please connect again
'ORA-03122', // attempt to close ORACLE-side window on user side
'ORA-12153', // TNS: not connected
'ORA-12537', // TNS: connection closed
'ORA-12547', // TNS: lost contact
'ORA-12570', // TNS: packet reader failure
'ORA-12583', // TNS: no reader
'ORA-27146', // post/wait initialization failed
'ORA-28511', // lost RPC connection
'ORA-56600', // an illegal OCI function call was issued
'NJS-024',
'NJS-003',
].some(function (prefix) {
return err.message.indexOf(prefix) === 0;
});
}
module.exports = {
NameHelper,
isConnectionError,
wrapSqlWithCatch,
ReturningHelper,
};
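
These helpers can be poked at directly, keeping in mind that `knex/lib/...` paths are internal and may move between releases. A minimal sketch:

```js
// Internal module path; not public API.
const { NameHelper, wrapSqlWithCatch } = require('knex/lib/dialects/oracle/utils');

const names = new NameHelper('11.2'); // pre-12.2 server: 30-character limit
console.log(names.generateCombinedName(console, 'seq', 'users')); // users_seq
// A generated name over the limit triggers a warning and is replaced by a
// base64-encoded sha1 of itself.

console.log(wrapSqlWithCatch('drop table "users"', -942));
// begin execute immediate 'drop table "users"'; exception when others then
// if sqlcode != -942 then raise; end if; end;
```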

View File

@ -0,0 +1,381 @@
// Oracledb Client
// -------
const each = require('lodash/each');
const flatten = require('lodash/flatten');
const isEmpty = require('lodash/isEmpty');
const map = require('lodash/map');
const Formatter = require('../../formatter');
const QueryCompiler = require('./query/oracledb-querycompiler');
const TableCompiler = require('./schema/oracledb-tablecompiler');
const ColumnCompiler = require('./schema/oracledb-columncompiler');
const {
BlobHelper,
ReturningHelper,
monkeyPatchConnection,
} = require('./utils');
const ViewCompiler = require('./schema/oracledb-viewcompiler');
const ViewBuilder = require('./schema/oracledb-viewbuilder');
const Transaction = require('./transaction');
const Client_Oracle = require('../oracle');
const { isString } = require('../../util/is');
const { outputQuery, unwrapRaw } = require('../../formatter/wrappingFormatter');
const { compileCallback } = require('../../formatter/formatterUtils');
class Client_Oracledb extends Client_Oracle {
constructor(config) {
super(config);
if (this.version) {
// Normalize the version format; an invalid format becomes null,
// which triggers fallback to auto-detection.
this.version = parseVersion(this.version);
}
if (this.driver) {
process.env.UV_THREADPOOL_SIZE = process.env.UV_THREADPOOL_SIZE || 1;
process.env.UV_THREADPOOL_SIZE =
parseInt(process.env.UV_THREADPOOL_SIZE) + this.driver.poolMax;
}
}
_driver() {
const client = this;
const oracledb = require('oracledb');
client.fetchAsString = [];
if (this.config.fetchAsString && Array.isArray(this.config.fetchAsString)) {
this.config.fetchAsString.forEach(function (type) {
if (!isString(type)) return;
type = type.toUpperCase();
if (oracledb[type]) {
if (
type !== 'NUMBER' &&
type !== 'DATE' &&
type !== 'CLOB' &&
type !== 'BUFFER'
) {
this.logger.warn(
'Only "date", "number", "clob" and "buffer" are supported for fetchAsString'
);
}
client.fetchAsString.push(oracledb[type]);
}
});
}
return oracledb;
}
queryCompiler(builder, formatter) {
return new QueryCompiler(this, builder, formatter);
}
tableCompiler() {
return new TableCompiler(this, ...arguments);
}
columnCompiler() {
return new ColumnCompiler(this, ...arguments);
}
viewBuilder() {
return new ViewBuilder(this, ...arguments);
}
viewCompiler() {
return new ViewCompiler(this, ...arguments);
}
formatter(builder) {
return new Formatter(this, builder);
}
transaction() {
return new Transaction(this, ...arguments);
}
prepBindings(bindings) {
return map(bindings, (value) => {
if (value instanceof BlobHelper && this.driver) {
return { type: this.driver.BLOB, dir: this.driver.BIND_OUT };
// Returning helper always uses ROWID as string
} else if (value instanceof ReturningHelper && this.driver) {
return { type: this.driver.STRING, dir: this.driver.BIND_OUT };
} else if (typeof value === 'boolean') {
return value ? 1 : 0;
}
return value;
});
}
// Checks whether a value is a function... if it is, we compile it
// otherwise we check whether it's a raw
parameter(value, builder, formatter) {
if (typeof value === 'function') {
return outputQuery(
compileCallback(value, undefined, this, formatter),
true,
builder,
this
);
} else if (value instanceof BlobHelper) {
formatter.bindings.push(value.value);
return '?';
}
return unwrapRaw(value, true, builder, this, formatter) || '?';
}
// Get a raw connection, called by the `pool` whenever a new
// connection needs to be added to the pool.
acquireRawConnection() {
return new Promise((resolver, rejecter) => {
// With external authentication there is no need for a username/password;
// otherwise both must be set
const oracleDbConfig = this.connectionSettings.externalAuth
? { externalAuth: this.connectionSettings.externalAuth }
: {
user: this.connectionSettings.user,
password: this.connectionSettings.password,
};
// In the case of external authentication, the connection string will be given
oracleDbConfig.connectString = resolveConnectString(
this.connectionSettings
);
if (this.connectionSettings.prefetchRowCount) {
oracleDbConfig.prefetchRows = this.connectionSettings.prefetchRowCount;
}
if (this.connectionSettings.stmtCacheSize !== undefined) {
oracleDbConfig.stmtCacheSize = this.connectionSettings.stmtCacheSize;
}
this.driver.fetchAsString = this.fetchAsString;
this.driver.getConnection(oracleDbConfig, (err, connection) => {
if (err) {
return rejecter(err);
}
monkeyPatchConnection(connection, this);
resolver(connection);
});
});
}
// Used to explicitly close a connection, called internally by the pool
// when a connection times out or the pool is shutdown.
destroyRawConnection(connection) {
return connection.release();
}
// Handle oracle version resolution on acquiring connection from pool instead of connection creation.
// Must do this here since only the client used to create a connection would be updated with version
// information on creation. This poses a problem when a knex instance is cloned, since instances share the
// connection pool while having their own client instances.
async acquireConnection() {
const connection = await super.acquireConnection();
this.checkVersion(connection);
return connection;
}
// In Oracle, we need to check the version to dynamically determine
// certain limits. If the user did not specify a version, get it from the connection.
checkVersion(connection) {
// Already determined version before?
if (this.version) {
return this.version;
}
const detectedVersion = parseVersion(connection.oracleServerVersionString);
if (!detectedVersion) {
// If this.version is null, the user-provided version was invalid and we fell back to auto-detection.
// Otherwise, we couldn't auto-detect at all. Set the error message accordingly.
throw new Error(
this.version === null
? 'Invalid Oracledb version number format passed to knex. Unable to successfully auto-detect as fallback. Please specify a valid oracledb version.'
: 'Unable to detect Oracledb version number automatically. Please specify the version in knex configuration.'
);
}
this.version = detectedVersion;
return detectedVersion;
}
// Runs the query on the specified connection, providing the bindings
// and any other necessary prep work.
_query(connection, obj) {
if (!obj.sql) throw new Error('The query is empty');
const options = Object.assign({}, obj.options, { autoCommit: false });
if (obj.method === 'select') {
options.resultSet = true;
}
return connection
.executeAsync(obj.sql, obj.bindings, options)
.then(async function (response) {
// Flatten outBinds
let outBinds = flatten(response.outBinds);
obj.response = response.rows || [];
obj.rowsAffected = response.rows
? response.rows.rowsAffected
: response.rowsAffected;
// Added for the outBind parameter
if (obj.method === 'raw' && outBinds.length > 0) {
return {
response: outBinds,
};
}
if (obj.method === 'update') {
const modifiedRowsCount = obj.rowsAffected.length || obj.rowsAffected;
const updatedObjOutBinding = [];
const updatedOutBinds = [];
const updateOutBinds = (i) =>
function (value, index) {
const OutBindsOffset = index * modifiedRowsCount;
updatedOutBinds.push(outBinds[i + OutBindsOffset]);
};
for (let i = 0; i < modifiedRowsCount; i++) {
updatedObjOutBinding.push(obj.outBinding[0]);
each(obj.outBinding[0], updateOutBinds(i));
}
outBinds = updatedOutBinds;
obj.outBinding = updatedObjOutBinding;
}
if (!obj.returning && outBinds.length === 0) {
if (!connection.isTransaction) {
await connection.commitAsync();
}
return obj;
}
const rowIds = [];
let offset = 0;
for (let line = 0; line < obj.outBinding.length; line++) {
const ret = obj.outBinding[line];
offset =
offset +
(obj.outBinding[line - 1] ? obj.outBinding[line - 1].length : 0);
for (let index = 0; index < ret.length; index++) {
const out = ret[index];
await new Promise(function (bindResolver, bindRejecter) {
if (out instanceof BlobHelper) {
const blob = outBinds[index + offset];
if (out.returning) {
obj.response[line] = obj.response[line] || {};
obj.response[line][out.columnName] = out.value;
}
blob.on('error', function (err) {
bindRejecter(err);
});
blob.on('finish', function () {
bindResolver();
});
blob.write(out.value);
blob.end();
} else if (obj.outBinding[line][index] === 'ROWID') {
rowIds.push(outBinds[index + offset]);
bindResolver();
} else {
obj.response[line] = obj.response[line] || {};
obj.response[line][out] = outBinds[index + offset];
bindResolver();
}
});
}
}
if (obj.returningSql) {
const response = await connection.executeAsync(
obj.returningSql(),
rowIds,
{ resultSet: true }
);
obj.response = response.rows;
}
if (connection.isTransaction) {
return obj;
}
await connection.commitAsync();
return obj;
});
}
// Process the response as returned from the query.
processResponse(obj, runner) {
const { response } = obj;
if (obj.output) {
return obj.output.call(runner, response);
}
switch (obj.method) {
case 'select':
return response;
case 'first':
return response[0];
case 'pluck':
return map(response, obj.pluck);
case 'insert':
case 'del':
case 'update':
case 'counter':
if ((obj.returning && !isEmpty(obj.returning)) || obj.returningSql) {
return response;
} else if (obj.rowsAffected !== undefined) {
return obj.rowsAffected;
} else {
return 1;
}
default:
return response;
}
}
processPassedConnection(connection) {
this.checkVersion(connection);
monkeyPatchConnection(connection, this);
}
}
Client_Oracledb.prototype.driverName = 'oracledb';
function parseVersion(versionString) {
try {
// We only care about first two version components at most
const versionParts = versionString.split('.').slice(0, 2);
// Strip off any character suffixes in version number (ex. 12c => 12, 12.2c => 12.2)
versionParts.forEach((versionPart, idx) => {
versionParts[idx] = versionPart.replace(/\D$/, '');
});
const version = versionParts.join('.');
return version.match(/^\d+\.?\d*$/) ? version : null;
} catch (err) {
// Non-string versionString passed in.
return null;
}
}
function resolveConnectString(connectionSettings) {
if (connectionSettings.connectString) {
return connectionSettings.connectString;
}
if (!connectionSettings.port) {
return connectionSettings.host + '/' + connectionSettings.database;
}
return (
connectionSettings.host +
':' +
connectionSettings.port +
'/' +
connectionSettings.database
);
}
module.exports = Client_Oracledb;
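
In configuration terms, the version and connect-string handling above looks roughly like this; every connection detail below is hypothetical, and the oracledb driver must be installed:

```js
const knex = require('knex')({
  client: 'oracledb',
  version: '12.2c', // parseVersion normalizes this to '12.2'
  connection: {
    user: 'app',
    password: 'secret',
    host: 'db.example.com',
    port: 1521,
    database: 'ORCLPDB1',
    // an explicit connectString, if given, takes precedence
  },
});
// resolveConnectString turns the settings above into 'db.example.com:1521/ORCLPDB1'.
```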

View File

@ -0,0 +1,481 @@
const clone = require('lodash/clone');
const each = require('lodash/each');
const isEmpty = require('lodash/isEmpty');
const isPlainObject = require('lodash/isPlainObject');
const Oracle_Compiler = require('../../oracle/query/oracle-querycompiler');
const ReturningHelper = require('../utils').ReturningHelper;
const BlobHelper = require('../utils').BlobHelper;
const { isString } = require('../../../util/is');
const {
columnize: columnize_,
} = require('../../../formatter/wrappingFormatter');
class Oracledb_Compiler extends Oracle_Compiler {
// Compiles an "insert" query, allowing for multiple
// inserts using a single query statement.
insert() {
const self = this;
const outBindPrep = this._prepOutbindings(
this.single.insert,
this.single.returning
);
const outBinding = outBindPrep.outBinding;
const returning = outBindPrep.returning;
const insertValues = outBindPrep.values;
if (
Array.isArray(insertValues) &&
insertValues.length === 1 &&
isEmpty(insertValues[0])
) {
const returningFragment = this.single.returning
? ' (' + this.formatter.wrap(this.single.returning) + ')'
: '';
return this._addReturningToSqlAndConvert(
'insert into ' +
this.tableName +
returningFragment +
' values (default)',
outBinding[0],
this.tableName,
returning
);
}
if (
isEmpty(this.single.insert) &&
typeof this.single.insert !== 'function'
) {
return '';
}
const insertData = this._prepInsert(insertValues);
const sql = {};
if (isString(insertData)) {
return this._addReturningToSqlAndConvert(
'insert into ' + this.tableName + ' ' + insertData,
outBinding[0],
this.tableName,
returning
);
}
if (insertData.values.length === 1) {
return this._addReturningToSqlAndConvert(
'insert into ' +
this.tableName +
' (' +
this.formatter.columnize(insertData.columns) +
') values (' +
this.client.parameterize(
insertData.values[0],
undefined,
this.builder,
this.bindingsHolder
) +
')',
outBinding[0],
this.tableName,
returning
);
}
const insertDefaultsOnly = insertData.columns.length === 0;
sql.returning = returning;
sql.sql =
'begin ' +
insertData.values
.map(function (value, index) {
const parameterizedValues = !insertDefaultsOnly
? self.client.parameterize(
value,
self.client.valueForUndefined,
self.builder,
self.bindingsHolder
)
: '';
let subSql = 'insert into ' + self.tableName;
if (insertDefaultsOnly) {
// No columns given so only the default value
subSql +=
' (' +
self.formatter.wrap(self.single.returning) +
') values (default)';
} else {
subSql +=
' (' +
self.formatter.columnize(insertData.columns) +
') values (' +
parameterizedValues +
')';
}
let returningClause = '';
let intoClause = '';
// ToDo review if this code is still needed or could be dropped
// eslint-disable-next-line no-unused-vars
let usingClause = '';
let outClause = '';
each(value, function (val) {
if (!(val instanceof BlobHelper)) {
usingClause += ' ?,';
}
});
// eslint-disable-next-line no-unused-vars
usingClause = usingClause.slice(0, -1);
// Build returning and into clauses
outBinding[index].forEach(function (ret) {
const columnName = ret.columnName || ret;
returningClause += self.formatter.wrap(columnName) + ',';
intoClause += ' ?,';
outClause += ' out ?,';
// Add Helpers to bindings
if (ret instanceof BlobHelper) {
return self.formatter.bindings.push(ret);
}
self.formatter.bindings.push(new ReturningHelper(columnName));
});
// Strip last comma
returningClause = returningClause.slice(0, -1);
intoClause = intoClause.slice(0, -1);
outClause = outClause.slice(0, -1);
if (returningClause && intoClause) {
subSql += ' returning ' + returningClause + ' into' + intoClause;
}
// Pre bind position because subSql is an execute immediate parameter
// later position binding will only convert the ? params
subSql = self.formatter.client.positionBindings(subSql);
const parameterizedValuesWithoutDefaultAndBlob = parameterizedValues
.replace(/DEFAULT, /g, '')
.replace(/, DEFAULT/g, '')
.replace('EMPTY_BLOB(), ', '')
.replace(', EMPTY_BLOB()', '');
return (
"execute immediate '" +
subSql.replace(/'/g, "''") +
(parameterizedValuesWithoutDefaultAndBlob || value
? "' using "
: '') +
parameterizedValuesWithoutDefaultAndBlob +
(parameterizedValuesWithoutDefaultAndBlob && outClause ? ',' : '') +
outClause +
';'
);
})
.join(' ') +
'end;';
sql.outBinding = outBinding;
if (returning[0] === '*') {
// Generate select statement with special order by
// to keep the order because 'in (..)' may change the order
sql.returningSql = function () {
return (
'select * from ' +
self.tableName +
' where ROWID in (' +
this.outBinding
.map(function (v, i) {
return ':' + (i + 1);
})
.join(', ') +
')' +
' order by case ROWID ' +
this.outBinding
.map(function (v, i) {
return 'when CHARTOROWID(:' + (i + 1) + ') then ' + i;
})
.join(' ') +
' end'
);
};
}
return sql;
}
with() {
// WITH RECURSIVE is a syntax error in Oracle SQL.
// So mark all statements as non-recursive, generate the SQL, then restore.
// This approach ensures any changes in base class with() get propagated here.
const undoList = [];
if (this.grouped.with) {
for (const stmt of this.grouped.with) {
if (stmt.recursive) {
undoList.push(stmt);
stmt.recursive = false;
}
}
}
const result = super.with();
// Restore the recursive markings, in case this same query gets cloned and passed to other drivers.
for (const stmt of undoList) {
stmt.recursive = true;
}
return result;
}
_addReturningToSqlAndConvert(sql, outBinding, tableName, returning) {
const self = this;
const res = {
sql: sql,
};
if (!outBinding) {
return res;
}
const returningValues = Array.isArray(outBinding)
? outBinding
: [outBinding];
let returningClause = '';
let intoClause = '';
// Build returning and into clauses
returningValues.forEach(function (ret) {
const columnName = ret.columnName || ret;
returningClause += self.formatter.wrap(columnName) + ',';
intoClause += '?,';
// Add Helpers to bindings
if (ret instanceof BlobHelper) {
return self.formatter.bindings.push(ret);
}
self.formatter.bindings.push(new ReturningHelper(columnName));
});
res.sql = sql;
// Strip last comma
returningClause = returningClause.slice(0, -1);
intoClause = intoClause.slice(0, -1);
if (returningClause && intoClause) {
res.sql += ' returning ' + returningClause + ' into ' + intoClause;
}
res.outBinding = [outBinding];
if (returning[0] === '*') {
res.returningSql = function () {
return 'select * from ' + self.tableName + ' where ROWID = :1';
};
}
res.returning = returning;
return res;
}
_prepOutbindings(paramValues, paramReturning) {
const result = {};
let params = paramValues || [];
let returning = paramReturning || [];
if (!Array.isArray(params) && isPlainObject(paramValues)) {
params = [params];
}
// Always wrap returning argument in array
if (returning && !Array.isArray(returning)) {
returning = [returning];
}
const outBinding = [];
// Handle Buffer value as Blob
each(params, function (values, index) {
if (returning[0] === '*') {
outBinding[index] = ['ROWID'];
} else {
outBinding[index] = clone(returning);
}
each(values, function (value, key) {
if (value instanceof Buffer) {
values[key] = new BlobHelper(key, value);
// Delete blob duplicate in returning
const blobIndex = outBinding[index].indexOf(key);
if (blobIndex >= 0) {
outBinding[index].splice(blobIndex, 1);
values[key].returning = true;
}
outBinding[index].push(values[key]);
}
if (value === undefined) {
delete params[index][key];
}
});
});
result.returning = returning;
result.outBinding = outBinding;
result.values = params;
return result;
}
_groupOrder(item, type) {
return super._groupOrderNulls(item, type);
}
update() {
const self = this;
const sql = {};
const outBindPrep = this._prepOutbindings(
this.single.update || this.single.counter,
this.single.returning
);
const outBinding = outBindPrep.outBinding;
const returning = outBindPrep.returning;
const updates = this._prepUpdate(this.single.update);
const where = this.where();
let returningClause = '';
let intoClause = '';
if (isEmpty(updates) && typeof this.single.update !== 'function') {
return '';
}
// Build returning and into clauses
outBinding.forEach(function (out) {
out.forEach(function (ret) {
const columnName = ret.columnName || ret;
returningClause += self.formatter.wrap(columnName) + ',';
intoClause += ' ?,';
// Add Helpers to bindings
if (ret instanceof BlobHelper) {
return self.formatter.bindings.push(ret);
}
self.formatter.bindings.push(new ReturningHelper(columnName));
});
});
// Strip last comma
returningClause = returningClause.slice(0, -1);
intoClause = intoClause.slice(0, -1);
sql.outBinding = outBinding;
sql.returning = returning;
sql.sql =
'update ' +
this.tableName +
' set ' +
updates.join(', ') +
(where ? ' ' + where : '');
if (outBinding.length && !isEmpty(outBinding[0])) {
sql.sql += ' returning ' + returningClause + ' into' + intoClause;
}
if (returning[0] === '*') {
sql.returningSql = function () {
let sql = 'select * from ' + self.tableName;
const modifiedRowsCount = this.rowsAffected.length || this.rowsAffected;
let returningSqlIn = ' where ROWID in (';
let returningSqlOrderBy = ') order by case ROWID ';
// Needs a special order by clause because in(...) changes the result order
for (let i = 0; i < modifiedRowsCount; i++) {
if (this.returning[0] === '*') {
returningSqlIn += ':' + (i + 1) + ', ';
returningSqlOrderBy +=
'when CHARTOROWID(:' + (i + 1) + ') then ' + i + ' ';
}
}
if (this.returning[0] === '*') {
this.returning = this.returning.slice(0, -1);
returningSqlIn = returningSqlIn.slice(0, -2);
returningSqlOrderBy = returningSqlOrderBy.slice(0, -1);
}
return (sql += returningSqlIn + returningSqlOrderBy + ' end');
};
}
return sql;
}
_jsonPathWrap(extraction) {
return `'${extraction.path || extraction[1]}'`;
}
// Json functions
jsonExtract(params) {
return this._jsonExtract(
params.singleValue ? 'json_value' : 'json_query',
params
);
}
jsonSet(params) {
return `json_transform(${columnize_(
params.column,
this.builder,
this.client,
this.bindingsHolder
)}, set ${this.client.parameter(
params.path,
this.builder,
this.bindingsHolder
)} = ${this.client.parameter(
params.value,
this.builder,
this.bindingsHolder
)})`;
}
jsonInsert(params) {
return `json_transform(${columnize_(
params.column,
this.builder,
this.client,
this.bindingsHolder
)}, insert ${this.client.parameter(
params.path,
this.builder,
this.bindingsHolder
)} = ${this.client.parameter(
params.value,
this.builder,
this.bindingsHolder
)})`;
}
jsonRemove(params) {
const jsonCol = `json_transform(${columnize_(
params.column,
this.builder,
this.client,
this.bindingsHolder
)}, remove ${this.client.parameter(
params.path,
this.builder,
this.bindingsHolder
)})`;
return params.alias
? this.client.alias(jsonCol, this.formatter.wrap(params.alias))
: jsonCol;
}
whereJsonPath(statement) {
return this._whereJsonPath('json_value', statement);
}
whereJsonSupersetOf(statement) {
throw new Error(
'Json superset where clause not actually supported by Oracle'
);
}
whereJsonSubsetOf(statement) {
throw new Error(
'Json subset where clause not actually supported by Oracle'
);
}
onJsonPathEquals(clause) {
return this._onJsonPathEquals('json_value', clause);
}
}
module.exports = Oracledb_Compiler;
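
From the caller's side, the out-bind machinery is invisible; `returning()` just works. A minimal sketch with hypothetical names:

```js
const knex = require('knex')({ client: 'oracledb', version: '12.2' });

const compiled = knex('users').insert({ name: 'alice' }, ['id']).toSQL();
console.log(compiled.sql);
// roughly: insert into "users" ("name") values (?) returning "id" into ?
// The trailing placeholder is a ReturningHelper out-bind filled by the driver.
// With returning('*'), ROWIDs are collected instead and a follow-up
// select ... where ROWID in (...) is ordered to match the insert order.
```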

View File

@ -0,0 +1,61 @@
const ColumnCompiler_Oracle = require('../../oracle/schema/oracle-columncompiler');
const { isObject } = require('../../../util/is');
class ColumnCompiler_Oracledb extends ColumnCompiler_Oracle {
constructor() {
super(...arguments);
this.modifiers = ['defaultTo', 'nullable', 'comment', 'checkJson'];
this._addCheckModifiers();
}
datetime(withoutTz) {
let useTz;
if (isObject(withoutTz)) {
({ useTz } = withoutTz);
} else {
useTz = !withoutTz;
}
return useTz ? 'timestamp with local time zone' : 'timestamp';
}
timestamp(withoutTz) {
let useTz;
if (isObject(withoutTz)) {
({ useTz } = withoutTz);
} else {
useTz = !withoutTz;
}
return useTz ? 'timestamp with local time zone' : 'timestamp';
}
checkRegex(regex, constraintName) {
return this._check(
`REGEXP_LIKE(${this.formatter.wrap(
this.getColumnName()
)},${this.client._escapeBinding(regex)})`,
constraintName
);
}
json() {
// implicitly add the check for json
this.columnBuilder._modifiers.checkJson = [
this.formatter.columnize(this.getColumnName()),
];
return 'varchar2(4000)';
}
jsonb() {
return this.json();
}
checkJson(column) {
return `check (${column} is json)`;
}
}
ColumnCompiler_Oracledb.prototype.time = 'timestamp with local time zone';
ColumnCompiler_Oracledb.prototype.uuid = ({ useBinaryUuid = false } = {}) =>
useBinaryUuid ? 'raw(16)' : 'char(36)';
module.exports = ColumnCompiler_Oracledb;
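
Timezone handling and the implicit JSON check can be seen by compiling a table offline. A minimal sketch with hypothetical names:

```js
const knex = require('knex')({ client: 'oracledb', version: '12.2' });

const [q] = knex.schema
  .createTable('events', (t) => {
    t.datetime('created_at'); // timestamp with local time zone
    t.datetime('created_local', { useTz: false }); // timestamp
    t.json('payload'); // varchar2(4000) check ("payload" is json)
  })
  .toSQL();
console.log(q.sql);
```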

View File

@ -0,0 +1,19 @@
const TableCompiler_Oracle = require('../../oracle/schema/oracle-tablecompiler');
class TableCompiler_Oracledb extends TableCompiler_Oracle {
constructor(client, tableBuilder) {
super(client, tableBuilder);
}
_setNullableState(column, isNullable) {
const nullability = isNullable ? 'NULL' : 'NOT NULL';
const sql = `alter table ${this.tableName()} modify (${this.formatter.wrap(
column
)} ${nullability})`;
return this.pushQuery({
sql: sql,
});
}
}
module.exports = TableCompiler_Oracledb;
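
This backs `setNullable()` and `dropNullable()` on the table builder. A minimal sketch with hypothetical names:

```js
const knex = require('knex')({ client: 'oracledb', version: '12.2' });

const [q] = knex.schema
  .alterTable('users', (t) => {
    t.setNullable('nickname');
  })
  .toSQL();
console.log(q.sql); // roughly: alter table "users" modify ("nickname" NULL)
```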

View File

@ -0,0 +1,13 @@
const ViewBuilder = require('../../../schema/viewbuilder.js');
class ViewBuilder_Oracledb extends ViewBuilder {
constructor() {
super(...arguments);
}
checkOption() {
this._single.checkOption = 'default_option';
}
}
module.exports = ViewBuilder_Oracledb;

View File

@ -0,0 +1,19 @@
/* eslint max-len: 0 */
const ViewCompiler = require('../../../schema/viewcompiler.js');
class ViewCompiler_Oracledb extends ViewCompiler {
constructor(client, viewCompiler) {
super(client, viewCompiler);
}
createOrReplace() {
this.createQuery(this.columns, this.selectQuery, false, true);
}
createMaterializedView() {
this.createQuery(this.columns, this.selectQuery, true);
}
}
module.exports = ViewCompiler_Oracledb;

View File

@ -0,0 +1,98 @@
const Transaction = require('../../execution/transaction');
const { timeout, KnexTimeoutError } = require('../../util/timeout');
const debugTx = require('debug')('knex:tx');
// There's also a "read only", but that's not really an "isolationLevel"
const supportedIsolationLevels = ['read committed', 'serializable'];
// Remove this if you make it work and set it to true
const isIsolationLevelEnabled = false;
module.exports = class Oracle_Transaction extends Transaction {
// disable autocommit to allow correct behavior (default is true)
begin(conn) {
if (this.isolationLevel) {
if (isIsolationLevelEnabled) {
if (!supportedIsolationLevels.includes(this.isolationLevel)) {
this.client.logger.warn(
'Oracle only supports read committed and serializable transactions, ignoring the isolation level param'
);
} else {
// I tried this, but it didn't work
// Doc here: https://docs.oracle.com/en/database/oracle/oracle-database/19/sqlrf/SET-TRANSACTION.html
return this.query(conn, `SET TRANSACTION ${this.isolationLevel}`);
}
} else {
this.client.logger.warn(
'Transaction isolation is not currently supported for Oracle'
);
}
}
return Promise.resolve();
}
async commit(conn, value) {
this._completed = true;
try {
await conn.commitAsync();
this._resolver(value);
} catch (err) {
this._rejecter(err);
}
}
release(conn, value) {
return this._resolver(value);
}
rollback(conn, err) {
this._completed = true;
debugTx('%s: rolling back', this.txid);
return timeout(conn.rollbackAsync(), 5000)
.catch((e) => {
if (!(e instanceof KnexTimeoutError)) {
return Promise.reject(e);
}
this._rejecter(e);
})
.then(() => {
if (err === undefined) {
if (this.doNotRejectOnRollback) {
this._resolver();
return;
}
err = new Error(`Transaction rejected with non-error: ${err}`);
}
this._rejecter(err);
});
}
savepoint(conn) {
return this.query(conn, `SAVEPOINT ${this.txid}`);
}
async acquireConnection(config, cb) {
const configConnection = config && config.connection;
const connection =
configConnection || (await this.client.acquireConnection());
try {
connection.__knexTxId = this.txid;
connection.isTransaction = true;
return await cb(connection);
} finally {
debugTx('%s: releasing connection', this.txid);
connection.isTransaction = false;
try {
await connection.commitAsync();
} catch (err) {
this._rejecter(err);
} finally {
if (!configConnection) {
await this.client.releaseConnection(connection);
} else {
debugTx('%s: not releasing external connection', this.txid);
}
}
}
}
};
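
Callers can pass an isolation level, but as the code above notes, the Oracle dialect currently only warns and proceeds. A minimal sketch, assuming a reachable database and a hypothetical `accounts` table:

```js
async function transfer(knex) {
  await knex.transaction(
    async (trx) => {
      await trx('accounts').where({ id: 1 }).decrement('balance', 100);
      await trx('accounts').where({ id: 2 }).increment('balance', 100);
    },
    // Currently just logs: 'Transaction isolation is not currently supported for Oracle'
    { isolationLevel: 'read committed' }
  );
}
```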

View File

@ -0,0 +1,208 @@
const Utils = require('../oracle/utils');
const { promisify } = require('util');
const stream = require('stream');
function BlobHelper(columnName, value) {
this.columnName = columnName;
this.value = value;
this.returning = false;
}
BlobHelper.prototype.toString = function () {
return '[object BlobHelper:' + this.columnName + ']';
};
/**
* @param stream
* @param {'string' | 'buffer'} type
*/
function readStream(stream, type) {
return new Promise((resolve, reject) => {
let data = type === 'string' ? '' : Buffer.alloc(0);
stream.on('error', function (err) {
reject(err);
});
stream.on('data', function (chunk) {
if (type === 'string') {
data += chunk;
} else {
data = Buffer.concat([data, chunk]);
}
});
stream.on('end', function () {
resolve(data);
});
});
}
const lobProcessing = function (stream) {
const oracledb = require('oracledb');
/**
* @type 'string' | 'buffer'
*/
let type;
if (stream.type) {
// v1.2-v4
if (stream.type === oracledb.BLOB) {
type = 'buffer';
} else if (stream.type === oracledb.CLOB) {
type = 'string';
}
} else if (stream.iLob) {
// v1
if (stream.iLob.type === oracledb.CLOB) {
type = 'string';
} else if (stream.iLob.type === oracledb.BLOB) {
type = 'buffer';
}
} else {
throw new Error('Unrecognized oracledb lob stream type');
}
if (type === 'string') {
stream.setEncoding('utf-8');
}
return readStream(stream, type);
};
function monkeyPatchConnection(connection, client) {
// Connection is already monkey-patched
if (connection.executeAsync) {
return;
}
connection.commitAsync = function () {
return new Promise((commitResolve, commitReject) => {
this.commit(function (err) {
if (err) {
return commitReject(err);
}
commitResolve();
});
});
};
connection.rollbackAsync = function () {
return new Promise((rollbackResolve, rollbackReject) => {
this.rollback(function (err) {
if (err) {
return rollbackReject(err);
}
rollbackResolve();
});
});
};
const fetchAsync = promisify(function (sql, bindParams, options, cb) {
options = options || {};
options.outFormat = client.driver.OUT_FORMAT_OBJECT || client.driver.OBJECT;
if (!options.outFormat) {
throw new Error('not found oracledb.outFormat constants');
}
if (options.resultSet) {
connection.execute(
sql,
bindParams || [],
options,
function (err, result) {
if (err) {
if (Utils.isConnectionError(err)) {
connection.close().catch(function (err) {});
connection.__knex__disposed = err;
}
return cb(err);
}
const fetchResult = { rows: [], resultSet: result.resultSet };
const numRows = 100;
const fetchRowsFromRS = function (connection, resultSet, numRows) {
resultSet.getRows(numRows, function (err, rows) {
if (err) {
if (Utils.isConnectionError(err)) {
connection.close().catch(function (err) {});
connection.__knex__disposed = err;
}
resultSet.close(function () {
return cb(err);
});
} else if (rows.length === 0) {
return cb(null, fetchResult);
} else if (rows.length > 0) {
if (rows.length === numRows) {
fetchResult.rows = fetchResult.rows.concat(rows);
fetchRowsFromRS(connection, resultSet, numRows);
} else {
fetchResult.rows = fetchResult.rows.concat(rows);
return cb(null, fetchResult);
}
}
});
};
fetchRowsFromRS(connection, result.resultSet, numRows);
}
);
} else {
connection.execute(
sql,
bindParams || [],
options,
function (err, result) {
if (err) {
// dispose the connection on connection error
if (Utils.isConnectionError(err)) {
connection.close().catch(function (err) {});
connection.__knex__disposed = err;
}
return cb(err);
}
return cb(null, result);
}
);
}
});
connection.executeAsync = function (sql, bindParams, options) {
// Read all LOBs before resolving
return fetchAsync(sql, bindParams, options).then(async (results) => {
const closeResultSet = () => {
return results.resultSet
? promisify(results.resultSet.close).call(results.resultSet)
: Promise.resolve();
};
// Collect LOBs to read
const lobs = [];
if (results.rows) {
if (Array.isArray(results.rows)) {
for (let i = 0; i < results.rows.length; i++) {
// Iterate through the rows
const row = results.rows[i];
for (const column in row) {
if (row[column] instanceof stream.Readable) {
lobs.push({ index: i, key: column, stream: row[column] });
}
}
}
}
}
try {
for (const lob of lobs) {
// TODO: should be a fetchAsString/fetchAsBuffer polyfill only
results.rows[lob.index][lob.key] = await lobProcessing(lob.stream);
}
} catch (e) {
await closeResultSet().catch(() => {});
throw e;
}
await closeResultSet();
return results;
});
};
}
Utils.BlobHelper = BlobHelper;
Utils.monkeyPatchConnection = monkeyPatchConnection;
module.exports = Utils;
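
A hedged sketch of what the patched connection looks like from the outside; the connection details are hypothetical, and this mirrors internal usage rather than public API:

```js
const oracledb = require('oracledb');
// Internal module path; not public API.
const { monkeyPatchConnection } = require('knex/lib/dialects/oracledb/utils');

async function demo(knexClient) {
  const conn = await oracledb.getConnection({
    user: 'app',
    password: 'secret',
    connectString: 'db.example.com:1521/ORCLPDB1',
  });
  monkeyPatchConnection(conn, knexClient); // adds executeAsync/commitAsync/rollbackAsync
  const result = await conn.executeAsync('select 1 as n from dual');
  await conn.commitAsync();
  await conn.close();
  return result.rows; // any LOB columns were already read into strings/buffers
}
```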

View File

@ -0,0 +1,60 @@
// PostgreSQL Native Driver (pg-native)
// -------
const Client_PG = require('../postgres');
class Client_PgNative extends Client_PG {
constructor(...args) {
super(...args);
this.driverName = 'pgnative';
this.canCancelQuery = true;
}
_driver() {
return require('pg').native;
}
_stream(connection, obj, stream, options) {
if (!obj.sql) throw new Error('The query is empty');
const client = this;
return new Promise((resolver, rejecter) => {
stream.on('error', rejecter);
stream.on('end', resolver);
return client
._query(connection, obj)
.then((obj) => obj.response)
.then(({ rows }) => rows.forEach((row) => stream.write(row)))
.catch(function (err) {
stream.emit('error', err);
})
.then(function () {
stream.end();
});
});
}
async cancelQuery(connectionToKill) {
try {
return await this._wrappedCancelQueryCall(null, connectionToKill);
} catch (err) {
this.logger.warn(`Connection Error: ${err}`);
throw err;
}
}
_wrappedCancelQueryCall(emptyConnection, connectionToKill) {
return new Promise(function (resolve, reject) {
connectionToKill.native.cancel(function (err) {
if (err) {
reject(err);
return;
}
resolve(true);
});
});
}
}
module.exports = Client_PgNative;
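
Configuration is the same as for the regular `pg` client, just with a different `client` name; the connection details below are hypothetical, and both the `pg` and `pg-native` packages must be installed:

```js
const knex = require('knex')({
  client: 'pgnative',
  connection: {
    host: 'db.example.com',
    user: 'app',
    password: 'secret',
    database: 'appdb',
  },
});
```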

View File

@ -0,0 +1,19 @@
const Transaction = require('../../../execution/transaction');
class Transaction_PG extends Transaction {
begin(conn) {
const trxMode = [
this.isolationLevel ? `ISOLATION LEVEL ${this.isolationLevel}` : '',
this.readOnly ? 'READ ONLY' : '',
]
.join(' ')
.trim();
if (trxMode.length === 0) {
return this.query(conn, 'BEGIN;');
}
return this.query(conn, `BEGIN TRANSACTION ${trxMode};`);
}
}
module.exports = Transaction_PG;
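
A minimal sketch of how the transaction mode options surface in the BEGIN statement; the `orders` table is hypothetical:

```js
async function report(knex) {
  await knex.transaction(
    async (trx) => {
      await trx('orders').count();
    },
    { isolationLevel: 'serializable', readOnly: true }
  );
  // begin() issues: BEGIN TRANSACTION ISOLATION LEVEL serializable READ ONLY;
}
```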

View File

@ -0,0 +1,361 @@
// PostgreSQL
// -------
const extend = require('lodash/extend');
const map = require('lodash/map');
const { promisify } = require('util');
const Client = require('../../client');
const Transaction = require('./execution/pg-transaction');
const QueryCompiler = require('./query/pg-querycompiler');
const QueryBuilder = require('./query/pg-querybuilder');
const ColumnCompiler = require('./schema/pg-columncompiler');
const TableCompiler = require('./schema/pg-tablecompiler');
const ViewCompiler = require('./schema/pg-viewcompiler');
const ViewBuilder = require('./schema/pg-viewbuilder');
const SchemaCompiler = require('./schema/pg-compiler');
const { makeEscape } = require('../../util/string');
const { isString } = require('../../util/is');
class Client_PG extends Client {
constructor(config) {
super(config);
if (config.returning) {
this.defaultReturning = config.returning;
}
if (config.searchPath) {
this.searchPath = config.searchPath;
}
}
transaction() {
return new Transaction(this, ...arguments);
}
queryBuilder() {
return new QueryBuilder(this);
}
queryCompiler(builder, formatter) {
return new QueryCompiler(this, builder, formatter);
}
columnCompiler() {
return new ColumnCompiler(this, ...arguments);
}
schemaCompiler() {
return new SchemaCompiler(this, ...arguments);
}
tableCompiler() {
return new TableCompiler(this, ...arguments);
}
viewCompiler() {
return new ViewCompiler(this, ...arguments);
}
viewBuilder() {
return new ViewBuilder(this, ...arguments);
}
_driver() {
return require('pg');
}
wrapIdentifierImpl(value) {
if (value === '*') return value;
let arrayAccessor = '';
const arrayAccessorMatch = value.match(/(.*?)(\[[0-9]+\])/);
if (arrayAccessorMatch) {
value = arrayAccessorMatch[1];
arrayAccessor = arrayAccessorMatch[2];
}
return `"${value.replace(/"/g, '""')}"${arrayAccessor}`;
}
_acquireOnlyConnection() {
const connection = new this.driver.Client(this.connectionSettings);
connection.on('error', (err) => {
connection.__knex__disposed = err;
});
connection.on('end', (err) => {
connection.__knex__disposed = err || 'Connection ended unexpectedly';
});
return connection.connect().then(() => connection);
}
// Get a raw connection, called by the `pool` whenever a new
// connection needs to be added to the pool.
acquireRawConnection() {
const client = this;
return this._acquireOnlyConnection()
.then(function (connection) {
if (!client.version) {
return client.checkVersion(connection).then(function (version) {
client.version = version;
return connection;
});
}
return connection;
})
.then(async function setSearchPath(connection) {
await client.setSchemaSearchPath(connection);
return connection;
});
}
// Used to explicitly close a connection, called internally by the pool
// when a connection times out or the pool is shutdown.
async destroyRawConnection(connection) {
const end = promisify((cb) => connection.end(cb));
return end();
}
// In PostgreSQL, we need to do a version check to do some feature
// checking on the database.
checkVersion(connection) {
return new Promise((resolve, reject) => {
connection.query('select version();', (err, resp) => {
if (err) return reject(err);
resolve(this._parseVersion(resp.rows[0].version));
});
});
}
_parseVersion(versionString) {
return /^PostgreSQL (.*?)( |$)/.exec(versionString)[1];
}
// Position the bindings for the query. The escape sequence for question mark
// is \? (e.g. knex.raw("\\?") since javascript requires '\' to be escaped too...)
positionBindings(sql) {
let questionCount = 0;
return sql.replace(/(\\*)(\?)/g, function (match, escapes) {
if (escapes.length % 2) {
return '?';
} else {
questionCount++;
return `$${questionCount}`;
}
});
}
setSchemaSearchPath(connection, searchPath) {
let path = searchPath || this.searchPath;
if (!path) return Promise.resolve(true);
if (!Array.isArray(path) && !isString(path)) {
throw new TypeError(
`knex: Expected searchPath to be Array/String, got: ${typeof path}`
);
}
if (isString(path)) {
if (path.includes(',')) {
const parts = path.split(',');
const arraySyntax = `[${parts
.map((searchPath) => `'${searchPath}'`)
.join(', ')}]`;
this.logger.warn(
`Detected comma in searchPath "${path}".` +
`If you are trying to specify multiple schemas, use Array syntax: ${arraySyntax}`
);
}
path = [path];
}
path = path.map((schemaName) => `"${schemaName}"`).join(',');
return new Promise(function (resolver, rejecter) {
connection.query(`set search_path to ${path}`, function (err) {
if (err) return rejecter(err);
resolver(true);
});
});
}
_stream(connection, obj, stream, options) {
if (!obj.sql) throw new Error('The query is empty');
const PGQueryStream = process.browser
? undefined
: require('pg-query-stream');
const sql = obj.sql;
return new Promise(function (resolver, rejecter) {
const queryStream = connection.query(
new PGQueryStream(sql, obj.bindings, options),
(err) => {
rejecter(err);
}
);
queryStream.on('error', function (error) {
rejecter(error);
stream.emit('error', error);
});
// 'end' IS propagated by .pipe, by default
stream.on('end', resolver);
queryStream.pipe(stream);
});
}
// Runs the query on the specified connection, providing the bindings
// and any other necessary prep work.
_query(connection, obj) {
if (!obj.sql) throw new Error('The query is empty');
let queryConfig = {
text: obj.sql,
values: obj.bindings || [],
};
if (obj.options) {
queryConfig = extend(queryConfig, obj.options);
}
return new Promise(function (resolver, rejecter) {
connection.query(queryConfig, function (err, response) {
if (err) return rejecter(err);
obj.response = response;
resolver(obj);
});
});
}
// Ensures the response is returned in the same format as other clients.
processResponse(obj, runner) {
const resp = obj.response;
if (obj.output) return obj.output.call(runner, resp);
if (obj.method === 'raw') return resp;
const { returning } = obj;
if (resp.command === 'SELECT') {
if (obj.method === 'first') return resp.rows[0];
if (obj.method === 'pluck') return map(resp.rows, obj.pluck);
return resp.rows;
}
if (returning) {
const returns = [];
for (let i = 0, l = resp.rows.length; i < l; i++) {
const row = resp.rows[i];
returns[i] = row;
}
return returns;
}
if (resp.command === 'UPDATE' || resp.command === 'DELETE') {
return resp.rowCount;
}
return resp;
}
async cancelQuery(connectionToKill) {
const conn = await this.acquireRawConnection();
try {
return await this._wrappedCancelQueryCall(conn, connectionToKill);
} finally {
await this.destroyRawConnection(conn).catch((err) => {
this.logger.warn(`Connection Error: ${err}`);
});
}
}
_wrappedCancelQueryCall(conn, connectionToKill) {
return this._query(conn, {
sql: 'SELECT pg_cancel_backend($1);',
bindings: [connectionToKill.processID],
options: {},
});
}
toPathForJson(jsonPath) {
const PG_PATH_REGEX = /^{.*}$/;
if (jsonPath.match(PG_PATH_REGEX)) {
return jsonPath;
}
return (
'{' +
jsonPath
.replace(/^(\$\.)/, '') // remove the first dollar
.replace('.', ',')
.replace(/\[([0-9]+)]/, ',$1') + // transform [number] to ,number
'}'
);
}
}
Object.assign(Client_PG.prototype, {
dialect: 'postgresql',
driverName: 'pg',
canCancelQuery: true,
_escapeBinding: makeEscape({
escapeArray(val, esc) {
return esc(arrayString(val, esc));
},
escapeString(str) {
let hasBackslash = false;
let escaped = "'";
for (let i = 0; i < str.length; i++) {
const c = str[i];
if (c === "'") {
escaped += c + c;
} else if (c === '\\') {
escaped += c + c;
hasBackslash = true;
} else {
escaped += c;
}
}
escaped += "'";
if (hasBackslash === true) {
escaped = 'E' + escaped;
}
return escaped;
},
escapeObject(val, prepareValue, timezone, seen = []) {
if (val && typeof val.toPostgres === 'function') {
seen = seen || [];
if (seen.indexOf(val) !== -1) {
throw new Error(
`circular reference detected while preparing "${val}" for query`
);
}
seen.push(val);
return prepareValue(val.toPostgres(prepareValue), seen);
}
return JSON.stringify(val);
},
}),
});
function arrayString(arr, esc) {
let result = '{';
for (let i = 0; i < arr.length; i++) {
if (i > 0) result += ',';
const val = arr[i];
if (val === null || typeof val === 'undefined') {
result += 'NULL';
} else if (Array.isArray(val)) {
result += arrayString(val, esc);
} else if (typeof val === 'number') {
result += val;
} else {
result += JSON.stringify(typeof val === 'string' ? val : esc(val));
}
}
return result + '}';
}
module.exports = Client_PG;
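
Both `positionBindings` and `toPathForJson` are pure string transforms, so they can be sanity-checked without a connection. A minimal sketch:

```js
const knex = require('knex')({ client: 'pg' });

console.log(knex.client.positionBindings('select * from t where a = ? and b = ?'));
// select * from t where a = $1 and b = $2

// A backslash-escaped placeholder survives as a literal question mark:
console.log(knex.client.positionBindings('select * from t where a = \\?'));
// select * from t where a = ?

console.log(knex.client.toPathForJson('$.address[0]')); // {address,0}
```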

View File

@ -0,0 +1,43 @@
const QueryBuilder = require('../../../query/querybuilder.js');
module.exports = class QueryBuilder_PostgreSQL extends QueryBuilder {
updateFrom(name) {
this._single.updateFrom = name;
return this;
}
using(tables) {
this._single.using = tables;
return this;
}
withMaterialized(alias, statementOrColumnList, nothingOrStatement) {
this._validateWithArgs(
alias,
statementOrColumnList,
nothingOrStatement,
'with'
);
return this.withWrapped(
alias,
statementOrColumnList,
nothingOrStatement,
true
);
}
withNotMaterialized(alias, statementOrColumnList, nothingOrStatement) {
this._validateWithArgs(
alias,
statementOrColumnList,
nothingOrStatement,
'with'
);
return this.withWrapped(
alias,
statementOrColumnList,
nothingOrStatement,
false
);
}
};
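
A minimal sketch of the materialized-CTE methods; the CTE name is hypothetical:

```js
const knex = require('knex')({ client: 'pg' });

const q = knex
  .queryBuilder()
  .withMaterialized('ids', knex.raw('select 1 as id'))
  .select('*')
  .from('ids')
  .toSQL();
console.log(q.sql);
// roughly: with "ids" as materialized (select 1 as id) select * from "ids"
```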

View File

@ -0,0 +1,400 @@
// PostgreSQL Query Builder & Compiler
// ------
const identity = require('lodash/identity');
const reduce = require('lodash/reduce');
const QueryCompiler = require('../../../query/querycompiler');
const {
wrapString,
columnize: columnize_,
operator: operator_,
wrap: wrap_,
} = require('../../../formatter/wrappingFormatter');
class QueryCompiler_PG extends QueryCompiler {
constructor(client, builder, formatter) {
super(client, builder, formatter);
this._defaultInsertValue = 'default';
}
// Compiles a truncate query.
truncate() {
return `truncate ${this.tableName} restart identity`;
}
// _defaultInsertValue is used if an array with multiple empty values is supplied
// Compiles an `insert` query, allowing for multiple
// inserts using a single query statement.
insert() {
let sql = super.insert();
if (sql === '') return sql;
const { returning, onConflict, ignore, merge, insert } = this.single;
if (onConflict && ignore) sql += this._ignore(onConflict);
if (onConflict && merge) {
sql += this._merge(merge.updates, onConflict, insert);
const wheres = this.where();
if (wheres) sql += ` ${wheres}`;
}
if (returning) sql += this._returning(returning);
return {
sql,
returning,
};
}
// Compiles an `update` query, allowing for a return value.
update() {
const withSQL = this.with();
const updateData = this._prepUpdate(this.single.update);
const wheres = this.where();
const { returning, updateFrom } = this.single;
return {
sql:
withSQL +
`update ${this.single.only ? 'only ' : ''}${this.tableName} ` +
`set ${updateData.join(', ')}` +
this._updateFrom(updateFrom) +
(wheres ? ` ${wheres}` : '') +
this._returning(returning),
returning,
};
}
using() {
const usingTables = this.single.using;
if (!usingTables) return;
let sql = 'using ';
if (Array.isArray(usingTables)) {
sql += usingTables
.map((table) => {
return this.formatter.wrap(table);
})
.join(',');
} else {
sql += this.formatter.wrap(usingTables);
}
return sql;
}
// Compiles a `delete` query, allowing for a return value.
del() {
// Make sure tableName is processed by the formatter first.
const { tableName } = this;
const withSQL = this.with();
let wheres = this.where() || '';
let using = this.using() || '';
const joins = this.grouped.join;
const tableJoins = [];
if (Array.isArray(joins)) {
for (const join of joins) {
tableJoins.push(
wrap_(
this._joinTable(join),
undefined,
this.builder,
this.client,
this.bindingsHolder
)
);
const joinWheres = [];
for (const clause of join.clauses) {
joinWheres.push(
this.whereBasic({
column: clause.column,
operator: '=',
value: clause.value,
asColumn: true,
})
);
}
if (joinWheres.length > 0) {
wheres += (wheres ? ' and ' : 'where ') + joinWheres.join(' and ');
}
}
if (tableJoins.length > 0) {
using += (using ? ',' : 'using ') + tableJoins.join(',');
}
}
// With 'using' syntax, no tablename between DELETE and FROM.
const sql =
withSQL +
`delete from ${this.single.only ? 'only ' : ''}${tableName}` +
(using ? ` ${using}` : '') +
(wheres ? ` ${wheres}` : '');
const { returning } = this.single;
return {
sql: sql + this._returning(returning),
returning,
};
}
aggregate(stmt) {
return this._aggregate(stmt, { distinctParentheses: true });
}
_returning(value) {
return value ? ` returning ${this.formatter.columnize(value)}` : '';
}
_updateFrom(name) {
return name ? ` from ${this.formatter.wrap(name)}` : '';
}
_ignore(columns) {
if (columns === true) {
return ' on conflict do nothing';
}
return ` on conflict ${this._onConflictClause(columns)} do nothing`;
}
_merge(updates, columns, insert) {
let sql = ` on conflict ${this._onConflictClause(columns)} do update set `;
if (updates && Array.isArray(updates)) {
sql += updates
.map((column) =>
wrapString(
column.split('.').pop(),
this.formatter.builder,
this.client,
this.formatter
)
)
.map((column) => `${column} = excluded.${column}`)
.join(', ');
return sql;
} else if (updates && typeof updates === 'object') {
const updateData = this._prepUpdate(updates);
if (typeof updateData === 'string') {
sql += updateData;
} else {
sql += updateData.join(',');
}
return sql;
} else {
const insertData = this._prepInsert(insert);
if (typeof insertData === 'string') {
throw new Error(
'If using merge with a raw insert query, then updates must be provided'
);
}
sql += insertData.columns
.map((column) =>
wrapString(column.split('.').pop(), this.builder, this.client)
)
.map((column) => `${column} = excluded.${column}`)
.join(', ');
return sql;
}
}
// Join array of table names and apply default schema.
_tableNames(tables) {
const schemaName = this.single.schema;
const sql = [];
for (let i = 0; i < tables.length; i++) {
let tableName = tables[i];
if (tableName) {
if (schemaName) {
tableName = `${schemaName}.${tableName}`;
}
sql.push(this.formatter.wrap(tableName));
}
}
return sql.join(', ');
}
_lockingClause(lockMode) {
const tables = this.single.lockTables || [];
return lockMode + (tables.length ? ' of ' + this._tableNames(tables) : '');
}
_groupOrder(item, type) {
return super._groupOrderNulls(item, type);
}
forUpdate() {
return this._lockingClause('for update');
}
forShare() {
return this._lockingClause('for share');
}
forNoKeyUpdate() {
return this._lockingClause('for no key update');
}
forKeyShare() {
return this._lockingClause('for key share');
}
skipLocked() {
return 'skip locked';
}
noWait() {
return 'nowait';
}
// Compiles a columnInfo query
columnInfo() {
const column = this.single.columnInfo;
let schema = this.single.schema;
// The user may have specified a custom wrapIdentifier function in the config. We
// need to run the identifiers through that function, but not format them as
// identifiers otherwise.
const table = this.client.customWrapIdentifier(this.single.table, identity);
if (schema) {
schema = this.client.customWrapIdentifier(schema, identity);
}
const sql =
'select * from information_schema.columns where table_name = ? and table_catalog = current_database()';
const bindings = [table];
return this._buildColumnInfoQuery(schema, sql, bindings, column);
}
_buildColumnInfoQuery(schema, sql, bindings, column) {
if (schema) {
sql += ' and table_schema = ?';
bindings.push(schema);
} else {
sql += ' and table_schema = current_schema()';
}
return {
sql,
bindings,
output(resp) {
const out = reduce(
resp.rows,
function (columns, val) {
columns[val.column_name] = {
type: val.data_type,
maxLength: val.character_maximum_length,
nullable: val.is_nullable === 'YES',
defaultValue: val.column_default,
};
return columns;
},
{}
);
return (column && out[column]) || out;
},
};
}
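// Illustrative output shape (assumed example values):
//   { id: { type: 'integer', maxLength: null, nullable: false, defaultValue: null }, ... }
// or, when a single column was requested, just that column's entry.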
distinctOn(value) {
return 'distinct on (' + this.formatter.columnize(value) + ') ';
}
// Json functions
jsonExtract(params) {
return this._jsonExtract('jsonb_path_query', params);
}
jsonSet(params) {
return this._jsonSet(
'jsonb_set',
Object.assign({}, params, {
path: this.client.toPathForJson(params.path),
})
);
}
jsonInsert(params) {
return this._jsonSet(
'jsonb_insert',
Object.assign({}, params, {
path: this.client.toPathForJson(params.path),
})
);
}
jsonRemove(params) {
const jsonCol = `${columnize_(
params.column,
this.builder,
this.client,
this.bindingsHolder
)} #- ${this.client.parameter(
this.client.toPathForJson(params.path),
this.builder,
this.bindingsHolder
)}`;
return params.alias
? this.client.alias(jsonCol, this.formatter.wrap(params.alias))
: jsonCol;
}
whereJsonPath(statement) {
let castValue = '';
if (!isNaN(statement.value) && parseInt(statement.value)) {
castValue = '::int';
} else if (!isNaN(statement.value) && parseFloat(statement.value)) {
castValue = '::float';
} else {
castValue = " #>> '{}'";
}
return `jsonb_path_query_first(${this._columnClause(
statement
)}, ${this.client.parameter(
statement.jsonPath,
this.builder,
this.bindingsHolder
)})${castValue} ${operator_(
statement.operator,
this.builder,
this.client,
this.bindingsHolder
)} ${this._jsonValueClause(statement)}`;
}
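// Illustrative sketch (assumed call shape): whereJsonPath('doc', '$.age', '>', 18)
// compiles roughly to
//   jsonb_path_query_first("doc", ?)::int > ?
// The cast is inferred from the comparison value: integers get ::int, floats
// get ::float, and anything else is unwrapped to text via #>> '{}'.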
whereJsonSupersetOf(statement) {
return this._not(
statement,
`${wrap_(
statement.column,
undefined,
this.builder,
this.client,
this.bindingsHolder
)} @> ${this._jsonValueClause(statement)}`
);
}
whereJsonSubsetOf(statement) {
return this._not(
statement,
`${columnize_(
statement.column,
this.builder,
this.client,
this.bindingsHolder
)} <@ ${this._jsonValueClause(statement)}`
);
}
onJsonPathEquals(clause) {
return this._onJsonPathEquals('jsonb_path_query_first', clause);
}
}
module.exports = QueryCompiler_PG;

View File

@ -0,0 +1,156 @@
// PostgreSQL Column Compiler
// -------
const ColumnCompiler = require('../../../schema/columncompiler');
const { isObject } = require('../../../util/is');
const { toNumber } = require('../../../util/helpers');
const commentEscapeRegex = /(?<!')'(?!')/g;
class ColumnCompiler_PG extends ColumnCompiler {
constructor(client, tableCompiler, columnBuilder) {
super(client, tableCompiler, columnBuilder);
this.modifiers = ['nullable', 'defaultTo', 'comment'];
this._addCheckModifiers();
}
// Types
// ------
bit(column) {
return column.length !== false ? `bit(${column.length})` : 'bit';
}
// Create the column definition for an enum type.
// Using method "2" here: http://stackoverflow.com/a/10984951/525714
enu(allowed, options) {
options = options || {};
const values =
options.useNative && options.existingType
? undefined
: allowed.join("', '");
if (options.useNative) {
let enumName = '';
const schemaName = options.schemaName || this.tableCompiler.schemaNameRaw;
if (schemaName) {
enumName += `"${schemaName}".`;
}
enumName += `"${options.enumName}"`;
if (!options.existingType) {
this.tableCompiler.unshiftQuery(
`create type ${enumName} as enum ('${values}')`
);
}
return enumName;
}
return `text check (${this.formatter.wrap(this.args[0])} in ('${values}'))`;
}
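// Illustrative sketch (names are examples only):
//   table.enu('mood', ['happy', 'sad'], { useNative: true, enumName: 'mood_type' })
// unshifts `create type "mood_type" as enum ('happy', 'sad')` ahead of the
// create-table statement and uses that type as the column type; without
// useNative the column falls back to text with a check constraint.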
decimal(precision, scale) {
if (precision === null) return 'decimal';
return `decimal(${toNumber(precision, 8)}, ${toNumber(scale, 2)})`;
}
json(jsonb) {
if (jsonb) this.client.logger.deprecate('json(true)', 'jsonb()');
return jsonColumn(this.client, jsonb);
}
jsonb() {
return jsonColumn(this.client, true);
}
checkRegex(regex, constraintName) {
return this._check(
`${this.formatter.wrap(
this.getColumnName()
)} ~ ${this.client._escapeBinding(regex)}`,
constraintName
);
}
datetime(withoutTz = false, precision) {
let useTz;
if (isObject(withoutTz)) {
({ useTz, precision } = withoutTz);
} else {
useTz = !withoutTz;
}
useTz = typeof useTz === 'boolean' ? useTz : true;
precision =
precision !== undefined && precision !== null
? '(' + precision + ')'
: '';
return `${useTz ? 'timestamptz' : 'timestamp'}${precision}`;
}
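// Illustrative examples (assumed shapes): datetime(false, 6) and
// datetime({ useTz: true, precision: 6 }) both produce 'timestamptz(6)',
// while datetime(true) produces 'timestamp'; `withoutTz` may be a boolean
// or an options object, as destructured above.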
timestamp(withoutTz = false, precision) {
return this.datetime(withoutTz, precision);
}
// Modifiers:
// ------
comment(comment) {
const columnName = this.args[0] || this.defaults('columnName');
const escapedComment = comment
? `'${comment.replace(commentEscapeRegex, "''")}'`
: 'NULL';
this.pushAdditional(function () {
this.pushQuery(
`comment on column ${this.tableCompiler.tableName()}.` +
this.formatter.wrap(columnName) +
` is ${escapedComment}`
);
}, comment);
}
increments(options = { primaryKey: true }) {
return (
'serial' +
(this.tableCompiler._canBeAddPrimaryKey(options) ? ' primary key' : '')
);
}
bigincrements(options = { primaryKey: true }) {
return (
'bigserial' +
(this.tableCompiler._canBeAddPrimaryKey(options) ? ' primary key' : '')
);
}
uuid(options = { primaryKey: false }) {
return (
'uuid' +
(this.tableCompiler._canBeAddPrimaryKey(options) ? ' primary key' : '')
);
}
}
ColumnCompiler_PG.prototype.bigint = 'bigint';
ColumnCompiler_PG.prototype.binary = 'bytea';
ColumnCompiler_PG.prototype.bool = 'boolean';
ColumnCompiler_PG.prototype.double = 'double precision';
ColumnCompiler_PG.prototype.floating = 'real';
ColumnCompiler_PG.prototype.smallint = 'smallint';
ColumnCompiler_PG.prototype.tinyint = 'smallint';
function jsonColumn(client, jsonb) {
if (
!client.version ||
client.config.client === 'cockroachdb' ||
client.config.jsonbSupport === true ||
parseFloat(client.version) >= 9.2
) {
return jsonb ? 'jsonb' : 'json';
}
return 'text';
}
module.exports = ColumnCompiler_PG;

View File

@ -0,0 +1,138 @@
// PostgreSQL Schema Compiler
// -------
const SchemaCompiler = require('../../../schema/compiler');
class SchemaCompiler_PG extends SchemaCompiler {
constructor(client, builder) {
super(client, builder);
}
// Check whether the current table exists.
hasTable(tableName) {
let sql = 'select * from information_schema.tables where table_name = ?';
const bindings = [tableName];
if (this.schema) {
sql += ' and table_schema = ?';
bindings.push(this.schema);
} else {
sql += ' and table_schema = current_schema()';
}
this.pushQuery({
sql,
bindings,
output(resp) {
return resp.rows.length > 0;
},
});
}
// Compile the query to determine if a column exists in a table.
hasColumn(tableName, columnName) {
let sql =
'select * from information_schema.columns where table_name = ? and column_name = ?';
const bindings = [tableName, columnName];
if (this.schema) {
sql += ' and table_schema = ?';
bindings.push(this.schema);
} else {
sql += ' and table_schema = current_schema()';
}
this.pushQuery({
sql,
bindings,
output(resp) {
return resp.rows.length > 0;
},
});
}
qualifiedTableName(tableName) {
const name = this.schema ? `${this.schema}.${tableName}` : tableName;
return this.formatter.wrap(name);
}
// Compile a rename table command.
renameTable(from, to) {
this.pushQuery(
`alter table ${this.qualifiedTableName(
from
)} rename to ${this.formatter.wrap(to)}`
);
}
createSchema(schemaName) {
this.pushQuery(`create schema ${this.formatter.wrap(schemaName)}`);
}
createSchemaIfNotExists(schemaName) {
this.pushQuery(
`create schema if not exists ${this.formatter.wrap(schemaName)}`
);
}
dropSchema(schemaName, cascade = false) {
this.pushQuery(
`drop schema ${this.formatter.wrap(schemaName)}${
cascade ? ' cascade' : ''
}`
);
}
dropSchemaIfExists(schemaName, cascade = false) {
this.pushQuery(
`drop schema if exists ${this.formatter.wrap(schemaName)}${
cascade ? ' cascade' : ''
}`
);
}
dropExtension(extensionName) {
this.pushQuery(`drop extension ${this.formatter.wrap(extensionName)}`);
}
dropExtensionIfExists(extensionName) {
this.pushQuery(
`drop extension if exists ${this.formatter.wrap(extensionName)}`
);
}
createExtension(extensionName) {
this.pushQuery(`create extension ${this.formatter.wrap(extensionName)}`);
}
createExtensionIfNotExists(extensionName) {
this.pushQuery(
`create extension if not exists ${this.formatter.wrap(extensionName)}`
);
}
renameView(from, to) {
this.pushQuery(
this.alterViewPrefix +
`${this.formatter.wrap(from)} rename to ${this.formatter.wrap(to)}`
);
}
refreshMaterializedView(viewName, concurrently = false) {
this.pushQuery({
sql: `refresh materialized view${
concurrently ? ' concurrently' : ''
} ${this.formatter.wrap(viewName)}`,
});
}
dropMaterializedView(viewName) {
this._dropView(viewName, false, true);
}
dropMaterializedViewIfExists(viewName) {
this._dropView(viewName, true, true);
}
}
module.exports = SchemaCompiler_PG;

View File

@ -0,0 +1,304 @@
/* eslint max-len: 0 */
// PostgreSQL Table Builder & Compiler
// -------
const has = require('lodash/has');
const TableCompiler = require('../../../schema/tablecompiler');
const { isObject, isString } = require('../../../util/is');
class TableCompiler_PG extends TableCompiler {
constructor(client, tableBuilder) {
super(client, tableBuilder);
}
// Compile a rename column command.
renameColumn(from, to) {
return this.pushQuery({
sql: `alter table ${this.tableName()} rename ${this.formatter.wrap(
from
)} to ${this.formatter.wrap(to)}`,
});
}
_setNullableState(column, isNullable) {
const constraintAction = isNullable ? 'drop not null' : 'set not null';
const sql = `alter table ${this.tableName()} alter column ${this.formatter.wrap(
column
)} ${constraintAction}`;
return this.pushQuery({
sql: sql,
});
}
compileAdd(builder) {
const table = this.formatter.wrap(builder);
const columns = this.prefixArray('add column', this.getColumns(builder));
return this.pushQuery({
sql: `alter table ${table} ${columns.join(', ')}`,
});
}
// Adds the "create" query to the query sequence.
createQuery(columns, ifNot, like) {
const createStatement = ifNot
? 'create table if not exists '
: 'create table ';
const columnsSql = ` (${columns.sql.join(', ')}${
this.primaryKeys() || ''
}${this._addChecks()})`;
let sql =
createStatement +
this.tableName() +
(like && this.tableNameLike()
? ' (like ' +
this.tableNameLike() +
' including all' +
(columns.sql.length ? ', ' + columns.sql.join(', ') : '') +
')'
: columnsSql);
if (this.single.inherits)
sql += ` inherits (${this.formatter.wrap(this.single.inherits)})`;
this.pushQuery({
sql,
bindings: columns.bindings,
});
const hasComment = has(this.single, 'comment');
if (hasComment) this.comment(this.single.comment);
}
primaryKeys() {
const pks = (this.grouped.alterTable || []).filter(
(k) => k.method === 'primary'
);
if (pks.length > 0 && pks[0].args.length > 0) {
const columns = pks[0].args[0];
let constraintName = pks[0].args[1] || '';
let deferrable;
if (isObject(constraintName)) {
({ constraintName, deferrable } = constraintName);
}
deferrable = deferrable ? ` deferrable initially ${deferrable}` : '';
constraintName = constraintName
? this.formatter.wrap(constraintName)
: this.formatter.wrap(`${this.tableNameRaw}_pkey`);
return `, constraint ${constraintName} primary key (${this.formatter.columnize(
columns
)})${deferrable}`;
}
}
addColumns(columns, prefix, colCompilers) {
if (prefix === this.alterColumnsPrefix) {
// alter columns
for (const col of colCompilers) {
this._addColumn(col);
}
} else {
// base class implementation for normal add
super.addColumns(columns, prefix);
}
}
_addColumn(col) {
const quotedTableName = this.tableName();
const type = col.getColumnType();
// We'd prefer to call this.formatter.wrapAsIdentifier here instead, however the context passed to
// `this` instance is not that of the column, but of the table. Thus, we unfortunately have to call
// `wrapIdentifier` here as well (it is already called once on the initial column operation) to give
// our `alter` operation the correct `queryContext`. Refer to issue #2606 and PR #2612.
const colName = this.client.wrapIdentifier(
col.getColumnName(),
col.columnBuilder.queryContext()
);
// To alter enum columns they must be cast to text first
const isEnum = col.type === 'enu';
this.pushQuery({
sql: `alter table ${quotedTableName} alter column ${colName} drop default`,
bindings: [],
});
const alterNullable = col.columnBuilder.alterNullable;
if (alterNullable) {
this.pushQuery({
sql: `alter table ${quotedTableName} alter column ${colName} drop not null`,
bindings: [],
});
}
const alterType = col.columnBuilder.alterType;
if (alterType) {
this.pushQuery({
sql: `alter table ${quotedTableName} alter column ${colName} type ${type} using (${colName}${
isEnum ? '::text::' : '::'
}${type})`,
bindings: [],
});
}
const defaultTo = col.modified['defaultTo'];
if (defaultTo) {
const modifier = col.defaultTo.apply(col, defaultTo);
this.pushQuery({
sql: `alter table ${quotedTableName} alter column ${colName} set ${modifier}`,
bindings: [],
});
}
if (alterNullable) {
const nullable = col.modified['nullable'];
if (nullable && nullable[0] === false) {
this.pushQuery({
sql: `alter table ${quotedTableName} alter column ${colName} set not null`,
bindings: [],
});
}
}
}
// Compiles the comment on the table.
comment(comment) {
this.pushQuery(
`comment on table ${this.tableName()} is '${this.single.comment}'`
);
}
// Indexes:
// -------
primary(columns, constraintName) {
let deferrable;
if (isObject(constraintName)) {
({ constraintName, deferrable } = constraintName);
}
deferrable = deferrable ? ` deferrable initially ${deferrable}` : '';
constraintName = constraintName
? this.formatter.wrap(constraintName)
: this.formatter.wrap(`${this.tableNameRaw}_pkey`);
if (this.method !== 'create' && this.method !== 'createIfNot') {
this.pushQuery(
`alter table ${this.tableName()} add constraint ${constraintName} primary key (${this.formatter.columnize(
columns
)})${deferrable}`
);
}
}
unique(columns, indexName) {
let deferrable;
let useConstraint = true;
let predicate;
if (isObject(indexName)) {
({ indexName, deferrable, useConstraint, predicate } = indexName);
if (useConstraint === undefined) {
useConstraint = !!deferrable || !predicate;
}
}
if (!useConstraint && deferrable && deferrable !== 'not deferrable') {
throw new Error('postgres cannot create deferrable index');
}
if (useConstraint && predicate) {
throw new Error('postgres cannot create constraint with predicate');
}
deferrable = deferrable ? ` deferrable initially ${deferrable}` : '';
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, columns);
if (useConstraint) {
this.pushQuery(
`alter table ${this.tableName()} add constraint ${indexName}` +
' unique (' +
this.formatter.columnize(columns) +
')' +
deferrable
);
} else {
const predicateQuery = predicate
? ' ' + this.client.queryCompiler(predicate).where()
: '';
this.pushQuery(
`create unique index ${indexName} on ${this.tableName()} (${this.formatter.columnize(
columns
)})${predicateQuery}`
);
}
}
index(columns, indexName, options) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('index', this.tableNameRaw, columns);
let predicate;
let storageEngineIndexType;
let indexType;
if (isString(options)) {
storageEngineIndexType = options;
} else if (isObject(options)) {
({ indexType, storageEngineIndexType, predicate } = options);
}
const predicateQuery = predicate
? ' ' + this.client.queryCompiler(predicate).where()
: '';
this.pushQuery(
`create${
typeof indexType === 'string' && indexType.toLowerCase() === 'unique'
? ' unique'
: ''
} index ${indexName} on ${this.tableName()}${
(storageEngineIndexType && ` using ${storageEngineIndexType}`) || ''
}` +
' (' +
this.formatter.columnize(columns) +
')' +
`${predicateQuery}`
);
}
dropPrimary(constraintName) {
constraintName = constraintName
? this.formatter.wrap(constraintName)
: this.formatter.wrap(this.tableNameRaw + '_pkey');
this.pushQuery(
`alter table ${this.tableName()} drop constraint ${constraintName}`
);
}
dropIndex(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('index', this.tableNameRaw, columns);
indexName = this.schemaNameRaw
? `${this.formatter.wrap(this.schemaNameRaw)}.${indexName}`
: indexName;
this.pushQuery(`drop index ${indexName}`);
}
dropUnique(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, columns);
this.pushQuery(
`alter table ${this.tableName()} drop constraint ${indexName}`
);
}
dropForeign(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('foreign', this.tableNameRaw, columns);
this.pushQuery(
`alter table ${this.tableName()} drop constraint ${indexName}`
);
}
}
module.exports = TableCompiler_PG;

View File

@ -0,0 +1,21 @@
const ViewBuilder = require('../../../schema/viewbuilder.js');
class ViewBuilder_PG extends ViewBuilder {
constructor() {
super(...arguments);
}
checkOption() {
this._single.checkOption = 'default_option';
}
localCheckOption() {
this._single.checkOption = 'local';
}
cascadedCheckOption() {
this._single.checkOption = 'cascaded';
}
}
module.exports = ViewBuilder_PG;

View File

@ -0,0 +1,35 @@
/* eslint max-len: 0 */
const ViewCompiler = require('../../../schema/viewcompiler.js');
class ViewCompiler_PG extends ViewCompiler {
constructor(client, viewCompiler) {
super(client, viewCompiler);
}
renameColumn(from, to) {
return this.pushQuery({
sql: `alter view ${this.viewName()} rename ${this.formatter.wrap(
from
)} to ${this.formatter.wrap(to)}`,
});
}
defaultTo(column, defaultValue) {
return this.pushQuery({
sql: `alter view ${this.viewName()} alter ${this.formatter.wrap(
column
)} set default ${defaultValue}`,
});
}
createOrReplace() {
this.createQuery(this.columns, this.selectQuery, false, true);
}
createMaterializedView() {
this.createQuery(this.columns, this.selectQuery, true);
}
}
module.exports = ViewCompiler_PG;

View File

@ -0,0 +1,86 @@
// Redshift
// -------
const Client_PG = require('../postgres');
const map = require('lodash/map');
const Transaction = require('./transaction');
const QueryCompiler = require('./query/redshift-querycompiler');
const ColumnBuilder = require('./schema/redshift-columnbuilder');
const ColumnCompiler = require('./schema/redshift-columncompiler');
const TableCompiler = require('./schema/redshift-tablecompiler');
const SchemaCompiler = require('./schema/redshift-compiler');
const ViewCompiler = require('./schema/redshift-viewcompiler');
class Client_Redshift extends Client_PG {
transaction() {
return new Transaction(this, ...arguments);
}
queryCompiler(builder, formatter) {
return new QueryCompiler(this, builder, formatter);
}
columnBuilder() {
return new ColumnBuilder(this, ...arguments);
}
columnCompiler() {
return new ColumnCompiler(this, ...arguments);
}
tableCompiler() {
return new TableCompiler(this, ...arguments);
}
schemaCompiler() {
return new SchemaCompiler(this, ...arguments);
}
viewCompiler() {
return new ViewCompiler(this, ...arguments);
}
_driver() {
return require('pg');
}
// Ensures the response is returned in the same format as other clients.
processResponse(obj, runner) {
const resp = obj.response;
if (obj.output) return obj.output.call(runner, resp);
if (obj.method === 'raw') return resp;
if (resp.command === 'SELECT') {
if (obj.method === 'first') return resp.rows[0];
if (obj.method === 'pluck') return map(resp.rows, obj.pluck);
return resp.rows;
}
if (
resp.command === 'INSERT' ||
resp.command === 'UPDATE' ||
resp.command === 'DELETE'
) {
return resp.rowCount;
}
return resp;
}
toPathForJson(jsonPath, builder, bindingsHolder) {
return jsonPath
.replace(/^(\$\.)/, '') // remove the first dollar
.split('.')
.map(
function (v) {
return this.parameter(v, builder, bindingsHolder);
}.bind(this)
)
.join(', ');
}
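// Illustrative sketch (assumed shape): toPathForJson('$.a.b', ...) strips the
// leading '$.' and binds each segment separately, yielding '?, ?' with
// bindings ['a', 'b'] to match json_extract_path_text's variadic path arguments.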
}
Object.assign(Client_Redshift.prototype, {
dialect: 'redshift',
driverName: 'pg-redshift',
});
module.exports = Client_Redshift;

View File

@ -0,0 +1,163 @@
// Redshift Query Builder & Compiler
// ------
const QueryCompiler = require('../../../query/querycompiler');
const QueryCompiler_PG = require('../../postgres/query/pg-querycompiler');
const identity = require('lodash/identity');
const {
columnize: columnize_,
} = require('../../../formatter/wrappingFormatter');
class QueryCompiler_Redshift extends QueryCompiler_PG {
truncate() {
return `truncate ${this.tableName.toLowerCase()}`;
}
// Compiles an `insert` query, allowing for multiple
// inserts using a single query statement.
insert() {
const sql = QueryCompiler.prototype.insert.apply(this, arguments);
if (sql === '') return sql;
this._slightReturn();
return {
sql,
};
}
// Compiles an `update` query, warning on unsupported returning
update() {
const sql = QueryCompiler.prototype.update.apply(this, arguments);
this._slightReturn();
return {
sql,
};
}
// Compiles a `delete` query, warning on unsupported returning
del() {
const sql = QueryCompiler.prototype.del.apply(this, arguments);
this._slightReturn();
return {
sql,
};
}
// If the query tries to use returning, warn that it is unsupported
_slightReturn() {
if (this.single.isReturning) {
this.client.logger.warn(
'insert/update/delete returning is not supported by redshift dialect'
);
}
}
forUpdate() {
this.client.logger.warn('table lock is not supported by redshift dialect');
return '';
}
forShare() {
this.client.logger.warn(
'lock for share is not supported by redshift dialect'
);
return '';
}
forNoKeyUpdate() {
this.client.logger.warn('table lock is not supported by redshift dialect');
return '';
}
forKeyShare() {
this.client.logger.warn(
'lock for share is not supported by redshift dialect'
);
return '';
}
// Compiles a columnInfo query
columnInfo() {
const column = this.single.columnInfo;
let schema = this.single.schema;
// The user may have specified a custom wrapIdentifier function in the config. We
// need to run the identifiers through that function, but not format them as
// identifiers otherwise.
const table = this.client.customWrapIdentifier(this.single.table, identity);
if (schema) {
schema = this.client.customWrapIdentifier(schema, identity);
}
const sql =
'select * from information_schema.columns where table_name = ? and table_catalog = ?';
const bindings = [
table.toLowerCase(),
this.client.database().toLowerCase(),
];
return this._buildColumnInfoQuery(schema, sql, bindings, column);
}
jsonExtract(params) {
let extractions;
if (Array.isArray(params.column)) {
extractions = params.column;
} else {
extractions = [params];
}
return extractions
.map((extraction) => {
const jsonCol = `json_extract_path_text(${columnize_(
extraction.column || extraction[0],
this.builder,
this.client,
this.bindingsHolder
)}, ${this.client.toPathForJson(
params.path || extraction[1],
this.builder,
this.bindingsHolder
)})`;
const alias = extraction.alias || extraction[2];
return alias
? this.client.alias(jsonCol, this.formatter.wrap(alias))
: jsonCol;
})
.join(', ');
}
jsonSet(params) {
throw new Error('Json set is not supported by Redshift');
}
jsonInsert(params) {
throw new Error('Json insert is not supported by Redshift');
}
jsonRemove(params) {
throw new Error('Json remove is not supported by Redshift');
}
whereJsonPath(statement) {
return this._whereJsonPath(
'json_extract_path_text',
Object.assign({}, statement, {
path: this.client.toPathForJson(statement.path),
})
);
}
whereJsonSupersetOf(statement) {
throw new Error('Json superset is not supported by Redshift');
}
whereJsonSubsetOf(statement) {
throw new Error('Json subset is not supported by Redshift');
}
onJsonPathEquals(clause) {
return this._onJsonPathEquals('json_extract_path_text', clause);
}
}
module.exports = QueryCompiler_Redshift;

View File

@ -0,0 +1,22 @@
const ColumnBuilder = require('../../../schema/columnbuilder');
class ColumnBuilder_Redshift extends ColumnBuilder {
constructor() {
super(...arguments);
}
// primary needs to set not null on newly added columns, or fail
primary() {
this.notNullable();
return super.primary(...arguments);
}
index() {
this.client.logger.warn(
'Redshift does not support the creation of indexes.'
);
return this;
}
}
module.exports = ColumnBuilder_Redshift;

View File

@ -0,0 +1,67 @@
// Redshift Column Compiler
// -------
const ColumnCompiler_PG = require('../../postgres/schema/pg-columncompiler');
const ColumnCompiler = require('../../../schema/columncompiler');
class ColumnCompiler_Redshift extends ColumnCompiler_PG {
constructor() {
super(...arguments);
}
// Types:
// ------
bit(column) {
return column.length !== false ? `char(${column.length})` : 'char(1)';
}
datetime(without) {
return without ? 'timestamp' : 'timestamptz';
}
timestamp(without) {
return without ? 'timestamp' : 'timestamptz';
}
// Modifiers:
// ------
comment(comment) {
this.pushAdditional(function () {
this.pushQuery(
`comment on column ${this.tableCompiler.tableName()}.` +
this.formatter.wrap(this.args[0]) +
' is ' +
(comment ? `'${comment}'` : 'NULL')
);
}, comment);
}
}
ColumnCompiler_Redshift.prototype.increments = ({ primaryKey = true } = {}) =>
'integer identity(1,1)' + (primaryKey ? ' primary key' : '') + ' not null';
ColumnCompiler_Redshift.prototype.bigincrements = ({
primaryKey = true,
} = {}) =>
'bigint identity(1,1)' + (primaryKey ? ' primary key' : '') + ' not null';
ColumnCompiler_Redshift.prototype.binary = 'varchar(max)';
ColumnCompiler_Redshift.prototype.blob = 'varchar(max)';
ColumnCompiler_Redshift.prototype.enu = 'varchar(255)';
ColumnCompiler_Redshift.prototype.enum = 'varchar(255)';
ColumnCompiler_Redshift.prototype.json = 'varchar(max)';
ColumnCompiler_Redshift.prototype.jsonb = 'varchar(max)';
ColumnCompiler_Redshift.prototype.longblob = 'varchar(max)';
ColumnCompiler_Redshift.prototype.mediumblob = 'varchar(16777218)';
ColumnCompiler_Redshift.prototype.set = 'text';
ColumnCompiler_Redshift.prototype.text = 'varchar(max)';
ColumnCompiler_Redshift.prototype.tinyblob = 'varchar(256)';
ColumnCompiler_Redshift.prototype.uuid = ColumnCompiler.prototype.uuid;
ColumnCompiler_Redshift.prototype.varbinary = 'varchar(max)';
ColumnCompiler_Redshift.prototype.bigint = 'bigint';
ColumnCompiler_Redshift.prototype.bool = 'boolean';
ColumnCompiler_Redshift.prototype.double = 'double precision';
ColumnCompiler_Redshift.prototype.floating = 'real';
ColumnCompiler_Redshift.prototype.smallint = 'smallint';
ColumnCompiler_Redshift.prototype.tinyint = 'smallint';
module.exports = ColumnCompiler_Redshift;

View File

@ -0,0 +1,14 @@
/* eslint max-len: 0 */
// Redshift Schema Compiler
// -------
const SchemaCompiler_PG = require('../../postgres/schema/pg-compiler');
class SchemaCompiler_Redshift extends SchemaCompiler_PG {
constructor() {
super(...arguments);
}
}
module.exports = SchemaCompiler_Redshift;

View File

@ -0,0 +1,122 @@
/* eslint max-len: 0 */
// Redshift Table Builder & Compiler
// -------
const has = require('lodash/has');
const TableCompiler_PG = require('../../postgres/schema/pg-tablecompiler');
class TableCompiler_Redshift extends TableCompiler_PG {
constructor() {
super(...arguments);
}
index(columns, indexName, options) {
this.client.logger.warn(
'Redshift does not support the creation of indexes.'
);
}
dropIndex(columns, indexName) {
this.client.logger.warn(
'Redshift does not support the deletion of indexes.'
);
}
// TODO: have to disable setting not null on columns that already exist...
// Adds the "create" query to the query sequence.
createQuery(columns, ifNot, like) {
const createStatement = ifNot
? 'create table if not exists '
: 'create table ';
const columnsSql = ' (' + columns.sql.join(', ') + this._addChecks() + ')';
let sql =
createStatement +
this.tableName() +
(like && this.tableNameLike()
? ' (like ' + this.tableNameLike() + ')'
: columnsSql);
if (this.single.inherits)
sql += ` like (${this.formatter.wrap(this.single.inherits)})`;
this.pushQuery({
sql,
bindings: columns.bindings,
});
const hasComment = has(this.single, 'comment');
if (hasComment) this.comment(this.single.comment);
if (like) {
this.addColumns(columns, this.addColumnsPrefix);
}
}
primary(columns, constraintName) {
const self = this;
constraintName = constraintName
? self.formatter.wrap(constraintName)
: self.formatter.wrap(`${this.tableNameRaw}_pkey`);
if (columns.constructor !== Array) {
columns = [columns];
}
const thiscolumns = self.grouped.columns;
if (thiscolumns) {
for (let i = 0; i < columns.length; i++) {
let exists = thiscolumns.find(
(tcb) =>
tcb.grouping === 'columns' &&
tcb.builder &&
tcb.builder._method === 'add' &&
tcb.builder._args &&
tcb.builder._args.indexOf(columns[i]) > -1
);
if (exists) {
exists = exists.builder;
}
const nullable = !(
exists &&
exists._modifiers &&
exists._modifiers['nullable'] &&
exists._modifiers['nullable'][0] === false
);
if (nullable) {
if (exists) {
return this.client.logger.warn(
'Redshift does not allow primary keys to contain nullable columns.'
);
} else {
return this.client.logger.warn(
'Redshift does not allow primary keys to contain nonexistent columns.'
);
}
}
}
}
return self.pushQuery(
`alter table ${self.tableName()} add constraint ${constraintName} primary key (${self.formatter.columnize(
columns
)})`
);
}
// Compiles column add. Redshift can only add one column per ALTER TABLE, so core addColumns doesn't work. #2545
addColumns(columns, prefix, colCompilers) {
if (prefix === this.alterColumnsPrefix) {
super.addColumns(columns, prefix, colCompilers);
} else {
prefix = prefix || this.addColumnsPrefix;
colCompilers = colCompilers || this.getColumns();
for (const col of colCompilers) {
const quotedTableName = this.tableName();
const colCompiled = col.compileColumn();
this.pushQuery({
sql: `alter table ${quotedTableName} ${prefix}${colCompiled}`,
bindings: [],
});
}
}
}
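// Illustrative sketch (assumed shape): adding two columns in one alterTable,
// e.g. t.string('a') and t.string('b'), is emitted here as two statements:
//   alter table "t" add column "a" varchar(255)
//   alter table "t" add column "b" varchar(255)
// since Redshift accepts only one added column per ALTER TABLE.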
}
module.exports = TableCompiler_Redshift;

View File

@ -0,0 +1,11 @@
/* eslint max-len: 0 */
const ViewCompiler_PG = require('../../postgres/schema/pg-viewcompiler.js');
class ViewCompiler_Redshift extends ViewCompiler_PG {
constructor(client, viewCompiler) {
super(client, viewCompiler);
}
}
module.exports = ViewCompiler_Redshift;

View File

@ -0,0 +1,32 @@
const Transaction = require('../../execution/transaction');
module.exports = class Redshift_Transaction extends Transaction {
begin(conn) {
const trxMode = [
this.isolationLevel ? `ISOLATION LEVEL ${this.isolationLevel}` : '',
this.readOnly ? 'READ ONLY' : '',
]
.join(' ')
.trim();
if (trxMode.length === 0) {
return this.query(conn, 'BEGIN;');
}
return this.query(conn, `BEGIN ${trxMode};`);
}
savepoint(conn) {
this.trxClient.logger('Redshift does not support savepoints.');
return Promise.resolve();
}
release(conn, value) {
this.trxClient.logger('Redshift does not support savepoints.');
return Promise.resolve();
}
rollbackTo(conn, error) {
this.trxClient.logger('Redshift does not support savepoints.');
return Promise.resolve();
}
};

View File

@ -0,0 +1,25 @@
const Transaction = require('../../../execution/transaction');
class Transaction_Sqlite extends Transaction {
begin(conn) {
// SQLite doesn't really support isolation levels: it is serializable by
// default, so we override this to ignore the isolation level.
// There is `PRAGMA read_uncommitted = true;`, but that's probably not
// what the user wants
if (this.isolationLevel) {
this.client.logger.warn(
'sqlite3 only supports serializable transactions, ignoring the isolation level param'
);
}
// SQLite infers read vs write transactions from the statement operation
// https://www.sqlite.org/lang_transaction.html#read_transactions_versus_write_transactions
if (this.readOnly) {
this.client.logger.warn(
'sqlite3 implicitly handles read vs write transactions'
);
}
return this.query(conn, 'BEGIN;');
}
}
module.exports = Transaction_Sqlite;

View File

@ -0,0 +1,250 @@
// SQLite3
// -------
const defaults = require('lodash/defaults');
const map = require('lodash/map');
const { promisify } = require('util');
const Client = require('../../client');
const Raw = require('../../raw');
const Transaction = require('./execution/sqlite-transaction');
const SqliteQueryCompiler = require('./query/sqlite-querycompiler');
const SchemaCompiler = require('./schema/sqlite-compiler');
const ColumnCompiler = require('./schema/sqlite-columncompiler');
const TableCompiler = require('./schema/sqlite-tablecompiler');
const ViewCompiler = require('./schema/sqlite-viewcompiler');
const SQLite3_DDL = require('./schema/ddl');
const Formatter = require('../../formatter');
const QueryBuilder = require('./query/sqlite-querybuilder');
class Client_SQLite3 extends Client {
constructor(config) {
super(config);
if (config.connection && config.connection.filename === undefined) {
this.logger.warn(
'Could not find `connection.filename` in config. Please specify ' +
'the database path and name to avoid errors. ' +
'(see docs https://knexjs.org/guide/#configuration-options)'
);
}
if (config.useNullAsDefault === undefined) {
this.logger.warn(
'sqlite does not support inserting default values. Set the ' +
'`useNullAsDefault` flag to hide this warning. ' +
'(see docs https://knexjs.org/guide/query-builder.html#insert).'
);
}
}
_driver() {
return require('sqlite3');
}
schemaCompiler() {
return new SchemaCompiler(this, ...arguments);
}
transaction() {
return new Transaction(this, ...arguments);
}
queryCompiler(builder, formatter) {
return new SqliteQueryCompiler(this, builder, formatter);
}
queryBuilder() {
return new QueryBuilder(this);
}
viewCompiler(builder, formatter) {
return new ViewCompiler(this, builder, formatter);
}
columnCompiler() {
return new ColumnCompiler(this, ...arguments);
}
tableCompiler() {
return new TableCompiler(this, ...arguments);
}
ddl(compiler, pragma, connection) {
return new SQLite3_DDL(this, compiler, pragma, connection);
}
wrapIdentifierImpl(value) {
return value !== '*' ? `\`${value.replace(/`/g, '``')}\`` : '*';
}
// Get a raw connection from the database, returning a promise with the connection object.
acquireRawConnection() {
return new Promise((resolve, reject) => {
// the default mode for sqlite3
let flags = this.driver.OPEN_READWRITE | this.driver.OPEN_CREATE;
if (this.connectionSettings.flags) {
if (!Array.isArray(this.connectionSettings.flags)) {
throw new Error(`flags must be an array of strings`);
}
this.connectionSettings.flags.forEach((_flag) => {
if (!_flag.startsWith('OPEN_') || !this.driver[_flag]) {
throw new Error(`flag ${_flag} not supported by node-sqlite3`);
}
flags = flags | this.driver[_flag];
});
}
const db = new this.driver.Database(
this.connectionSettings.filename,
flags,
(err) => {
if (err) {
return reject(err);
}
resolve(db);
}
);
});
}
// Used to explicitly close a connection, called internally by the pool when
// a connection times out or the pool is shutdown.
async destroyRawConnection(connection) {
const close = promisify((cb) => connection.close(cb));
return close();
}
// Runs the query on the specified connection, providing the bindings and any
// other necessary prep work.
_query(connection, obj) {
if (!obj.sql) throw new Error('The query is empty');
const { method } = obj;
let callMethod;
switch (method) {
case 'insert':
case 'update':
callMethod = obj.returning ? 'all' : 'run';
break;
case 'counter':
case 'del':
callMethod = 'run';
break;
default:
callMethod = 'all';
}
return new Promise(function (resolver, rejecter) {
if (!connection || !connection[callMethod]) {
return rejecter(
new Error(`Error calling ${callMethod} on connection.`)
);
}
connection[callMethod](obj.sql, obj.bindings, function (err, response) {
if (err) return rejecter(err);
obj.response = response;
// We need the context here, as it contains
// the "this.lastID" or "this.changes"
obj.context = this;
return resolver(obj);
});
});
}
_stream(connection, obj, stream) {
if (!obj.sql) throw new Error('The query is empty');
const client = this;
return new Promise(function (resolver, rejecter) {
stream.on('error', rejecter);
stream.on('end', resolver);
return client
._query(connection, obj)
.then((obj) => obj.response)
.then((rows) => rows.forEach((row) => stream.write(row)))
.catch(function (err) {
stream.emit('error', err);
})
.then(function () {
stream.end();
});
});
}
// Ensures the response is returned in the same format as other clients.
processResponse(obj, runner) {
const ctx = obj.context;
const { response, returning } = obj;
if (obj.output) return obj.output.call(runner, response);
switch (obj.method) {
case 'select':
return response;
case 'first':
return response[0];
case 'pluck':
return map(response, obj.pluck);
case 'insert': {
if (returning) {
if (response) {
return response;
}
}
return [ctx.lastID];
}
case 'update': {
if (returning) {
if (response) {
return response;
}
}
return ctx.changes;
}
case 'del':
case 'counter':
return ctx.changes;
default: {
return response;
}
}
}
poolDefaults() {
return defaults({ min: 1, max: 1 }, super.poolDefaults());
}
formatter(builder) {
return new Formatter(this, builder);
}
values(values, builder, formatter) {
if (Array.isArray(values)) {
if (Array.isArray(values[0])) {
return `( values ${values
.map(
(value) =>
`(${this.parameterize(value, undefined, builder, formatter)})`
)
.join(', ')})`;
}
return `(${this.parameterize(values, undefined, builder, formatter)})`;
}
if (values instanceof Raw) {
return `(${this.parameter(values, builder, formatter)})`;
}
return this.parameter(values, builder, formatter);
}
}
Object.assign(Client_SQLite3.prototype, {
dialect: 'sqlite3',
driverName: 'sqlite3',
});
module.exports = Client_SQLite3;

View File

@ -0,0 +1,33 @@
const QueryBuilder = require('../../../query/querybuilder.js');
module.exports = class QueryBuilder_SQLite3 extends QueryBuilder {
withMaterialized(alias, statementOrColumnList, nothingOrStatement) {
this._validateWithArgs(
alias,
statementOrColumnList,
nothingOrStatement,
'with'
);
return this.withWrapped(
alias,
statementOrColumnList,
nothingOrStatement,
true
);
}
withNotMaterialized(alias, statementOrColumnList, nothingOrStatement) {
this._validateWithArgs(
alias,
statementOrColumnList,
nothingOrStatement,
'with'
);
return this.withWrapped(
alias,
statementOrColumnList,
nothingOrStatement,
false
);
}
};

View File

@ -0,0 +1,334 @@
// SQLite3 Query Builder & Compiler
const constant = require('lodash/constant');
const each = require('lodash/each');
const identity = require('lodash/identity');
const isEmpty = require('lodash/isEmpty');
const reduce = require('lodash/reduce');
const QueryCompiler = require('../../../query/querycompiler');
const noop = require('../../../util/noop');
const { isString } = require('../../../util/is');
const {
wrapString,
columnize: columnize_,
} = require('../../../formatter/wrappingFormatter');
const emptyStr = constant('');
class QueryCompiler_SQLite3 extends QueryCompiler {
constructor(client, builder, formatter) {
super(client, builder, formatter);
// The locks are not applicable in SQLite3
this.forShare = emptyStr;
this.forKeyShare = emptyStr;
this.forUpdate = emptyStr;
this.forNoKeyUpdate = emptyStr;
}
// SQLite requires us to build a multi-row insert as a list of selects
// joined together with unions. So we'll build out this list of columns and
// then join them all together with select unions to complete the query.
insert() {
const insertValues = this.single.insert || [];
let sql = this.with() + `insert into ${this.tableName} `;
if (Array.isArray(insertValues)) {
if (insertValues.length === 0) {
return '';
} else if (
insertValues.length === 1 &&
insertValues[0] &&
isEmpty(insertValues[0])
) {
return {
sql: sql + this._emptyInsertValue,
};
}
} else if (typeof insertValues === 'object' && isEmpty(insertValues)) {
return {
sql: sql + this._emptyInsertValue,
};
}
const insertData = this._prepInsert(insertValues);
if (isString(insertData)) {
return {
sql: sql + insertData,
};
}
if (insertData.columns.length === 0) {
return {
sql: '',
};
}
sql += `(${this.formatter.columnize(insertData.columns)})`;
// backwards compatible error
if (this.client.valueForUndefined !== null) {
insertData.values.forEach((bindings) => {
each(bindings, (binding) => {
if (binding === undefined)
throw new TypeError(
'`sqlite` does not support inserting default values. Specify ' +
'values explicitly or use the `useNullAsDefault` config flag. ' +
'(see docs https://knexjs.org/guide/query-builder.html#insert).'
);
});
});
}
if (insertData.values.length === 1) {
const parameters = this.client.parameterize(
insertData.values[0],
this.client.valueForUndefined,
this.builder,
this.bindingsHolder
);
sql += ` values (${parameters})`;
const { onConflict, ignore, merge } = this.single;
if (onConflict && ignore) sql += this._ignore(onConflict);
else if (onConflict && merge) {
sql += this._merge(merge.updates, onConflict, insertValues);
const wheres = this.where();
if (wheres) sql += ` ${wheres}`;
}
const { returning } = this.single;
if (returning) {
sql += this._returning(returning);
}
return {
sql,
returning,
};
}
const blocks = [];
let i = -1;
while (++i < insertData.values.length) {
let i2 = -1;
const block = (blocks[i] = []);
let current = insertData.values[i];
current = current === undefined ? this.client.valueForUndefined : current;
while (++i2 < insertData.columns.length) {
block.push(
this.client.alias(
this.client.parameter(
current[i2],
this.builder,
this.bindingsHolder
),
this.formatter.wrap(insertData.columns[i2])
)
);
}
blocks[i] = block.join(', ');
}
sql += ' select ' + blocks.join(' union all select ');
const { onConflict, ignore, merge } = this.single;
if (onConflict && ignore) sql += ' where true' + this._ignore(onConflict);
else if (onConflict && merge) {
sql +=
' where true' + this._merge(merge.updates, onConflict, insertValues);
}
const { returning } = this.single;
if (returning) sql += this._returning(returning);
return {
sql,
returning,
};
}
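// Illustrative sketch (assumed shape): a two-row insert such as
//   knex('users').insert([{ name: 'a' }, { name: 'b' }])
// compiles along the lines of
//   insert into `users` (`name`) select ? as `name` union all select ? as `name`
// one aliased select block per row, per the union strategy described above.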
// Compiles an `update` query, allowing for a return value.
update() {
const withSQL = this.with();
const updateData = this._prepUpdate(this.single.update);
const wheres = this.where();
const { returning } = this.single;
return {
sql:
withSQL +
`update ${this.single.only ? 'only ' : ''}${this.tableName} ` +
`set ${updateData.join(', ')}` +
(wheres ? ` ${wheres}` : '') +
this._returning(returning),
returning,
};
}
_ignore(columns) {
if (columns === true) {
return ' on conflict do nothing';
}
return ` on conflict ${this._onConflictClause(columns)} do nothing`;
}
_merge(updates, columns, insert) {
let sql = ` on conflict ${this._onConflictClause(columns)} do update set `;
if (updates && Array.isArray(updates)) {
sql += updates
.map((column) =>
wrapString(
column.split('.').pop(),
this.formatter.builder,
this.client,
this.formatter
)
)
.map((column) => `${column} = excluded.${column}`)
.join(', ');
return sql;
} else if (updates && typeof updates === 'object') {
const updateData = this._prepUpdate(updates);
if (typeof updateData === 'string') {
sql += updateData;
} else {
sql += updateData.join(',');
}
return sql;
} else {
const insertData = this._prepInsert(insert);
if (typeof insertData === 'string') {
throw new Error(
'If using merge with a raw insert query, then updates must be provided'
);
}
sql += insertData.columns
.map((column) =>
wrapString(column.split('.').pop(), this.builder, this.client)
)
.map((column) => `${column} = excluded.${column}`)
.join(', ');
return sql;
}
}
_returning(value) {
return value ? ` returning ${this.formatter.columnize(value)}` : '';
}
// Compile a truncate table statement into SQL.
truncate() {
const { table } = this.single;
return {
sql: `delete from ${this.tableName}`,
output() {
return this.query({
sql: `delete from sqlite_sequence where name = '${table}'`,
}).catch(noop);
},
};
}
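// Illustrative note (assumed rationale): after the delete, output() resets the
// autoincrement counter by removing the table's row from sqlite_sequence;
// catch(noop) likely covers databases where sqlite_sequence does not exist
// because no autoincrement column was ever created.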
// Compiles a `columnInfo` query
columnInfo() {
const column = this.single.columnInfo;
// The user may have specified a custom wrapIdentifier function in the config. We
// need to run the identifiers through that function, but not format them as
// identifiers otherwise.
const table = this.client.customWrapIdentifier(this.single.table, identity);
return {
sql: `PRAGMA table_info(\`${table}\`)`,
output(resp) {
const maxLengthRegex = /.*\((\d+)\)/;
const out = reduce(
resp,
function (columns, val) {
let { type } = val;
let maxLength = type.match(maxLengthRegex);
if (maxLength) {
maxLength = maxLength[1];
}
type = maxLength ? type.split('(')[0] : type;
columns[val.name] = {
type: type.toLowerCase(),
maxLength,
nullable: !val.notnull,
defaultValue: val.dflt_value,
};
return columns;
},
{}
);
return (column && out[column]) || out;
},
};
}
limit() {
const noLimit = !this.single.limit && this.single.limit !== 0;
if (noLimit && !this.single.offset) return '';
// Workaround for offset only,
// see http://stackoverflow.com/questions/10491492/sqllite-with-skip-offset-only-not-limit
this.single.limit = noLimit ? -1 : this.single.limit;
return `limit ${this._getValueOrParameterFromAttribute('limit')}`;
}
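// Illustrative sketch (assumed bindings): .offset(10) with no limit compiles
// roughly to 'limit ? offset ?' with bindings [-1, 10]; SQLite treats a
// negative limit as "no limit", which is the workaround referenced above.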
// Json functions
jsonExtract(params) {
return this._jsonExtract('json_extract', params);
}
jsonSet(params) {
return this._jsonSet('json_set', params);
}
jsonInsert(params) {
return this._jsonSet('json_insert', params);
}
jsonRemove(params) {
const jsonCol = `json_remove(${columnize_(
params.column,
this.builder,
this.client,
this.bindingsHolder
)},${this.client.parameter(
params.path,
this.builder,
this.bindingsHolder
)})`;
return params.alias
? this.client.alias(jsonCol, this.formatter.wrap(params.alias))
: jsonCol;
}
whereJsonPath(statement) {
return this._whereJsonPath('json_extract', statement);
}
whereJsonSupersetOf(statement) {
throw new Error(
'Json superset where clause not actually supported by SQLite'
);
}
whereJsonSubsetOf(statement) {
throw new Error(
'Json subset where clause not actually supported by SQLite'
);
}
onJsonPathEquals(clause) {
return this._onJsonPathEquals('json_extract', clause);
}
}
module.exports = QueryCompiler_SQLite3;

View File

@ -0,0 +1,400 @@
// SQLite3_DDL
//
// All of the SQLite3 specific DDL helpers for renaming/dropping
// columns and changing datatypes.
// -------
const identity = require('lodash/identity');
const { nanonum } = require('../../../util/nanoid');
const {
copyData,
dropOriginal,
renameTable,
getTableSql,
isForeignCheckEnabled,
setForeignCheck,
executeForeignCheck,
} = require('./internal/sqlite-ddl-operations');
const { parseCreateTable, parseCreateIndex } = require('./internal/parser');
const {
compileCreateTable,
compileCreateIndex,
} = require('./internal/compiler');
const { isEqualId, includesId } = require('./internal/utils');
// Altering the schema in SQLite3 is a major pain.
// We have our own object to deal with renaming tables and altering column
// types for sqlite3.
class SQLite3_DDL {
constructor(client, tableCompiler, pragma, connection) {
this.client = client;
this.tableCompiler = tableCompiler;
this.pragma = pragma;
this.tableNameRaw = this.tableCompiler.tableNameRaw;
this.alteredName = `_knex_temp_alter${nanonum(3)}`;
this.connection = connection;
this.formatter = (value) =>
this.client.customWrapIdentifier(value, identity);
this.wrap = (value) => this.client.wrapIdentifierImpl(value);
}
tableName() {
return this.formatter(this.tableNameRaw);
}
getTableSql() {
const tableName = this.tableName();
return this.client.transaction(
async (trx) => {
trx.disableProcessing();
const result = await trx.raw(getTableSql(tableName));
trx.enableProcessing();
return {
createTable: result.filter((create) => create.type === 'table')[0]
.sql,
createIndices: result
.filter((create) => create.type === 'index')
.map((create) => create.sql),
};
},
{ connection: this.connection }
);
}
async isForeignCheckEnabled() {
const result = await this.client
.raw(isForeignCheckEnabled())
.connection(this.connection);
return result[0].foreign_keys === 1;
}
async setForeignCheck(enable) {
await this.client.raw(setForeignCheck(enable)).connection(this.connection);
}
renameTable(trx) {
return trx.raw(renameTable(this.alteredName, this.tableName()));
}
dropOriginal(trx) {
return trx.raw(dropOriginal(this.tableName()));
}
copyData(trx, columns) {
return trx.raw(copyData(this.tableName(), this.alteredName, columns));
}
async alterColumn(columns) {
const { createTable, createIndices } = await this.getTableSql();
const parsedTable = parseCreateTable(createTable);
parsedTable.table = this.alteredName;
parsedTable.columns = parsedTable.columns.map((column) => {
const newColumnInfo = columns.find((c) => isEqualId(c.name, column.name));
if (newColumnInfo) {
column.type = newColumnInfo.type;
column.constraints.default =
newColumnInfo.defaultTo !== null
? {
name: null,
value: newColumnInfo.defaultTo,
expression: false,
}
: null;
column.constraints.notnull = newColumnInfo.notNull
? { name: null, conflict: null }
: null;
column.constraints.null = newColumnInfo.notNull
? null
: column.constraints.null;
}
return column;
});
const newTable = compileCreateTable(parsedTable, this.wrap);
return this.generateAlterCommands(newTable, createIndices);
}
async dropColumn(columns) {
const { createTable, createIndices } = await this.getTableSql();
const parsedTable = parseCreateTable(createTable);
parsedTable.table = this.alteredName;
parsedTable.columns = parsedTable.columns.filter(
(parsedColumn) =>
parsedColumn.expression || !includesId(columns, parsedColumn.name)
);
if (parsedTable.columns.length === 0) {
throw new Error('Unable to drop last column from table');
}
parsedTable.constraints = parsedTable.constraints.filter((constraint) => {
if (constraint.type === 'PRIMARY KEY' || constraint.type === 'UNIQUE') {
return constraint.columns.every(
(constraintColumn) =>
constraintColumn.expression ||
!includesId(columns, constraintColumn.name)
);
} else if (constraint.type === 'FOREIGN KEY') {
return (
constraint.columns.every(
(constraintColumnName) => !includesId(columns, constraintColumnName)
) &&
(constraint.references.table !== parsedTable.table ||
constraint.references.columns.every(
(referenceColumnName) => !includesId(columns, referenceColumnName)
))
);
} else {
return true;
}
});
const newColumns = parsedTable.columns.map((column) => column.name);
const newTable = compileCreateTable(parsedTable, this.wrap);
const newIndices = [];
for (const createIndex of createIndices) {
const parsedIndex = parseCreateIndex(createIndex);
parsedIndex.columns = parsedIndex.columns.filter(
(parsedColumn) =>
parsedColumn.expression || !includesId(columns, parsedColumn.name)
);
if (parsedIndex.columns.length > 0) {
newIndices.push(compileCreateIndex(parsedIndex, this.wrap));
}
}
return this.alter(newTable, newIndices, newColumns);
}
async dropForeign(columns, foreignKeyName) {
const { createTable, createIndices } = await this.getTableSql();
const parsedTable = parseCreateTable(createTable);
parsedTable.table = this.alteredName;
if (!foreignKeyName) {
parsedTable.columns = parsedTable.columns.map((column) => ({
...column,
references: includesId(columns, column.name) ? null : column.references,
}));
}
parsedTable.constraints = parsedTable.constraints.filter((constraint) => {
if (constraint.type === 'FOREIGN KEY') {
if (foreignKeyName) {
return (
!constraint.name || !isEqualId(constraint.name, foreignKeyName)
);
}
return constraint.columns.every(
(constraintColumnName) => !includesId(columns, constraintColumnName)
);
} else {
return true;
}
});
const newTable = compileCreateTable(parsedTable, this.wrap);
return this.alter(newTable, createIndices);
}
async dropPrimary(constraintName) {
const { createTable, createIndices } = await this.getTableSql();
const parsedTable = parseCreateTable(createTable);
parsedTable.table = this.alteredName;
parsedTable.columns = parsedTable.columns.map((column) => ({
...column,
primary: null,
}));
parsedTable.constraints = parsedTable.constraints.filter((constraint) => {
if (constraint.type === 'PRIMARY KEY') {
if (constraintName) {
return (
!constraint.name || !isEqualId(constraint.name, constraintName)
);
} else {
return false;
}
} else {
return true;
}
});
const newTable = compileCreateTable(parsedTable, this.wrap);
return this.alter(newTable, createIndices);
}
async primary(columns, constraintName) {
const { createTable, createIndices } = await this.getTableSql();
const parsedTable = parseCreateTable(createTable);
parsedTable.table = this.alteredName;
parsedTable.columns = parsedTable.columns.map((column) => ({
...column,
primary: null,
}));
parsedTable.constraints = parsedTable.constraints.filter(
(constraint) => constraint.type !== 'PRIMARY KEY'
);
parsedTable.constraints.push({
type: 'PRIMARY KEY',
name: constraintName || null,
columns: columns.map((column) => ({
name: column,
expression: false,
collation: null,
order: null,
})),
conflict: null,
});
const newTable = compileCreateTable(parsedTable, this.wrap);
return this.alter(newTable, createIndices);
}
async foreign(foreignInfo) {
const { createTable, createIndices } = await this.getTableSql();
const parsedTable = parseCreateTable(createTable);
parsedTable.table = this.alteredName;
parsedTable.constraints.push({
type: 'FOREIGN KEY',
name: foreignInfo.keyName || null,
columns: foreignInfo.column,
references: {
table: foreignInfo.inTable,
columns: foreignInfo.references,
delete: foreignInfo.onDelete || null,
update: foreignInfo.onUpdate || null,
match: null,
deferrable: null,
},
});
const newTable = compileCreateTable(parsedTable, this.wrap);
return this.generateAlterCommands(newTable, createIndices);
}
async setNullable(column, isNullable) {
const { createTable, createIndices } = await this.getTableSql();
const parsedTable = parseCreateTable(createTable);
parsedTable.table = this.alteredName;
const parsedColumn = parsedTable.columns.find((c) =>
isEqualId(column, c.name)
);
if (!parsedColumn) {
throw new Error(
`.setNullable: Column ${column} does not exist in table ${this.tableName()}.`
);
}
parsedColumn.constraints.notnull = isNullable
? null
: { name: null, conflict: null };
parsedColumn.constraints.null = isNullable
? parsedColumn.constraints.null
: null;
const newTable = compileCreateTable(parsedTable, this.wrap);
return this.generateAlterCommands(newTable, createIndices);
}
async alter(newSql, createIndices, columns) {
const wasForeignCheckEnabled = await this.isForeignCheckEnabled();
if (wasForeignCheckEnabled) {
await this.setForeignCheck(false);
}
try {
await this.client.transaction(
async (trx) => {
await trx.raw(newSql);
await this.copyData(trx, columns);
await this.dropOriginal(trx);
await this.renameTable(trx);
for (const createIndex of createIndices) {
await trx.raw(createIndex);
}
if (wasForeignCheckEnabled) {
const foreignViolations = await trx.raw(executeForeignCheck());
if (foreignViolations.length > 0) {
throw new Error('FOREIGN KEY constraint failed');
}
}
},
{ connection: this.connection }
);
} finally {
if (wasForeignCheckEnabled) {
await this.setForeignCheck(true);
}
}
}
async generateAlterCommands(newSql, createIndices, columns) {
const sql = [];
const pre = [];
const post = [];
let check = null;
sql.push(newSql);
sql.push(copyData(this.tableName(), this.alteredName, columns));
sql.push(dropOriginal(this.tableName()));
sql.push(renameTable(this.alteredName, this.tableName()));
for (const createIndex of createIndices) {
sql.push(createIndex);
}
const isForeignCheckEnabled = await this.isForeignCheckEnabled();
if (isForeignCheckEnabled) {
pre.push(setForeignCheck(false));
post.push(setForeignCheck(true));
check = executeForeignCheck();
}
return { pre, sql, check, post };
}
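// Illustrative plan shape (assumption: the internal helpers emit the usual
// SQLite PRAGMAs; placeholder names are hypothetical):
//   { pre: ['PRAGMA foreign_keys = off'],
//     sql: [createAltered, copyData, dropOriginal, renameTable, ...indexes],
//     check: 'PRAGMA foreign_key_check',
//     post: ['PRAGMA foreign_keys = on'] }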
}
module.exports = SQLite3_DDL;

View File

@ -0,0 +1,327 @@
function compileCreateTable(ast, wrap = (v) => v) {
return createTable(ast, wrap);
}
function compileCreateIndex(ast, wrap = (v) => v) {
return createIndex(ast, wrap);
}
function createTable(ast, wrap) {
return `CREATE${temporary(ast, wrap)} TABLE${exists(ast, wrap)} ${schema(
ast,
wrap
)}${table(ast, wrap)} (${columnDefinitionList(
ast,
wrap
)}${tableConstraintList(ast, wrap)})${rowid(ast, wrap)}`;
}
function temporary(ast, wrap) {
return ast.temporary ? ' TEMP' : '';
}
function rowid(ast, wrap) {
return ast.rowid ? ' WITHOUT ROWID' : '';
}
function columnDefinitionList(ast, wrap) {
return ast.columns.map((column) => columnDefinition(column, wrap)).join(', ');
}
function columnDefinition(ast, wrap) {
return `${identifier(ast.name, wrap)}${typeName(
ast,
wrap
)}${columnConstraintList(ast.constraints, wrap)}`;
}
function typeName(ast, wrap) {
return ast.type !== null ? ` ${ast.type}` : '';
}
function columnConstraintList(ast, wrap) {
return `${primaryColumnConstraint(ast, wrap)}${notnullColumnConstraint(
ast,
wrap
)}${nullColumnConstraint(ast, wrap)}${uniqueColumnConstraint(
ast,
wrap
)}${checkColumnConstraint(ast, wrap)}${defaultColumnConstraint(
ast,
wrap
)}${collateColumnConstraint(ast, wrap)}${referencesColumnConstraint(
ast,
wrap
)}${asColumnConstraint(ast, wrap)}`;
}
function primaryColumnConstraint(ast, wrap) {
return ast.primary !== null
? ` ${constraintName(ast.primary, wrap)}PRIMARY KEY${order(
ast.primary,
wrap
)}${conflictClause(ast.primary, wrap)}${autoincrement(ast.primary, wrap)}`
: '';
}
function autoincrement(ast, wrap) {
return ast.autoincrement ? ' AUTOINCREMENT' : '';
}
function notnullColumnConstraint(ast, wrap) {
return ast.notnull !== null
? ` ${constraintName(ast.notnull, wrap)}NOT NULL${conflictClause(
ast.notnull,
wrap
)}`
: '';
}
function nullColumnConstraint(ast, wrap) {
return ast.null !== null
? ` ${constraintName(ast.null, wrap)}NULL${conflictClause(ast.null, wrap)}`
: '';
}
function uniqueColumnConstraint(ast, wrap) {
return ast.unique !== null
? ` ${constraintName(ast.unique, wrap)}UNIQUE${conflictClause(
ast.unique,
wrap
)}`
: '';
}
function checkColumnConstraint(ast, wrap) {
return ast.check !== null
? ` ${constraintName(ast.check, wrap)}CHECK (${expression(
ast.check.expression,
wrap
)})`
: '';
}
function defaultColumnConstraint(ast, wrap) {
return ast.default !== null
? ` ${constraintName(ast.default, wrap)}DEFAULT ${
!ast.default.expression
? ast.default.value
: `(${expression(ast.default.value, wrap)})`
}`
: '';
}
function collateColumnConstraint(ast, wrap) {
return ast.collate !== null
? ` ${constraintName(ast.collate, wrap)}COLLATE ${ast.collate.collation}`
: '';
}
function referencesColumnConstraint(ast, wrap) {
return ast.references !== null
? ` ${constraintName(ast.references, wrap)}${foreignKeyClause(
ast.references,
wrap
)}`
: '';
}
function asColumnConstraint(ast, wrap) {
return ast.as !== null
? ` ${constraintName(ast.as, wrap)}${
ast.as.generated ? 'GENERATED ALWAYS ' : ''
}AS (${expression(ast.as.expression, wrap)})${
ast.as.mode !== null ? ` ${ast.as.mode}` : ''
}`
: '';
}
function tableConstraintList(ast, wrap) {
return ast.constraints.reduce(
(constraintList, constraint) =>
`${constraintList}, ${tableConstraint(constraint, wrap)}`,
''
);
}
function tableConstraint(ast, wrap) {
switch (ast.type) {
case 'PRIMARY KEY':
return primaryTableConstraint(ast, wrap);
case 'UNIQUE':
return uniqueTableConstraint(ast, wrap);
case 'CHECK':
return checkTableConstraint(ast, wrap);
case 'FOREIGN KEY':
return foreignTableConstraint(ast, wrap);
}
}
function primaryTableConstraint(ast, wrap) {
return `${constraintName(ast, wrap)}PRIMARY KEY (${indexedColumnList(
ast,
wrap
)})${conflictClause(ast, wrap)}`;
}
function uniqueTableConstraint(ast, wrap) {
return `${constraintName(ast, wrap)}UNIQUE (${indexedColumnList(
ast,
wrap
)})${conflictClause(ast, wrap)}`;
}
function conflictClause(ast, wrap) {
return ast.conflict !== null ? ` ON CONFLICT ${ast.conflict}` : '';
}
function checkTableConstraint(ast, wrap) {
return `${constraintName(ast, wrap)}CHECK (${expression(
ast.expression,
wrap
)})`;
}
function foreignTableConstraint(ast, wrap) {
return `${constraintName(ast, wrap)}FOREIGN KEY (${columnNameList(
ast,
wrap
)}) ${foreignKeyClause(ast.references, wrap)}`;
}
function foreignKeyClause(ast, wrap) {
return `REFERENCES ${table(ast, wrap)}${columnNameListOptional(
ast,
wrap
)}${deleteUpdateMatchList(ast, wrap)}${deferrable(ast.deferrable, wrap)}`;
}
function columnNameListOptional(ast, wrap) {
return ast.columns.length > 0 ? ` (${columnNameList(ast, wrap)})` : '';
}
function columnNameList(ast, wrap) {
return ast.columns.map((column) => identifier(column, wrap)).join(', ');
}
function deleteUpdateMatchList(ast, wrap) {
return `${deleteReference(ast, wrap)}${updateReference(
ast,
wrap
)}${matchReference(ast, wrap)}`;
}
function deleteReference(ast, wrap) {
return ast.delete !== null ? ` ON DELETE ${ast.delete}` : '';
}
function updateReference(ast, wrap) {
return ast.update !== null ? ` ON UPDATE ${ast.update}` : '';
}
function matchReference(ast, wrap) {
return ast.match !== null ? ` MATCH ${ast.match}` : '';
}
function deferrable(ast, wrap) {
return ast !== null
? ` ${ast.not ? 'NOT ' : ''}DEFERRABLE${
ast.initially !== null ? ` INITIALLY ${ast.initially}` : ''
}`
: '';
}
function constraintName(ast, wrap) {
return ast.name !== null ? `CONSTRAINT ${identifier(ast.name, wrap)} ` : '';
}
function createIndex(ast, wrap) {
return `CREATE${unique(ast, wrap)} INDEX${exists(ast, wrap)} ${schema(
ast,
wrap
)}${index(ast, wrap)} on ${table(ast, wrap)} (${indexedColumnList(
ast,
wrap
)})${where(ast, wrap)}`;
}
function unique(ast, wrap) {
return ast.unique ? ' UNIQUE' : '';
}
function exists(ast, wrap) {
return ast.exists ? ' IF NOT EXISTS' : '';
}
function schema(ast, wrap) {
return ast.schema !== null ? `${identifier(ast.schema, wrap)}.` : '';
}
function index(ast, wrap) {
return identifier(ast.index, wrap);
}
function table(ast, wrap) {
return identifier(ast.table, wrap);
}
function where(ast, wrap) {
return ast.where !== null ? ` where ${expression(ast.where)}` : '';
}
function indexedColumnList(ast, wrap) {
return ast.columns
.map((column) =>
!column.expression
? indexedColumn(column, wrap)
: indexedColumnExpression(column, wrap)
)
.join(', ');
}
function indexedColumn(ast, wrap) {
return `${identifier(ast.name, wrap)}${collation(ast, wrap)}${order(
ast,
wrap
)}`;
}
function indexedColumnExpression(ast, wrap) {
return `${indexedExpression(ast.name, wrap)}${collation(ast, wrap)}${order(
ast,
wrap
)}`;
}
function collation(ast, wrap) {
return ast.collation !== null ? ` COLLATE ${ast.collation}` : '';
}
function order(ast, wrap) {
return ast.order !== null ? ` ${ast.order}` : '';
}
function indexedExpression(ast, wrap) {
return expression(ast, wrap);
}
function expression(ast, wrap) {
return ast.reduce(
(expr, e) =>
Array.isArray(e)
? `${expr}(${expression(e)})`
: !expr
? e
: `${expr} ${e}`,
''
);
}
function identifier(ast, wrap) {
return wrap(ast);
}
module.exports = {
compileCreateTable,
compileCreateIndex,
};
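A round-trip sketch of what `compileCreateTable` expects: an AST in the shape produced by `parseCreateTable` (defined two files below), with every constraint slot present. The require path is illustrative, since file paths are not visible in this listing:

```js
const { compileCreateTable } = require('./sqlite-ddl-compiler'); // illustrative path

const ast = {
  temporary: false,
  exists: false,
  schema: null,
  table: 'users',
  columns: [
    {
      name: 'id',
      type: 'integer',
      constraints: {
        primary: { name: null, order: null, conflict: null, autoincrement: true },
        notnull: { name: null, conflict: null },
        null: null, unique: null, check: null, default: null,
        collate: null, references: null, as: null,
      },
    },
  ],
  constraints: [],
  rowid: false,
};

console.log(compileCreateTable(ast, (id) => `"${id}"`));
// => CREATE TABLE "users" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL)
```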

View File

@@ -0,0 +1,161 @@
// Sequence parser combinator
function s(sequence, post = (v) => v) {
return function ({ index = 0, input }) {
let position = index;
const ast = [];
for (const parser of sequence) {
const result = parser({ index: position, input });
if (result.success) {
position = result.index;
ast.push(result.ast);
} else {
return result;
}
}
return { success: true, ast: post(ast), index: position, input };
};
}
// Alternative parser combinator
function a(alternative, post = (v) => v) {
return function ({ index = 0, input }) {
for (const parser of alternative) {
const result = parser({ index, input });
if (result.success) {
return {
success: true,
ast: post(result.ast),
index: result.index,
input,
};
}
}
return { success: false, ast: null, index, input };
};
}
// Many parser combinator
function m(many, post = (v) => v) {
return function ({ index = 0, input }) {
let result = {};
let position = index;
const ast = [];
do {
result = many({ index: position, input });
if (result.success) {
position = result.index;
ast.push(result.ast);
}
} while (result.success);
if (ast.length > 0) {
return { success: true, ast: post(ast), index: position, input };
} else {
return { success: false, ast: null, index: position, input };
}
};
}
// Optional parser combinator
function o(optional, post = (v) => v) {
return function ({ index = 0, input }) {
const result = optional({ index, input });
if (result.success) {
return {
success: true,
ast: post(result.ast),
index: result.index,
input,
};
} else {
return { success: true, ast: post(null), index, input };
}
};
}
// Lookahead parser combinator
function l(lookahead, post = (v) => v) {
return function ({ index = 0, input }) {
const result = lookahead.do({ index, input });
if (result.success) {
const resultNext = lookahead.next({ index: result.index, input });
if (resultNext.success) {
return {
success: true,
ast: post(result.ast),
index: result.index,
input,
};
}
}
return { success: false, ast: null, index, input };
};
}
// Negative parser combinator
function n(negative, post = (v) => v) {
return function ({ index = 0, input }) {
const result = negative.do({ index, input });
if (result.success) {
const resultNot = negative.not({ index, input });
if (!resultNot.success) {
return {
success: true,
ast: post(result.ast),
index: result.index,
input,
};
}
}
return { success: false, ast: null, index, input };
};
}
// Token parser combinator
function t(token, post = (v) => v.text) {
return function ({ index = 0, input }) {
const result = input[index];
if (
result !== undefined &&
(token.type === undefined || token.type === result.type) &&
(token.text === undefined ||
token.text.toUpperCase() === result.text.toUpperCase())
) {
return {
success: true,
ast: post(result),
index: index + 1,
input,
};
} else {
return { success: false, ast: null, index, input };
}
};
}
// Empty parser constant
const e = function ({ index = 0, input }) {
return { success: true, ast: null, index, input };
};
// Finish parser constant
const f = function ({ index = 0, input }) {
return { success: index === input.length, ast: null, index, input };
};
module.exports = { s, a, m, o, l, n, t, e, f };
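These nine primitives are all the grammar file below is built from: `s`equence, `a`lternative, `m`any, `o`ptional, `l`ookahead, `n`egative lookahead, `t`oken, plus the `e`mpty and `f`inish constants. A minimal sketch of how they compose over a pre-tokenized input (assuming the combinators above are in scope):

```js
// Parse exactly the two tokens "NOT NULL" and nothing else.
const input = [
  { type: 'keyword', text: 'NOT' },
  { type: 'keyword', text: 'NULL' },
];

const notNull = s(
  [t({ text: 'NOT' }), t({ text: 'NULL' }), f], // f asserts end of input
  () => ({ notnull: true })
);

console.log(notNull({ input }));
// { success: true, ast: { notnull: true }, index: 2, input: [...] }
```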

View File

@@ -0,0 +1,638 @@
const { tokenize } = require('./tokenizer');
const { s, a, m, o, l, n, t, e, f } = require('./parser-combinator');
const TOKENS = {
keyword:
/(?:ABORT|ACTION|ADD|AFTER|ALL|ALTER|ALWAYS|ANALYZE|AND|AS|ASC|ATTACH|AUTOINCREMENT|BEFORE|BEGIN|BETWEEN|BY|CASCADE|CASE|CAST|CHECK|COLLATE|COLUMN|COMMIT|CONFLICT|CONSTRAINT|CREATE|CROSS|CURRENT|CURRENT_DATE|CURRENT_TIME|CURRENT_TIMESTAMP|DATABASE|DEFAULT|DEFERRED|DEFERRABLE|DELETE|DESC|DETACH|DISTINCT|DO|DROP|END|EACH|ELSE|ESCAPE|EXCEPT|EXCLUSIVE|EXCLUDE|EXISTS|EXPLAIN|FAIL|FILTER|FIRST|FOLLOWING|FOR|FOREIGN|FROM|FULL|GENERATED|GLOB|GROUP|GROUPS|HAVING|IF|IGNORE|IMMEDIATE|IN|INDEX|INDEXED|INITIALLY|INNER|INSERT|INSTEAD|INTERSECT|INTO|IS|ISNULL|JOIN|KEY|LAST|LEFT|LIKE|LIMIT|MATCH|MATERIALIZED|NATURAL|NO|NOT|NOTHING|NOTNULL|NULL|NULLS|OF|OFFSET|ON|OR|ORDER|OTHERS|OUTER|OVER|PARTITION|PLAN|PRAGMA|PRECEDING|PRIMARY|QUERY|RAISE|RANGE|RECURSIVE|REFERENCES|REGEXP|REINDEX|RELEASE|RENAME|REPLACE|RESTRICT|RETURNING|RIGHT|ROLLBACK|ROW|ROWS|SAVEPOINT|SELECT|SET|TABLE|TEMP|TEMPORARY|THEN|TIES|TO|TRANSACTION|TRIGGER|UNBOUNDED|UNION|UNIQUE|UPDATE|USING|VACUUM|VALUES|VIEW|VIRTUAL|WHEN|WHERE|WINDOW|WITH|WITHOUT)(?=\s+|-|\(|\)|;|\+|\*|\/|%|==|=|<=|<>|<<|<|>=|>>|>|!=|,|&|~|\|\||\||\.)/,
id: /"[^"]*(?:""[^"]*)*"|`[^`]*(?:``[^`]*)*`|\[[^[\]]*\]|[a-z_][a-z0-9_$]*/,
string: /'[^']*(?:''[^']*)*'/,
blob: /x'(?:[0-9a-f][0-9a-f])+'/,
numeric: /(?:\d+(?:\.\d*)?|\.\d+)(?:e(?:\+|-)?\d+)?|0x[0-9a-f]+/,
variable: /\?\d*|[@$:][a-z0-9_$]+/,
operator: /-|\(|\)|;|\+|\*|\/|%|==|=|<=|<>|<<|<|>=|>>|>|!=|,|&|~|\|\||\||\./,
_ws: /\s+/,
};
function parseCreateTable(sql) {
const result = createTable({ input: tokenize(sql, TOKENS) });
if (!result.success) {
throw new Error(
`Parsing CREATE TABLE failed at [${result.input
.slice(result.index)
.map((t) => t.text)
.join(' ')}] of "${sql}"`
);
}
return result.ast;
}
function parseCreateIndex(sql) {
const result = createIndex({ input: tokenize(sql, TOKENS) });
if (!result.success) {
throw new Error(
`Parsing CREATE INDEX failed at [${result.input
.slice(result.index)
.map((t) => t.text)
.join(' ')}] of "${sql}"`
);
}
return result.ast;
}
function createTable(ctx) {
return s(
[
t({ text: 'CREATE' }, (v) => null),
temporary,
t({ text: 'TABLE' }, (v) => null),
exists,
schema,
table,
t({ text: '(' }, (v) => null),
columnDefinitionList,
tableConstraintList,
t({ text: ')' }, (v) => null),
rowid,
f,
],
(v) => Object.assign({}, ...v.filter((x) => x !== null))
)(ctx);
}
function temporary(ctx) {
return a([t({ text: 'TEMP' }), t({ text: 'TEMPORARY' }), e], (v) => ({
temporary: v !== null,
}))(ctx);
}
function rowid(ctx) {
return o(s([t({ text: 'WITHOUT' }), t({ text: 'ROWID' })]), (v) => ({
rowid: v !== null,
}))(ctx);
}
function columnDefinitionList(ctx) {
return a([
s([columnDefinition, t({ text: ',' }), columnDefinitionList], (v) => ({
columns: [v[0]].concat(v[2].columns),
})),
s([columnDefinition], (v) => ({ columns: [v[0]] })),
])(ctx);
}
function columnDefinition(ctx) {
return s(
[s([identifier], (v) => ({ name: v[0] })), typeName, columnConstraintList],
(v) => Object.assign({}, ...v)
)(ctx);
}
function typeName(ctx) {
return o(
s(
[
m(t({ type: 'id' })),
a([
s(
[
t({ text: '(' }),
signedNumber,
t({ text: ',' }),
signedNumber,
t({ text: ')' }),
],
(v) => `(${v[1]}, ${v[3]})`
),
s(
[t({ text: '(' }), signedNumber, t({ text: ')' })],
(v) => `(${v[1]})`
),
e,
]),
],
(v) => `${v[0].join(' ')}${v[1] || ''}`
),
(v) => ({ type: v })
)(ctx);
}
function columnConstraintList(ctx) {
return o(m(columnConstraint), (v) => ({
constraints: Object.assign(
{
primary: null,
notnull: null,
null: null,
unique: null,
check: null,
default: null,
collate: null,
references: null,
as: null,
},
...(v || [])
),
}))(ctx);
}
function columnConstraint(ctx) {
return a([
primaryColumnConstraint,
notnullColumnConstraint,
nullColumnConstraint,
uniqueColumnConstraint,
checkColumnConstraint,
defaultColumnConstraint,
collateColumnConstraint,
referencesColumnConstraint,
asColumnConstraint,
])(ctx);
}
function primaryColumnConstraint(ctx) {
return s(
[
constraintName,
t({ text: 'PRIMARY' }, (v) => null),
t({ text: 'KEY' }, (v) => null),
order,
conflictClause,
autoincrement,
],
(v) => ({ primary: Object.assign({}, ...v.filter((x) => x !== null)) })
)(ctx);
}
function autoincrement(ctx) {
return o(t({ text: 'AUTOINCREMENT' }), (v) => ({
autoincrement: v !== null,
}))(ctx);
}
function notnullColumnConstraint(ctx) {
return s(
[
constraintName,
t({ text: 'NOT' }, (v) => null),
t({ text: 'NULL' }, (v) => null),
conflictClause,
],
(v) => ({ notnull: Object.assign({}, ...v.filter((x) => x !== null)) })
)(ctx);
}
function nullColumnConstraint(ctx) {
return s(
[constraintName, t({ text: 'NULL' }, (v) => null), conflictClause],
(v) => ({ null: Object.assign({}, ...v.filter((x) => x !== null)) })
)(ctx);
}
function uniqueColumnConstraint(ctx) {
return s(
[constraintName, t({ text: 'UNIQUE' }, (v) => null), conflictClause],
(v) => ({ unique: Object.assign({}, ...v.filter((x) => x !== null)) })
)(ctx);
}
function checkColumnConstraint(ctx) {
return s(
[
constraintName,
t({ text: 'CHECK' }, (v) => null),
t({ text: '(' }, (v) => null),
s([expression], (v) => ({ expression: v[0] })),
t({ text: ')' }, (v) => null),
],
(v) => ({ check: Object.assign({}, ...v.filter((x) => x !== null)) })
)(ctx);
}
function defaultColumnConstraint(ctx) {
return s(
[
constraintName,
t({ text: 'DEFAULT' }, (v) => null),
a([
s([t({ text: '(' }), expression, t({ text: ')' })], (v) => ({
value: v[1],
expression: true,
})),
s([literalValue], (v) => ({ value: v[0], expression: false })),
s([signedNumber], (v) => ({ value: v[0], expression: false })),
]),
],
(v) => ({ default: Object.assign({}, ...v.filter((x) => x !== null)) })
)(ctx);
}
function collateColumnConstraint(ctx) {
return s(
[
constraintName,
t({ text: 'COLLATE' }, (v) => null),
t({ type: 'id' }, (v) => ({ collation: v.text })),
],
(v) => ({ collate: Object.assign({}, ...v.filter((x) => x !== null)) })
)(ctx);
}
function referencesColumnConstraint(ctx) {
return s(
[constraintName, s([foreignKeyClause], (v) => v[0].references)],
(v) => ({
references: Object.assign({}, ...v.filter((x) => x !== null)),
})
)(ctx);
}
function asColumnConstraint(ctx) {
return s(
[
constraintName,
o(s([t({ text: 'GENERATED' }), t({ text: 'ALWAYS' })]), (v) => ({
generated: v !== null,
})),
t({ text: 'AS' }, (v) => null),
t({ text: '(' }, (v) => null),
s([expression], (v) => ({ expression: v[0] })),
t({ text: ')' }, (v) => null),
a([t({ text: 'STORED' }), t({ text: 'VIRTUAL' }), e], (v) => ({
mode: v ? v.toUpperCase() : null,
})),
],
(v) => ({ as: Object.assign({}, ...v.filter((x) => x !== null)) })
)(ctx);
}
function tableConstraintList(ctx) {
return o(m(s([t({ text: ',' }), tableConstraint], (v) => v[1])), (v) => ({
constraints: v || [],
}))(ctx);
}
function tableConstraint(ctx) {
return a([
primaryTableConstraint,
uniqueTableConstraint,
checkTableConstraint,
foreignTableConstraint,
])(ctx);
}
function primaryTableConstraint(ctx) {
return s(
[
constraintName,
t({ text: 'PRIMARY' }, (v) => null),
t({ text: 'KEY' }, (v) => null),
t({ text: '(' }, (v) => null),
indexedColumnList,
t({ text: ')' }, (v) => null),
conflictClause,
],
(v) =>
Object.assign({ type: 'PRIMARY KEY' }, ...v.filter((x) => x !== null))
)(ctx);
}
function uniqueTableConstraint(ctx) {
return s(
[
constraintName,
t({ text: 'UNIQUE' }, (v) => null),
t({ text: '(' }, (v) => null),
indexedColumnList,
t({ text: ')' }, (v) => null),
conflictClause,
],
(v) => Object.assign({ type: 'UNIQUE' }, ...v.filter((x) => x !== null))
)(ctx);
}
function conflictClause(ctx) {
return o(
s(
[
t({ text: 'ON' }),
t({ text: 'CONFLICT' }),
a([
t({ text: 'ROLLBACK' }),
t({ text: 'ABORT' }),
t({ text: 'FAIL' }),
t({ text: 'IGNORE' }),
t({ text: 'REPLACE' }),
]),
],
(v) => v[2]
),
(v) => ({ conflict: v ? v.toUpperCase() : null })
)(ctx);
}
function checkTableConstraint(ctx) {
return s(
[
constraintName,
t({ text: 'CHECK' }, (v) => null),
t({ text: '(' }, (v) => null),
s([expression], (v) => ({ expression: v[0] })),
t({ text: ')' }, (v) => null),
],
(v) => Object.assign({ type: 'CHECK' }, ...v.filter((x) => x !== null))
)(ctx);
}
function foreignTableConstraint(ctx) {
return s(
[
constraintName,
t({ text: 'FOREIGN' }, (v) => null),
t({ text: 'KEY' }, (v) => null),
t({ text: '(' }, (v) => null),
columnNameList,
t({ text: ')' }, (v) => null),
foreignKeyClause,
],
(v) =>
Object.assign({ type: 'FOREIGN KEY' }, ...v.filter((x) => x !== null))
)(ctx);
}
function foreignKeyClause(ctx) {
return s(
[
t({ text: 'REFERENCES' }, (v) => null),
table,
columnNameListOptional,
o(m(a([deleteReference, updateReference, matchReference])), (v) =>
Object.assign({ delete: null, update: null, match: null }, ...(v || []))
),
deferrable,
],
(v) => ({ references: Object.assign({}, ...v.filter((x) => x !== null)) })
)(ctx);
}
function columnNameListOptional(ctx) {
return o(
s([t({ text: '(' }), columnNameList, t({ text: ')' })], (v) => v[1]),
(v) => ({ columns: v ? v.columns : [] })
)(ctx);
}
function columnNameList(ctx) {
return s(
[
o(m(s([identifier, t({ text: ',' })], (v) => v[0])), (v) =>
v !== null ? v : []
),
identifier,
],
(v) => ({ columns: v[0].concat([v[1]]) })
)(ctx);
}
function deleteReference(ctx) {
return s([t({ text: 'ON' }), t({ text: 'DELETE' }), onAction], (v) => ({
delete: v[2],
}))(ctx);
}
function updateReference(ctx) {
return s([t({ text: 'ON' }), t({ text: 'UPDATE' }), onAction], (v) => ({
update: v[2],
}))(ctx);
}
function matchReference(ctx) {
return s(
[t({ text: 'MATCH' }), a([t({ type: 'keyword' }), t({ type: 'id' })])],
(v) => ({ match: v[1] })
)(ctx);
}
function deferrable(ctx) {
return o(
s([
o(t({ text: 'NOT' })),
t({ text: 'DEFERRABLE' }),
o(
s(
[
t({ text: 'INITIALLY' }),
a([t({ text: 'DEFERRED' }), t({ text: 'IMMEDIATE' })]),
],
(v) => v[1].toUpperCase()
)
),
]),
(v) => ({ deferrable: v ? { not: v[0] !== null, initially: v[2] } : null })
)(ctx);
}
function constraintName(ctx) {
return o(
s([t({ text: 'CONSTRAINT' }), identifier], (v) => v[1]),
(v) => ({ name: v })
)(ctx);
}
function createIndex(ctx) {
return s(
[
t({ text: 'CREATE' }, (v) => null),
unique,
t({ text: 'INDEX' }, (v) => null),
exists,
schema,
index,
t({ text: 'ON' }, (v) => null),
table,
t({ text: '(' }, (v) => null),
indexedColumnList,
t({ text: ')' }, (v) => null),
where,
f,
],
(v) => Object.assign({}, ...v.filter((x) => x !== null))
)(ctx);
}
function unique(ctx) {
return o(t({ text: 'UNIQUE' }), (v) => ({ unique: v !== null }))(ctx);
}
function exists(ctx) {
return o(
s([t({ text: 'IF' }), t({ text: 'NOT' }), t({ text: 'EXISTS' })]),
(v) => ({ exists: v !== null })
)(ctx);
}
function schema(ctx) {
return o(
s([identifier, t({ text: '.' })], (v) => v[0]),
(v) => ({ schema: v })
)(ctx);
}
function index(ctx) {
return s([identifier], (v) => ({ index: v[0] }))(ctx);
}
function table(ctx) {
return s([identifier], (v) => ({ table: v[0] }))(ctx);
}
function where(ctx) {
return o(
s([t({ text: 'WHERE' }), expression], (v) => v[1]),
(v) => ({ where: v })
)(ctx);
}
function indexedColumnList(ctx) {
return a([
s([indexedColumn, t({ text: ',' }), indexedColumnList], (v) => ({
columns: [v[0]].concat(v[2].columns),
})),
s([indexedColumnExpression, t({ text: ',' }), indexedColumnList], (v) => ({
columns: [v[0]].concat(v[2].columns),
})),
l({ do: indexedColumn, next: t({ text: ')' }) }, (v) => ({
columns: [v],
})),
l({ do: indexedColumnExpression, next: t({ text: ')' }) }, (v) => ({
columns: [v],
})),
])(ctx);
}
function indexedColumn(ctx) {
return s(
[
s([identifier], (v) => ({ name: v[0], expression: false })),
collation,
order,
],
(v) => Object.assign({}, ...v.filter((x) => x !== null))
)(ctx);
}
function indexedColumnExpression(ctx) {
return s(
[
s([indexedExpression], (v) => ({ name: v[0], expression: true })),
collation,
order,
],
(v) => Object.assign({}, ...v.filter((x) => x !== null))
)(ctx);
}
function collation(ctx) {
return o(
s([t({ text: 'COLLATE' }), t({ type: 'id' })], (v) => v[1]),
(v) => ({ collation: v })
)(ctx);
}
function order(ctx) {
return a([t({ text: 'ASC' }), t({ text: 'DESC' }), e], (v) => ({
order: v ? v.toUpperCase() : null,
}))(ctx);
}
function indexedExpression(ctx) {
return m(
a([
n({
do: t({ type: 'keyword' }),
not: a([
t({ text: 'COLLATE' }),
t({ text: 'ASC' }),
t({ text: 'DESC' }),
]),
}),
t({ type: 'id' }),
t({ type: 'string' }),
t({ type: 'blob' }),
t({ type: 'numeric' }),
t({ type: 'variable' }),
n({
do: t({ type: 'operator' }),
not: a([t({ text: '(' }), t({ text: ')' }), t({ text: ',' })]),
}),
s([t({ text: '(' }), o(expression), t({ text: ')' })], (v) => v[1] || []),
])
)(ctx);
}
function expression(ctx) {
return m(
a([
t({ type: 'keyword' }),
t({ type: 'id' }),
t({ type: 'string' }),
t({ type: 'blob' }),
t({ type: 'numeric' }),
t({ type: 'variable' }),
n({
do: t({ type: 'operator' }),
not: a([t({ text: '(' }), t({ text: ')' })]),
}),
s([t({ text: '(' }), o(expression), t({ text: ')' })], (v) => v[1] || []),
])
)(ctx);
}
function identifier(ctx) {
return a([t({ type: 'id' }), t({ type: 'string' })], (v) =>
/^["`['][^]*["`\]']$/.test(v) ? v.substring(1, v.length - 1) : v
)(ctx);
}
function onAction(ctx) {
return a(
[
s([t({ text: 'SET' }), t({ text: 'NULL' })], (v) => `${v[0]} ${v[1]}`),
s([t({ text: 'SET' }), t({ text: 'DEFAULT' })], (v) => `${v[0]} ${v[1]}`),
t({ text: 'CASCADE' }),
t({ text: 'RESTRICT' }),
s([t({ text: 'NO' }), t({ text: 'ACTION' })], (v) => `${v[0]} ${v[1]}`),
],
(v) => v.toUpperCase()
)(ctx);
}
function literalValue(ctx) {
return a([
t({ type: 'numeric' }),
t({ type: 'string' }),
t({ type: 'id' }),
t({ type: 'blob' }),
t({ text: 'NULL' }),
t({ text: 'TRUE' }),
t({ text: 'FALSE' }),
t({ text: 'CURRENT_TIME' }),
t({ text: 'CURRENT_DATE' }),
t({ text: 'CURRENT_TIMESTAMP' }),
])(ctx);
}
function signedNumber(ctx) {
return s(
[a([t({ text: '+' }), t({ text: '-' }), e]), t({ type: 'numeric' })],
(v) => `${v[0] || ''}${v[1]}`
)(ctx);
}
module.exports = {
parseCreateTable,
parseCreateIndex,
};
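A sketch of the parser's output shape (the require path is illustrative):

```js
const { parseCreateTable } = require('./parser'); // illustrative path

const ast = parseCreateTable(
  'CREATE TABLE "users" ("id" integer PRIMARY KEY, "email" text NOT NULL)'
);

console.log(ast.table); // 'users' (identifier() strips the quotes)
console.log(ast.columns[1].constraints.notnull); // { name: null, conflict: null }
console.log(ast.rowid); // false
```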

View File

@@ -0,0 +1,41 @@
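// Raw SQL strings used by the SQLite3 DDL rebuild procedure earlier in this diff.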
function copyData(sourceTable, targetTable, columns) {
return `INSERT INTO "${targetTable}" SELECT ${
columns === undefined
? '*'
: columns.map((column) => `"${column}"`).join(', ')
} FROM "${sourceTable}";`;
}
function dropOriginal(tableName) {
return `DROP TABLE "${tableName}"`;
}
function renameTable(tableName, alteredName) {
return `ALTER TABLE "${tableName}" RENAME TO "${alteredName}"`;
}
function getTableSql(tableName) {
return `SELECT type, sql FROM sqlite_master WHERE (type='table' OR (type='index' AND sql IS NOT NULL)) AND lower(tbl_name)='${tableName.toLowerCase()}'`;
}
function isForeignCheckEnabled() {
return `PRAGMA foreign_keys`;
}
function setForeignCheck(enable) {
return `PRAGMA foreign_keys = ${enable ? 'ON' : 'OFF'}`;
}
function executeForeignCheck() {
return `PRAGMA foreign_key_check`;
}
module.exports = {
copyData,
dropOriginal,
renameTable,
getTableSql,
isForeignCheckEnabled,
setForeignCheck,
executeForeignCheck,
};

View File

@@ -0,0 +1,38 @@
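// Generic regex tokenizer: each named rule becomes a named capture group in one
// sticky regex; rules whose name starts with "_" are matched but not emitted.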
function tokenize(text, tokens) {
const compiledRegex = new RegExp(
Object.entries(tokens)
.map(([type, regex]) => `(?<${type}>${regex.source})`)
.join('|'),
'yi'
);
let index = 0;
const ast = [];
while (index < text.length) {
compiledRegex.lastIndex = index;
const result = text.match(compiledRegex);
if (result !== null) {
const [type, text] = Object.entries(result.groups).find(
([name, group]) => group !== undefined
);
index += text.length;
if (!type.startsWith('_')) {
ast.push({ type, text });
}
} else {
throw new Error(
`No matching tokenizer rule found at: [${text.substring(index)}]`
);
}
}
return ast;
}
module.exports = {
tokenize,
};
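For instance, with a trimmed-down rule set (the full TOKENS table lives in the parser file above; `tokenize` is assumed to be in scope):

```js
const tokens = {
  keyword: /CREATE|TABLE/,
  id: /[a-z_][a-z0-9_$]*/,
  operator: /\(|\)/,
  _ws: /\s+/, // matched but skipped because the name starts with "_"
};

console.log(tokenize('CREATE TABLE foo (bar)', tokens));
// [ { type: 'keyword', text: 'CREATE' },
//   { type: 'keyword', text: 'TABLE' },
//   { type: 'id', text: 'foo' },
//   { type: 'operator', text: '(' },
//   { type: 'id', text: 'bar' },
//   { type: 'operator', text: ')' } ]
```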

View File

@@ -0,0 +1,12 @@
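// Case-insensitive identifier comparison helpers used by the DDL rebuild code.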
function isEqualId(first, second) {
return first.toLowerCase() === second.toLowerCase();
}
function includesId(list, id) {
return list.some((item) => isEqualId(item, id));
}
module.exports = {
isEqualId,
includesId,
};

View File

@@ -0,0 +1,50 @@
const ColumnCompiler = require('../../../schema/columncompiler');
// Column Compiler
// -------
class ColumnCompiler_SQLite3 extends ColumnCompiler {
constructor() {
super(...arguments);
this.modifiers = ['nullable', 'defaultTo'];
this._addCheckModifiers();
}
// Types
// -------
enu(allowed) {
return `text check (${this.formatter.wrap(
this.args[0]
)} in ('${allowed.join("', '")}'))`;
}
_pushAlterCheckQuery(checkPredicate, constraintName) {
throw new Error(
`Alter table to add constraints is not permitted in SQLite`
);
}
checkRegex(regexes, constraintName) {
return this._check(
`${this.formatter.wrap(
this.getColumnName()
)} REGEXP ${this.client._escapeBinding(regexes)}`,
constraintName
);
}
}
ColumnCompiler_SQLite3.prototype.json = 'json';
ColumnCompiler_SQLite3.prototype.jsonb = 'json';
ColumnCompiler_SQLite3.prototype.double =
ColumnCompiler_SQLite3.prototype.decimal =
ColumnCompiler_SQLite3.prototype.floating =
'float';
ColumnCompiler_SQLite3.prototype.timestamp = 'datetime';
// autoincrement without primary key is a syntax error in SQLite, so the primary key must be declared as part of the column
ColumnCompiler_SQLite3.prototype.increments =
ColumnCompiler_SQLite3.prototype.bigincrements =
'integer not null primary key autoincrement';
module.exports = ColumnCompiler_SQLite3;
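SQLite has no native enum type, so `enu` above compiles to a `text` column guarded by a CHECK constraint, and `increments` has to declare the primary key inline. A usage sketch (assuming an initialized knex instance on sqlite3; the generated column SQL is shown in the comments):

```js
await knex.schema.createTable('posts', (t) => {
  t.increments('id'); // integer not null primary key autoincrement
  t.enu('status', ['draft', 'published']); // text check (`status` in ('draft', 'published'))
});
```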

View File

@@ -0,0 +1,80 @@
// SQLite3: Column Builder & Compiler
// -------
const SchemaCompiler = require('../../../schema/compiler');
const some = require('lodash/some');
// Schema Compiler
// -------
class SchemaCompiler_SQLite3 extends SchemaCompiler {
constructor(client, builder) {
super(client, builder);
}
// Compile the query to determine if a table exists.
hasTable(tableName) {
const sql =
`select * from sqlite_master ` +
`where type = 'table' and name = ${this.client.parameter(
this.formatter.wrap(tableName).replace(/`/g, ''),
this.builder,
this.bindingsHolder
)}`;
this.pushQuery({ sql, output: (resp) => resp.length > 0 });
}
// Compile the query to determine if a column exists.
hasColumn(tableName, column) {
this.pushQuery({
sql: `PRAGMA table_info(${this.formatter.wrap(tableName)})`,
output(resp) {
return some(resp, (col) => {
return (
this.client.wrapIdentifier(col.name.toLowerCase()) ===
this.client.wrapIdentifier(column.toLowerCase())
);
});
},
});
}
// Compile a rename table command.
renameTable(from, to) {
this.pushQuery(
`alter table ${this.formatter.wrap(from)} rename to ${this.formatter.wrap(
to
)}`
);
}
async generateDdlCommands() {
const sequence = this.builder._sequence;
for (let i = 0, l = sequence.length; i < l; i++) {
const query = sequence[i];
this[query.method].apply(this, query.args);
}
const commandSources = this.sequence;
if (commandSources.length === 1 && commandSources[0].statementsProducer) {
return commandSources[0].statementsProducer();
} else {
const result = [];
for (const commandSource of commandSources) {
const command = commandSource.sql;
if (Array.isArray(command)) {
result.push(...command);
} else {
result.push(command);
}
}
return { pre: [], sql: result, check: null, post: [] };
}
}
}
module.exports = SchemaCompiler_SQLite3;

View File

@@ -0,0 +1,347 @@
const filter = require('lodash/filter');
const values = require('lodash/values');
const identity = require('lodash/identity');
const { isObject } = require('../../../util/is');
const TableCompiler = require('../../../schema/tablecompiler');
const { formatDefault } = require('../../../formatter/formatterUtils');
class TableCompiler_SQLite3 extends TableCompiler {
constructor() {
super(...arguments);
}
// Create a new table.
createQuery(columns, ifNot, like) {
const createStatement = ifNot
? 'create table if not exists '
: 'create table ';
let sql = createStatement + this.tableName();
if (like && this.tableNameLike()) {
sql += ' as select * from ' + this.tableNameLike() + ' where 0=1';
} else {
// We need to check for any primary key commands and add those columns
// to the table's declaration here so they are created together with the table.
sql += ' (' + columns.sql.join(', ');
sql += this.foreignKeys() || '';
sql += this.primaryKeys() || '';
sql += this._addChecks();
sql += ')';
}
this.pushQuery(sql);
if (like) {
this.addColumns(columns, this.addColumnsPrefix);
}
}
addColumns(columns, prefix, colCompilers) {
if (prefix === this.alterColumnsPrefix) {
const compiler = this;
const columnsInfo = colCompilers.map((col) => {
const name = this.client.customWrapIdentifier(
col.getColumnName(),
identity,
col.columnBuilder.queryContext()
);
const type = col.getColumnType();
const defaultTo = col.modified['defaultTo']
? formatDefault(col.modified['defaultTo'][0], col.type, this.client)
: null;
const notNull =
col.modified['nullable'] && col.modified['nullable'][0] === false;
return { name, type, defaultTo, notNull };
});
this.pushQuery({
sql: `PRAGMA table_info(${this.tableName()})`,
statementsProducer(pragma, connection) {
return compiler.client
.ddl(compiler, pragma, connection)
.alterColumn(columnsInfo);
},
});
} else {
for (let i = 0, l = columns.sql.length; i < l; i++) {
this.pushQuery({
sql: `alter table ${this.tableName()} add column ${columns.sql[i]}`,
bindings: columns.bindings[i],
});
}
}
}
// Compile a drop unique key command.
dropUnique(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, columns);
this.pushQuery(`drop index ${indexName}`);
}
// Compile a drop foreign key command.
dropForeign(columns, indexName) {
const compiler = this;
columns = Array.isArray(columns) ? columns : [columns];
columns = columns.map((column) =>
this.client.customWrapIdentifier(column, identity)
);
indexName = this.client.customWrapIdentifier(indexName, identity);
this.pushQuery({
sql: `PRAGMA table_info(${this.tableName()})`,
output(pragma) {
return compiler.client
.ddl(compiler, pragma, this.connection)
.dropForeign(columns, indexName);
},
});
}
// Compile a drop primary key command.
dropPrimary(constraintName) {
const compiler = this;
constraintName = this.client.customWrapIdentifier(constraintName, identity);
this.pushQuery({
sql: `PRAGMA table_info(${this.tableName()})`,
output(pragma) {
return compiler.client
.ddl(compiler, pragma, this.connection)
.dropPrimary(constraintName);
},
});
}
dropIndex(columns, indexName) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('index', this.tableNameRaw, columns);
this.pushQuery(`drop index ${indexName}`);
}
// Compile a unique key command.
unique(columns, indexName) {
let deferrable;
let predicate;
if (isObject(indexName)) {
({ indexName, deferrable, predicate } = indexName);
}
if (deferrable && deferrable !== 'not deferrable') {
this.client.logger.warn(
`sqlite3: unique index \`${indexName}\` will not be deferrable ${deferrable} because sqlite3 does not support deferred constraints.`
);
}
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('unique', this.tableNameRaw, columns);
columns = this.formatter.columnize(columns);
const predicateQuery = predicate
? ' ' + this.client.queryCompiler(predicate).where()
: '';
this.pushQuery(
`create unique index ${indexName} on ${this.tableName()} (${columns})${predicateQuery}`
);
}
// Compile a plain index key command.
index(columns, indexName, options) {
indexName = indexName
? this.formatter.wrap(indexName)
: this._indexCommand('index', this.tableNameRaw, columns);
columns = this.formatter.columnize(columns);
let predicate;
if (isObject(options)) {
({ predicate } = options);
}
const predicateQuery = predicate
? ' ' + this.client.queryCompiler(predicate).where()
: '';
this.pushQuery(
`create index ${indexName} on ${this.tableName()} (${columns})${predicateQuery}`
);
}
/**
* Add a primary key to an existing table.
*
* @NOTE The `createQuery` method above handles table creation. Don't do anything regarding table
* creation in this method
*
* @param {string | string[]} columns - Column name(s) to assign as primary keys
* @param {string} [constraintName] - Custom name for the PK constraint
*/
primary(columns, constraintName) {
const compiler = this;
columns = Array.isArray(columns) ? columns : [columns];
columns = columns.map((column) =>
this.client.customWrapIdentifier(column, identity)
);
let deferrable;
if (isObject(constraintName)) {
({ constraintName, deferrable } = constraintName);
}
if (deferrable && deferrable !== 'not deferrable') {
this.client.logger.warn(
`sqlite3: primary key constraint \`${constraintName}\` will not be deferrable ${deferrable} because sqlite3 does not support deferred constraints.`
);
}
constraintName = this.client.customWrapIdentifier(constraintName, identity);
if (this.method !== 'create' && this.method !== 'createIfNot') {
this.pushQuery({
sql: `PRAGMA table_info(${this.tableName()})`,
output(pragma) {
return compiler.client
.ddl(compiler, pragma, this.connection)
.primary(columns, constraintName);
},
});
}
}
/**
* Add a foreign key constraint to an existing table
*
* @NOTE The `createQuery` method above handles foreign key constraints on table creation. Don't do
* anything regarding table creation in this method
*
* @param {object} foreignInfo - Information about the current column foreign setup
* @param {string | string[]} [foreignInfo.column] - Column in the current constraint
* @param {string | undefined} foreignInfo.keyName - Name of the foreign key constraint
* @param {string | string[]} foreignInfo.references - What column it references in the other table
* @param {string} foreignInfo.inTable - What table is referenced in this constraint
* @param {string} [foreignInfo.onUpdate] - What to do on updates
* @param {string} [foreignInfo.onDelete] - What to do on deletions
*/
foreign(foreignInfo) {
const compiler = this;
if (this.method !== 'create' && this.method !== 'createIfNot') {
foreignInfo.column = Array.isArray(foreignInfo.column)
? foreignInfo.column
: [foreignInfo.column];
foreignInfo.column = foreignInfo.column.map((column) =>
this.client.customWrapIdentifier(column, identity)
);
foreignInfo.inTable = this.client.customWrapIdentifier(
foreignInfo.inTable,
identity
);
foreignInfo.references = Array.isArray(foreignInfo.references)
? foreignInfo.references
: [foreignInfo.references];
foreignInfo.references = foreignInfo.references.map((column) =>
this.client.customWrapIdentifier(column, identity)
);
this.pushQuery({
sql: `PRAGMA table_info(${this.tableName()})`,
statementsProducer(pragma, connection) {
return compiler.client
.ddl(compiler, pragma, connection)
.foreign(foreignInfo);
},
});
}
}
primaryKeys() {
const pks = filter(this.grouped.alterTable || [], { method: 'primary' });
if (pks.length > 0 && pks[0].args.length > 0) {
const columns = pks[0].args[0];
let constraintName = pks[0].args[1] || '';
if (constraintName) {
constraintName = ' constraint ' + this.formatter.wrap(constraintName);
}
const needUniqueCols =
this.grouped.columns.filter((t) => t.builder._type === 'increments')
.length > 0;
// SQLite doesn't support composite primary keys together with autoincrement columns (an autoincrement column is always the primary key).
// Add a unique index instead when the table has an autoincrement column (https://stackoverflow.com/a/6154876/1535159)
return `,${constraintName} ${
needUniqueCols ? 'unique' : 'primary key'
} (${this.formatter.columnize(columns)})`;
}
}
foreignKeys() {
let sql = '';
const foreignKeys = filter(this.grouped.alterTable || [], {
method: 'foreign',
});
for (let i = 0, l = foreignKeys.length; i < l; i++) {
const foreign = foreignKeys[i].args[0];
const column = this.formatter.columnize(foreign.column);
const references = this.formatter.columnize(foreign.references);
const foreignTable = this.formatter.wrap(foreign.inTable);
let constraintName = foreign.keyName || '';
if (constraintName) {
constraintName = ' constraint ' + this.formatter.wrap(constraintName);
}
sql += `,${constraintName} foreign key(${column}) references ${foreignTable}(${references})`;
if (foreign.onDelete) sql += ` on delete ${foreign.onDelete}`;
if (foreign.onUpdate) sql += ` on update ${foreign.onUpdate}`;
}
return sql;
}
createTableBlock() {
return this.getColumns().concat().join(',');
}
renameColumn(from, to) {
this.pushQuery({
sql: `alter table ${this.tableName()} rename ${this.formatter.wrap(
from
)} to ${this.formatter.wrap(to)}`,
});
}
_setNullableState(column, isNullable) {
const compiler = this;
this.pushQuery({
sql: `PRAGMA table_info(${this.tableName()})`,
statementsProducer(pragma, connection) {
return compiler.client
.ddl(compiler, pragma, connection)
.setNullable(column, isNullable);
},
});
}
dropColumn() {
const compiler = this;
const columns = values(arguments);
const columnsWrapped = columns.map((column) =>
this.client.customWrapIdentifier(column, identity)
);
this.pushQuery({
sql: `PRAGMA table_info(${this.tableName()})`,
output(pragma) {
return compiler.client
.ddl(compiler, pragma, this.connection)
.dropColumn(columnsWrapped);
},
});
}
}
module.exports = TableCompiler_SQLite3;
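Note how the alter-style operations above (`dropColumn`, `dropForeign`, `dropPrimary`, `primary`, `foreign`, `_setNullableState`) never emit the change directly: each runs `PRAGMA table_info(...)` and hands off to the DDL rebuild class from the top of this diff. From the caller's side it is still one schema call (sketch, assuming an initialized knex instance; the column name is hypothetical):

```js
// Triggers the PRAGMA lookup plus the copy-and-rebuild path under the hood.
await knex.schema.alterTable('users', (t) => {
  t.dropColumn('legacy_flag');
});
```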

View File

@@ -0,0 +1,40 @@
/* eslint max-len: 0 */
const ViewCompiler = require('../../../schema/viewcompiler.js');
const {
columnize: columnize_,
} = require('../../../formatter/wrappingFormatter');
class ViewCompiler_SQLite3 extends ViewCompiler {
constructor(client, viewCompiler) {
super(client, viewCompiler);
}
createOrReplace() {
const columns = this.columns;
const selectQuery = this.selectQuery.toString();
const viewName = this.viewName();
const columnList = columns
? ' (' +
columnize_(
columns,
this.viewBuilder,
this.client,
this.bindingsHolder
) +
')'
: '';
const dropSql = `drop view if exists ${viewName}`;
const createSql = `create view ${viewName}${columnList} as ${selectQuery}`;
this.pushQuery({
sql: dropSql,
});
this.pushQuery({
sql: createSql,
});
}
}
module.exports = ViewCompiler_SQLite3;
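Since SQLite has no `CREATE OR REPLACE VIEW`, the compiler above emulates it with a `drop view if exists` followed by a plain `create view`. A usage sketch (assuming a knex version with the view builder API):

```js
// Emits two statements:
//   drop view if exists `active_users`
//   create view `active_users` (`id`, `name`) as select ...
await knex.schema.createViewOrReplace('active_users', (view) => {
  view.columns(['id', 'name']);
  view.as(knex('users').select('id', 'name').where('active', true));
});
```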

View File

@@ -0,0 +1,51 @@
const chunk = require('lodash/chunk');
const flatten = require('lodash/flatten');
const delay = require('./internal/delay');
const { isNumber } = require('../util/is');
function batchInsert(client, tableName, batch, chunkSize = 1000) {
let returning = undefined;
let transaction = null;
if (!isNumber(chunkSize) || chunkSize < 1) {
throw new TypeError(`Invalid chunkSize: ${chunkSize}`);
}
if (!Array.isArray(batch)) {
throw new TypeError(`Invalid batch: Expected array, got ${typeof batch}`);
}
const chunks = chunk(batch, chunkSize);
const runInTransaction = (cb) => {
if (transaction) {
return cb(transaction);
}
return client.transaction(cb);
};
return Object.assign(
Promise.resolve().then(async () => {
// Next tick to ensure wrapper functions are called if needed
await delay(1);
return runInTransaction(async (tr) => {
const chunksResults = [];
for (const items of chunks) {
chunksResults.push(await tr(tableName).insert(items, returning));
}
return flatten(chunksResults);
});
}),
{
returning(columns) {
returning = columns;
return this;
},
transacting(tr) {
transaction = tr;
return this;
},
}
);
}
module.exports = batchInsert;
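The helper is exposed on knex instances as `knex.batchInsert`. Because the actual insert work is deferred by one tick, `returning` and `transacting` can still be chained before awaiting (sketch, assuming an initialized knex instance):

```js
const rows = Array.from({ length: 5000 }, (_, i) => ({ n: i }));

// 5 chunks of 1,000 rows, all inside a single transaction.
await knex.batchInsert('numbers', rows, 1000);

// Chaining works because the chunked insert only starts on the next tick.
await knex.batchInsert('numbers', rows, 1000).returning('n'); // where RETURNING is supported
```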

View File

@@ -0,0 +1,6 @@
/**
* @param {number} delay
* @returns {Promise<void>}
*/
module.exports = (delay) =>
new Promise((resolve) => setTimeout(resolve, delay));

View File

@@ -0,0 +1,41 @@
function ensureConnectionCallback(runner) {
runner.client.emit('start', runner.builder);
runner.builder.emit('start', runner.builder);
const sql = runner.builder.toSQL();
if (runner.builder._debug) {
runner.client.logger.debug(sql);
}
if (Array.isArray(sql)) {
return runner.queryArray(sql);
}
return runner.query(sql);
}
function ensureConnectionStreamCallback(runner, params) {
try {
const sql = runner.builder.toSQL();
if (Array.isArray(sql) && params.hasHandler) {
throw new Error(
'The stream may only be used with a single query statement.'
);
}
return runner.client.stream(
runner.connection,
sql,
params.stream,
params.options
);
} catch (e) {
params.stream.emit('error', e);
throw e;
}
}
module.exports = {
ensureConnectionCallback,
ensureConnectionStreamCallback,
};

View File

@@ -0,0 +1,62 @@
const _debugQuery = require('debug')('knex:query');
const debugBindings = require('debug')('knex:bindings');
const debugQuery = (sql, txId) => _debugQuery(sql.replace(/%/g, '%%'), txId);
const { isString } = require('../../util/is');
function formatQuery(sql, bindings, timeZone, client) {
bindings = bindings == null ? [] : [].concat(bindings);
let index = 0;
return sql.replace(/\\?\?/g, (match) => {
if (match === '\\?') {
return '?';
}
if (index === bindings.length) {
return match;
}
const value = bindings[index++];
return client._escapeBinding(value, { timeZone });
});
}
function enrichQueryObject(connection, queryParam, client) {
const queryObject = isString(queryParam) ? { sql: queryParam } : queryParam;
queryObject.bindings = client.prepBindings(queryObject.bindings);
queryObject.sql = client.positionBindings(queryObject.sql);
const { __knexUid, __knexTxId } = connection;
client.emit('query', Object.assign({ __knexUid, __knexTxId }, queryObject));
debugQuery(queryObject.sql, __knexTxId);
debugBindings(queryObject.bindings, __knexTxId);
return queryObject;
}
function executeQuery(connection, queryObject, client) {
return client._query(connection, queryObject).catch((err) => {
if (client.config && client.config.compileSqlOnError === false) {
err.message = queryObject.sql + ' - ' + err.message;
} else {
err.message =
formatQuery(queryObject.sql, queryObject.bindings, undefined, client) +
' - ' +
err.message;
}
client.emit(
'query-error',
err,
Object.assign(
{ __knexUid: connection.__knexUid, __knexTxId: connection.__knexTxId },
queryObject
)
);
throw err;
});
}
module.exports = {
enrichQueryObject,
executeQuery,
formatQuery,
};
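`formatQuery` is what inlines bindings into the SQL attached to error messages; a `\?` sequence escapes a literal question mark. A sketch with a stub client, where `_escapeBinding` stands in for the real dialect escaper:

```js
const stubClient = { _escapeBinding: (v) => `'${v}'` };

console.log(
  formatQuery('select * from t where a = ? and b = \\?', ['x'], undefined, stubClient)
);
// => select * from t where a = 'x' and b = ?
```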

View File

@@ -0,0 +1,325 @@
const { KnexTimeoutError } = require('../util/timeout');
const { timeout } = require('../util/timeout');
const {
ensureConnectionCallback,
ensureConnectionStreamCallback,
} = require('./internal/ensure-connection-callback');
let Transform;
// The "Runner" constructor takes a "builder" (query, schema, or raw)
// and runs through each of the query statements, calling any additional
// "output" method provided alongside the query and bindings.
class Runner {
constructor(client, builder) {
this.client = client;
this.builder = builder;
this.queries = [];
// The "connection" object is set on the runner when
// "run" is called.
this.connection = undefined;
}
// "Run" the target, calling "toSQL" on the builder, returning
// an object or array of queries to run, each of which are run on
// a single connection.
async run() {
const runner = this;
try {
const res = await this.ensureConnection(ensureConnectionCallback);
// Fire a single "end" event on the builder when
// all queries have successfully completed.
runner.builder.emit('end');
return res;
// If there are any "error" listeners, we fire an error event
// and then re-throw the error to be eventually handled by
// the promise chain. Useful if you're wrapping in a custom `Promise`.
} catch (err) {
if (runner.builder._events && runner.builder._events.error) {
runner.builder.emit('error', err);
}
throw err;
}
}
// Stream the result set, by passing through to the dialect's streaming
// capabilities. If the first argument is a function, it is treated as the
// stream handler and default options are used.
stream(optionsOrHandler, handlerOrNil) {
const firstOptionIsHandler =
typeof optionsOrHandler === 'function' && arguments.length === 1;
const options = firstOptionIsHandler ? {} : optionsOrHandler;
const handler = firstOptionIsHandler ? optionsOrHandler : handlerOrNil;
// Determines whether we emit an error or throw here.
const hasHandler = typeof handler === 'function';
// Lazy-load the "Transform" dependency.
Transform = Transform || require('stream').Transform;
const queryContext = this.builder.queryContext();
const stream = new Transform({
objectMode: true,
transform: (chunk, _, callback) => {
callback(null, this.client.postProcessResponse(chunk, queryContext));
},
});
stream.on('close', () => {
this.client.releaseConnection(this.connection);
});
// If the stream is manually destroyed, the close event is not
// propagated to the top of the pipe chain. We need to manually verify
// that the source stream is closed and if not, manually destroy it.
stream.on('pipe', (sourceStream) => {
const cleanSourceStream = () => {
if (!sourceStream.closed) {
sourceStream.destroy();
}
};
// Stream already closed, cleanup immediately
if (stream.closed) {
cleanSourceStream();
} else {
stream.on('close', cleanSourceStream);
}
});
const connectionAcquirePromise = this.ensureConnection(
ensureConnectionStreamCallback,
{
options,
hasHandler,
stream,
}
)
// Emit errors on the stream if the error occurred before a connection
// could be acquired.
// If the connection was acquired, assume the error occurred in the client
// code and has already been emitted on the stream. Don't emit it twice.
.catch((err) => {
if (!this.connection) {
stream.emit('error', err);
}
});
// If a function is passed to handle the stream, send the stream
// there and return the promise, otherwise just return the stream
// and the promise will take care of itself.
if (hasHandler) {
handler(stream);
return connectionAcquirePromise;
}
return stream;
}
// Allow you to pipe the stream to a writable stream.
pipe(writable, options) {
return this.stream(options).pipe(writable);
}
// "Runs" a query, returning a promise. All queries specified by the builder are guaranteed
// to run in sequence, and on the same connection, especially helpful when schema building
// and dealing with foreign key constraints, etc.
async query(obj) {
const { __knexUid, __knexTxId } = this.connection;
this.builder.emit('query', Object.assign({ __knexUid, __knexTxId }, obj));
const runner = this;
const queryContext = this.builder.queryContext();
// query-error events are emitted before the queryPromise continuations.
// pass queryContext into client.query so it can be raised properly.
if (obj !== null && typeof obj === 'object') {
obj.queryContext = queryContext;
}
let queryPromise = this.client.query(this.connection, obj);
if (obj.timeout) {
queryPromise = timeout(queryPromise, obj.timeout);
}
// Await the return value of client.processResponse; in the case of sqlite3's
// dropColumn()/renameColumn(), it will be a Promise for the transaction
// containing the complete rename procedure.
return queryPromise
.then((resp) => this.client.processResponse(resp, runner))
.then((processedResponse) => {
const postProcessedResponse = this.client.postProcessResponse(
processedResponse,
queryContext
);
this.builder.emit(
'query-response',
postProcessedResponse,
Object.assign({ __knexUid, __knexTxId }, obj),
this.builder
);
this.client.emit(
'query-response',
postProcessedResponse,
Object.assign({ __knexUid, __knexTxId }, obj),
this.builder
);
return postProcessedResponse;
})
.catch((error) => {
if (!(error instanceof KnexTimeoutError)) {
return Promise.reject(error);
}
const { timeout, sql, bindings } = obj;
let cancelQuery;
if (obj.cancelOnTimeout) {
cancelQuery = this.client.cancelQuery(this.connection);
} else {
// If we don't cancel the query, we need to mark the connection as disposed so that
// it gets destroyed by the pool and is never used again. If we don't do this and
// return the connection to the pool, it will be useless until the current
// operation that timed out finally finishes.
this.connection.__knex__disposed = error;
cancelQuery = Promise.resolve();
}
return cancelQuery
.catch((cancelError) => {
// If the cancellation failed, we need to mark the connection as disposed so that
// it gets destroyed by the pool and is never used again. If we don't do this and
// return the connection to the pool, it will be useless until the current
// operation that timed out finally finishes.
this.connection.__knex__disposed = error;
// cancellation failed
throw Object.assign(cancelError, {
message: `After query timeout of ${timeout}ms exceeded, cancelling of query failed.`,
sql,
bindings,
timeout,
});
})
.then(() => {
// cancellation succeeded, rethrow timeout error
throw Object.assign(error, {
message: `Defined query timeout of ${timeout}ms exceeded when running query.`,
sql,
bindings,
timeout,
});
});
})
.catch((error) => {
this.builder.emit(
'query-error',
error,
Object.assign({ __knexUid, __knexTxId, queryContext }, obj)
);
throw error;
});
}
// In the case of the "schema builder" we call `queryArray`, which runs each
// of the queries in sequence.
async queryArray(queries) {
if (queries.length === 1) {
const query = queries[0];
if (!query.statementsProducer) {
return this.query(query);
}
const statements = await query.statementsProducer(
undefined,
this.connection
);
const sqlQueryObjects = statements.sql.map((statement) => ({
sql: statement,
bindings: query.bindings,
}));
const preQueryObjects = statements.pre.map((statement) => ({
sql: statement,
bindings: query.bindings,
}));
const postQueryObjects = statements.post.map((statement) => ({
sql: statement,
bindings: query.bindings,
}));
let results = [];
await this.queryArray(preQueryObjects);
try {
await this.client.transaction(
async (trx) => {
const transactionRunner = new Runner(trx.client, this.builder);
transactionRunner.connection = this.connection;
results = await transactionRunner.queryArray(sqlQueryObjects);
if (statements.check) {
const foreignViolations = await trx.raw(statements.check);
if (foreignViolations.length > 0) {
throw new Error('FOREIGN KEY constraint failed');
}
}
},
{ connection: this.connection }
);
} finally {
await this.queryArray(postQueryObjects);
}
return results;
}
const results = [];
for (const query of queries) {
results.push(await this.queryArray([query]));
}
return results;
}
// Check whether there's a transaction flag, and that it has a connection.
async ensureConnection(cb, cbParams) {
// Use override from a builder if passed
if (this.builder._connection) {
this.connection = this.builder._connection;
}
if (this.connection) {
return cb(this, cbParams);
}
let acquiredConnection;
try {
acquiredConnection = await this.client.acquireConnection();
} catch (error) {
if (!(error instanceof KnexTimeoutError)) {
return Promise.reject(error);
}
if (this.builder) {
error.sql = this.builder.sql;
error.bindings = this.builder.bindings;
}
throw error;
}
try {
this.connection = acquiredConnection;
return await cb(this, cbParams);
} finally {
await this.client.releaseConnection(acquiredConnection);
}
}
}
module.exports = Runner;
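For reference, the two `stream` calling conventions handled above look like this from user code (sketch, assuming a knex instance and a writable stream):

```js
// 1) No handler: get the stream back and pipe it yourself.
knex('users').stream().pipe(writable);

// 2) With a handler: the returned promise settles when streaming finishes.
await knex('users').stream((stream) => {
  stream.pipe(writable);
});
```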

View File

@@ -0,0 +1,413 @@
// Transaction
// -------
const { EventEmitter } = require('events');
const Debug = require('debug');
const uniqueId = require('lodash/uniqueId');
const { callbackify } = require('util');
const makeKnex = require('../knex-builder/make-knex');
const { timeout, KnexTimeoutError } = require('../util/timeout');
const finallyMixin = require('../util/finally-mixin');
const debug = Debug('knex:tx');
// FYI: This is defined as a function instead of a constant so that
// each Transactor can have its own copy of the default config.
// This will minimize the impact of bugs that might be introduced
// if a Transactor ever mutates its config.
function DEFAULT_CONFIG() {
return {
userParams: {},
doNotRejectOnRollback: true,
};
}
// These aren't supported in sqlite3, which is serialized already and therefore
// about as safe as reasonable, except for a special read_uncommitted pragma
const validIsolationLevels = [
// Doesn't really work in postgres: it is treated as read committed
'read uncommitted',
'read committed',
'snapshot',
// snapshot and repeatable read are basically the same; most "repeatable
// read" implementations are actually "snapshot", also known as Multi-Version
// Concurrency Control (MVCC). MSSQL's repeatable read doesn't stop repeated
// reads from seeing new inserts, as it uses a pessimistic locking system, so
// you should probably use 'snapshot' to avoid read skew.
'repeatable read',
// mysql claims to support serializable, but it is not truly serializable
'serializable',
];
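Callers opt into one of these levels through the transaction config; `setIsolationLevel` below rejects anything outside this list (sketch, assuming a knex instance on a dialect that honors isolation levels):

```js
await knex.transaction(
  async (trx) => {
    await trx('accounts').select('*');
  },
  { isolationLevel: 'repeatable read' }
);
```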
// Acts as a facade for a Promise, keeping the internal state
// and managing any child transactions.
class Transaction extends EventEmitter {
constructor(client, container, config = DEFAULT_CONFIG(), outerTx = null) {
super();
this.userParams = config.userParams;
this.doNotRejectOnRollback = config.doNotRejectOnRollback;
const txid = (this.txid = uniqueId('trx'));
this.client = client;
this.logger = client.logger;
this.outerTx = outerTx;
this.trxClient = undefined;
this._completed = false;
this._debug = client.config && client.config.debug;
this.readOnly = config.readOnly;
if (config.isolationLevel) {
this.setIsolationLevel(config.isolationLevel);
}
debug(
'%s: Starting %s transaction',
txid,
outerTx ? 'nested' : 'top level'
);
// `this` can potentially serve as an `outerTx` for another
// Transaction. So, go ahead and establish `_lastChild` now.
this._lastChild = Promise.resolve();
const _previousSibling = outerTx ? outerTx._lastChild : Promise.resolve();
// FYI: As you will see in a moment, this Promise will be used to construct
// 2 separate Promise Chains. This ensures that each Promise Chain
// can establish its error-handling semantics without interfering
// with the other Promise Chain.
const basePromise = _previousSibling.then(() =>
this._evaluateContainer(config, container)
);
// FYI: This is the Promise Chain for EXTERNAL use. It ensures that the
// caller must handle any exceptions that result from `basePromise`.
this._promise = basePromise.then((x) => x);
if (outerTx) {
// FYI: This is the Promise Chain for INTERNAL use. It serves as a signal
// for when the next sibling should begin its execution. Therefore,
// exceptions are caught and ignored.
outerTx._lastChild = basePromise.catch(() => {});
}
}
isCompleted() {
return (
this._completed || (this.outerTx && this.outerTx.isCompleted()) || false
);
}
begin(conn) {
const trxMode = [
this.isolationLevel ? `ISOLATION LEVEL ${this.isolationLevel}` : '',
this.readOnly ? 'READ ONLY' : '',
]
.join(' ')
.trim();
if (trxMode.length === 0) {
return this.query(conn, 'BEGIN;');
}
return this.query(conn, `SET TRANSACTION ${trxMode};`).then(() =>
this.query(conn, 'BEGIN;')
);
}
savepoint(conn) {
return this.query(conn, `SAVEPOINT ${this.txid};`);
}
commit(conn, value) {
return this.query(conn, 'COMMIT;', 1, value);
}
release(conn, value) {
return this.query(conn, `RELEASE SAVEPOINT ${this.txid};`, 1, value);
}
setIsolationLevel(isolationLevel) {
if (!validIsolationLevels.includes(isolationLevel)) {
throw new Error(
`Invalid isolationLevel, supported isolation levels are: ${JSON.stringify(
validIsolationLevels
)}`
);
}
this.isolationLevel = isolationLevel;
return this;
}
rollback(conn, error) {
return timeout(this.query(conn, 'ROLLBACK', 2, error), 5000).catch(
(err) => {
if (!(err instanceof KnexTimeoutError)) {
return Promise.reject(err);
}
this._rejecter(error);
}
);
}
rollbackTo(conn, error) {
return timeout(
this.query(conn, `ROLLBACK TO SAVEPOINT ${this.txid}`, 2, error),
5000
).catch((err) => {
if (!(err instanceof KnexTimeoutError)) {
return Promise.reject(err);
}
this._rejecter(error);
});
}
query(conn, sql, status, value) {
const q = this.trxClient
.query(conn, sql)
.catch((err) => {
status = 2;
value = err;
this._completed = true;
debug('%s error running transaction query', this.txid);
})
.then((res) => {
if (status === 1) {
this._resolver(value);
}
if (status === 2) {
if (value === undefined) {
if (this.doNotRejectOnRollback && /^ROLLBACK\b/i.test(sql)) {
this._resolver();
return;
}
value = new Error(`Transaction rejected with non-error: ${value}`);
}
this._rejecter(value);
}
return res;
});
if (status === 1 || status === 2) {
this._completed = true;
}
return q;
}
debug(enabled) {
this._debug = arguments.length ? enabled : true;
return this;
}
async _evaluateContainer(config, container) {
return this.acquireConnection(config, (connection) => {
const trxClient = (this.trxClient = makeTxClient(
this,
this.client,
connection
));
const init = this.client.transacting
? this.savepoint(connection)
: this.begin(connection);
const executionPromise = new Promise((resolver, rejecter) => {
this._resolver = resolver;
this._rejecter = rejecter;
});
init
.then(() => {
return makeTransactor(this, connection, trxClient);
})
.then((transactor) => {
this.transactor = transactor;
if (this.outerTx) {
transactor.parentTransaction = this.outerTx.transactor;
}
transactor.executionPromise = executionPromise;
// If the transaction container returned a "thenable", assume the commit and
// rollback are chained to its success / failure. Directly thrown errors
// are treated as automatic rollbacks.
let result;
try {
result = container(transactor);
} catch (err) {
result = Promise.reject(err);
}
if (result && result.then && typeof result.then === 'function') {
result
.then((val) => {
return transactor.commit(val);
})
.catch((err) => {
return transactor.rollback(err);
});
}
return null;
})
.catch((e) => {
return this._rejecter(e);
});
return executionPromise;
});
}
// Acquire a connection - either the one passed via config or one from the
// client pool - run the callback with it, and release it afterwards unless
// it was provided externally.
async acquireConnection(config, cb) {
const configConnection = config && config.connection;
const connection =
configConnection || (await this.client.acquireConnection());
try {
connection.__knexTxId = this.txid;
return await cb(connection);
} finally {
if (!configConnection) {
debug('%s: releasing connection', this.txid);
this.client.releaseConnection(connection);
} else {
debug('%s: not releasing external connection', this.txid);
}
}
}
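// A sketch of supplying an external connection (assuming the config object is
// forwarded unchanged from `knex.transaction`); it is not released here, so
// the caller stays responsible for it:
//
//   const conn = await knex.client.acquireConnection();
//   try {
//     await knex.transaction((trx) => trx.select(trx.raw(1)), {
//       connection: conn,
//     });
//   } finally {
//     knex.client.releaseConnection(conn);
//   }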
then(onResolve, onReject) {
return this._promise.then(onResolve, onReject);
}
catch(...args) {
return this._promise.catch(...args);
}
asCallback(cb) {
callbackify(() => this._promise)(cb);
return this._promise;
}
}
finallyMixin(Transaction.prototype);
// The transactor is a full-featured knex object, with a "commit", a "rollback"
// and a "savepoint" function. The "savepoint" is just sugar for creating a new
// transaction. If the rollback is run inside a savepoint, it rolls back to the
// last savepoint - otherwise it rolls back the transaction.
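// A sketch of those semantics (assuming a configured `knex` instance and a
// hypothetical `accounts` table):
//
//   await knex.transaction(async (trx) => {
//     await trx('accounts').insert({ balance: 0 });
//     // nested call: the inner Transaction sees client.transacting === true,
//     // so it uses SAVEPOINT / RELEASE SAVEPOINT instead of BEGIN / COMMIT
//     await trx.transaction(async (inner) => {
//       await inner('accounts').update({ balance: 1 });
//     });
//   }); // container resolves -> COMMIT; container rejects -> ROLLBACK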
function makeTransactor(trx, connection, trxClient) {
const transactor = makeKnex(trxClient);
transactor.context.withUserParams = () => {
throw new Error(
'Cannot set user params on a transaction - it can only inherit params from main knex instance'
);
};
transactor.isTransaction = true;
transactor.userParams = trx.userParams || {};
transactor.context.transaction = function (container, options) {
if (!options) {
options = { doNotRejectOnRollback: true };
} else if (options.doNotRejectOnRollback === undefined) {
options.doNotRejectOnRollback = true;
}
return this._transaction(container, options, trx);
};
transactor.savepoint = function (container, options) {
return transactor.transaction(container, options);
};
if (trx.client.transacting) {
transactor.commit = (value) => trx.release(connection, value);
transactor.rollback = (error) => trx.rollbackTo(connection, error);
} else {
transactor.commit = (value) => trx.commit(connection, value);
transactor.rollback = (error) => trx.rollback(connection, error);
}
transactor.isCompleted = () => trx.isCompleted();
return transactor;
}
// We need to make a client object which always acquires the same
// connection and never releases it back into the pool.
function makeTxClient(trx, client, connection) {
const trxClient = Object.create(client.constructor.prototype);
trxClient.version = client.version;
trxClient.config = client.config;
trxClient.driver = client.driver;
trxClient.connectionSettings = client.connectionSettings;
trxClient.transacting = true;
trxClient.valueForUndefined = client.valueForUndefined;
trxClient.logger = client.logger;
trxClient.on('start', function (arg) {
trx.emit('start', arg);
client.emit('start', arg);
});
trxClient.on('query', function (arg) {
trx.emit('query', arg);
client.emit('query', arg);
});
trxClient.on('query-error', function (err, obj) {
trx.emit('query-error', err, obj);
client.emit('query-error', err, obj);
});
trxClient.on('query-response', function (response, obj, builder) {
trx.emit('query-response', response, obj, builder);
client.emit('query-response', response, obj, builder);
});
const _query = trxClient.query;
trxClient.query = function (conn, obj) {
const completed = trx.isCompleted();
return new Promise(function (resolve, reject) {
try {
if (conn !== connection)
throw new Error('Invalid connection for transaction query.');
if (completed) completedError(trx, obj);
resolve(_query.call(trxClient, conn, obj));
} catch (e) {
reject(e);
}
});
};
const _stream = trxClient.stream;
trxClient.stream = function (conn, obj, stream, options) {
const completed = trx.isCompleted();
return new Promise(function (resolve, reject) {
try {
if (conn !== connection)
throw new Error('Invalid connection for transaction query.');
if (completed) completedError(trx, obj);
resolve(_stream.call(trxClient, conn, obj, stream, options));
} catch (e) {
reject(e);
}
});
};
trxClient.acquireConnection = function () {
return Promise.resolve(connection);
};
trxClient.releaseConnection = function () {
return Promise.resolve();
};
return trxClient;
}
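// Since query events are re-emitted on both the transaction and the root
// client, either can be observed; a sketch, assuming the event payload
// carries the compiled `sql` string:
//
//   knex.on('query', (data) => console.log('sql:', data.sql));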
function completedError(trx, obj) {
const sql = typeof obj === 'string' ? obj : obj && obj.sql;
debug('%s: Transaction completed: %s', trx.txid, sql);
throw new Error(
'Transaction query already complete, run with DEBUG=knex:tx for more info'
);
}
module.exports = Transaction;

25
backend/apis/nodejs/node_modules/knex/lib/formatter.js generated vendored Normal file

@@ -0,0 +1,25 @@
const {
columnize: columnize_,
wrap: wrap_,
} = require('./formatter/wrappingFormatter');
class Formatter {
constructor(client, builder) {
this.client = client;
this.builder = builder;
this.bindings = [];
}
// Accepts a string or array of columns to wrap as appropriate.
columnize(target) {
return columnize_(target, this.builder, this.client, this);
}
// Puts the appropriate wrapper around a value depending on the database
// engine, unless it's a knex.raw value, in which case it's left alone.
wrap(value, isParameter) {
return wrap_(value, isParameter, this.builder, this.client, this);
}
}
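// For example (identifier quoting varies by dialect; ANSI double quotes shown):
//
//   formatter.columnize(['id', 'users.name']); // -> '"id", "users"."name"'
//   formatter.wrap('total'); // -> '"total"'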
module.exports = Formatter;


@@ -0,0 +1,42 @@
const { isObject } = require('../util/is');
// Compiles a callback using the query builder.
function compileCallback(callback, method, client, bindingsHolder) {
// Run the callback against a fresh query builder
const builder = client.queryBuilder();
callback.call(builder, builder);
// Compile the callback, reusing the caller's bindings holder (to track all bindings).
const compiler = client.queryCompiler(builder, bindingsHolder.bindings);
// Return the compiled & parameterized sql.
return compiler.toSQL(method || builder._method || 'select');
}
function wrapAsIdentifier(value, builder, client) {
const queryContext = builder.queryContext();
return client.wrapIdentifier((value || '').trim(), queryContext);
}
function formatDefault(value, type, client) {
if (value === void 0) {
return '';
} else if (value === null) {
return 'null';
} else if (value && value.isRawInstance) {
return value.toQuery();
} else if (type === 'bool') {
if (value === 'false') value = 0;
return `'${value ? 1 : 0}'`;
} else if ((type === 'json' || type === 'jsonb') && isObject(value)) {
return `'${JSON.stringify(value)}'`;
} else {
return client._escapeBinding(value.toString());
}
}
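// Sample outputs, following the branches above:
//   formatDefault(undefined, 'text', client) // -> ''
//   formatDefault(null, 'text', client)      // -> 'null'
//   formatDefault(true, 'bool', client)      // -> "'1'"
//   formatDefault({ a: 1 }, 'json', client)  // -> '{"a":1}' (single-quoted)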
module.exports = {
compileCallback,
wrapAsIdentifier,
formatDefault,
};

Some files were not shown because too many files have changed in this diff