merge: sync dev into 216-homepage-structure
Some checks failed
PR Fast Feedback / fast-feedback (pull_request) Failing after 47s
This commit is contained in: commit c43b3a88c9

8 .github/agents/copilot-instructions.md (vendored)
@@ -212,7 +212,9 @@ ## Active Technologies

- Static filesystem content, styles, assets, and content collections under `apps/website/src` and `apps/website/public`; no database (214-website-visual-foundation)
- Markdown governance artifacts, JSON Schema plus logical OpenAPI planning contracts, and Bash-backed SpecKit scripts inside a PHP 8.4.15 / Laravel 12 / Filament v5 / Livewire v4 repository + `.specify/memory/constitution.md`, `.specify/templates/spec-template.md`, `.specify/templates/plan-template.md`, `.specify/templates/tasks-template.md`, `.specify/templates/checklist-template.md`, `.specify/README.md`, `docs/ui/operator-ux-surface-standards.md`, and Specs 196 through 200 (201-enforcement-review-guardrails)
- Repository-owned markdown and contract artifacts under `.specify/` and `/Users/ahmeddarrazi/Documents/projects/wt-plattform/specs/201-enforcement-review-guardrails/`; no product database persistence (201-enforcement-review-guardrails)
- PHP 8.4.15, Laravel 12, Filament v5, Livewire v4, Blade + Filament v5, Livewire v4, Pest v4, Laravel Sail, `ArtifactTruthPresenter`, `ArtifactTruthEnvelope`, `OperatorExplanationBuilder`, `BaselineSnapshotPresenter`, `BadgeCatalog`, `BadgeRenderer`, existing governance Filament resources/pages, and current Enterprise Detail builders (214-governance-outcome-compression)
- PostgreSQL via existing `baseline_snapshots`, `evidence_snapshots`, `evidence_snapshot_items`, `tenant_reviews`, `review_packs`, and `operation_runs` tables; no schema change planned (214-governance-outcome-compression)
- Astro 6.0.0 templates + TypeScript 5.9 strict + Astro 6, Tailwind CSS v4 via `@tailwindcss/vite`, Astro content collections, local Astro layout/primitive/content helpers, Playwright smoke tests (215-website-core-pages)
- Static filesystem pages, content modules, and Astro content collections under `apps/website/src` and `apps/website/public`; no database (215-website-core-pages)
- Astro 6.0.0 templates + TypeScript 5.9.x + Astro 6, Tailwind CSS v4, local Astro layout/section primitives, Astro content collections, Playwright browser smoke tests (216-homepage-structure)
- Static filesystem content, Astro content collections, and assets under `apps/website/src` and `apps/website/public`; no database (216-homepage-structure)
@@ -251,7 +253,9 @@ ## Code Style

## Recent Changes

- 216-homepage-structure: Added Astro 6.0.0 templates + TypeScript 5.9.x + Astro 6, Tailwind CSS v4, local Astro layout/section primitives, Astro content collections, Playwright browser smoke tests
- 215-website-core-pages: Added Astro 6.0.0 templates + TypeScript 5.9 strict + Astro 6, Tailwind CSS v4 via `@tailwindcss/vite`, Astro content collections, local Astro layout/primitive/content helpers, Playwright smoke tests
- 214-governance-outcome-compression: Added PHP 8.4.15, Laravel 12, Filament v5, Livewire v4, Blade + Filament v5, Livewire v4, Pest v4, Laravel Sail, `ArtifactTruthPresenter`, `ArtifactTruthEnvelope`, `OperatorExplanationBuilder`, `BaselineSnapshotPresenter`, `BadgeCatalog`, `BadgeRenderer`, existing governance Filament resources/pages, and current Enterprise Detail builders
- 213-website-foundation-v0: Added Astro 6.0.0 templates + TypeScript 5.x (explicit setup in `apps/website`) + Astro 6, Tailwind CSS v4, custom Astro component primitives (shadcn-inspired), lightweight Playwright browser smoke tests
- 214-website-visual-foundation: Added Astro 6.0.0 templates + TypeScript 5.9 strict + Astro 6, Tailwind CSS v4 via `@tailwindcss/vite`, Astro content collections, local Astro component primitives, Playwright browser smoke tests

<!-- MANUAL ADDITIONS START -->
<!-- MANUAL ADDITIONS END -->
File diff suppressed because one or more lines are too long
@@ -0,0 +1,147 @@
# Prefixing

## Prefix Styles

concurrently will by default prefix each command's outputs with a zero-based index, wrapped in square brackets:

```bash
$ concurrently 'echo Hello there' "echo 'General Kenobi!'"
[0] Hello there
[1] General Kenobi!
[0] echo Hello there exited with code 0
[1] echo 'General Kenobi!' exited with code 0
```

If you've given the commands names, they are used instead:

```bash
$ concurrently --names one,two 'echo Hello there' "echo 'General Kenobi!'"
[one] Hello there
[two] General Kenobi!
[one] echo Hello there exited with code 0
[two] echo 'General Kenobi!' exited with code 0
```

There are other prefix styles available too:

| Style     | Description                       |
| --------- | --------------------------------- |
| `index`   | Zero-based command's index        |
| `name`    | The command's name                |
| `command` | The command's line                |
| `time`    | Time of output                    |
| `pid`     | ID of the command's process (PID) |
| `none`    | No prefix                         |

Any of these can be used by setting the `--prefix`/`-p` flag. For example:

```bash
$ concurrently --prefix pid 'echo Hello there' 'echo General Kenobi!'
[2222] Hello there
[2223] General Kenobi!
[2222] echo Hello there exited with code 0
[2223] echo 'General Kenobi!' exited with code 0
```

It's also possible to have a prefix based on a template. Any of the styles listed above can be used by wrapping it in `{}`.
Doing so will also remove the square brackets:

```bash
$ concurrently --prefix '{index}-{pid}' 'echo Hello there' 'echo General Kenobi!'
0-2222 Hello there
1-2223 General Kenobi!
0-2222 echo Hello there exited with code 0
1-2223 echo 'General Kenobi!' exited with code 0
```

## Prefix Colors

By default, there are no colors applied to concurrently prefixes, and they just use whatever the terminal's defaults are.

This can be changed by using the `--prefix-colors`/`-c` flag, which takes a comma-separated list of colors to use.<br/>
The available values are color names (e.g. `green`, `magenta`, `gray`, etc), a hex value (such as `#23de43`), or `auto`, to automatically select a color.

```bash
$ concurrently -c red,blue 'echo Hello there' 'echo General Kenobi!'
```

<details>
<summary>List of available color names</summary>

- `black`
- `blue`
- `cyan`
- `green`
- `gray`
- `magenta`
- `red`
- `white`
- `yellow`

</details>

Colors can take modifiers too. Several can be applied at once by prepending `.<modifier 1>.<modifier 2>` and so on.

```bash
$ concurrently -c red,bold.blue.dim 'echo Hello there' 'echo General Kenobi!'
```

<details>
<summary>List of available modifiers</summary>

- `reset`
- `bold`
- `dim`
- `hidden`
- `inverse`
- `italic`
- `strikethrough`
- `underline`

</details>

A background color can be set in a similar fashion.

```bash
$ concurrently -c bgGray,red.bgBlack 'echo Hello there' 'echo General Kenobi!'
```

<details>
<summary>List of available background color names</summary>

- `bgBlack`
- `bgBlue`
- `bgCyan`
- `bgGreen`
- `bgGray`
- `bgMagenta`
- `bgRed`
- `bgWhite`
- `bgYellow`

</details>

## Prefix Length

When using the `command` prefix style, it's possible that it'll be too long.<br/>
It can be limited by setting the `--prefix-length`/`-l` flag:

```bash
$ concurrently -p command -l 10 'echo Hello there' 'echo General Kenobi!'
[echo..here] Hello there
[echo..bi!'] General Kenobi!
[echo..here] echo Hello there exited with code 0
[echo..bi!'] echo 'General Kenobi!' exited with code 0
```

It's also possible that some prefixes are too short, and you want all of them to have the same length.<br/>
This can be done by setting the `--pad-prefix` flag:

```bash
$ concurrently -n foo,barbaz --pad-prefix 'echo Hello there' 'echo General Kenobi!'
[foo   ] Hello there
[foo   ] echo Hello there exited with code 0
[barbaz] General Kenobi!
[barbaz] echo 'General Kenobi!' exited with code 0
```

> [!NOTE]
> If using the `pid` prefix style in combination with [`--restart-tries`](./restarting.md), the length of the PID might grow, in which case all subsequent lines will match the new length.<br/>
> This might happen, for example, if the PID was 99 and it's now 100.
@@ -0,0 +1,25 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __reExport = (target, mod, secondTarget) => (__copyProps(target, mod, "default"), secondTarget && __copyProps(secondTarget, mod, "default"));
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var neon_exports = {};
module.exports = __toCommonJS(neon_exports);
__reExport(neon_exports, require("./neon-auth.cjs"), module.exports);
__reExport(neon_exports, require("./rls.cjs"), module.exports);
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  ...require("./neon-auth.cjs"),
  ...require("./rls.cjs")
});
//# sourceMappingURL=index.cjs.map
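The esbuild glue above wires `module.exports` through `__copyProps`, which mirrors properties as live getters while skipping an excluded key. A minimal self-contained sketch of that behavior follows; only the helper is copied from the file above, and the sample module object is hypothetical:

```javascript
// __copyProps, copied from the generated file above: mirrors own
// properties of `from` onto `to` as live getters, skipping `except`.
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};

// Hypothetical source module: "default" is excluded, "answer" is mirrored.
const mod = { default: "skipped", answer: 42 };
const target = __copyProps({}, mod, "default");
console.log(target.answer, "default" in target); // 42 false
```

Because each mirrored property is a getter rather than a copied value, later mutations of `mod.answer` remain visible through `target`, which is what makes the `__reExport` re-exports live.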
@@ -0,0 +1,102 @@
import { AjaxRequest } from './types';
import { getXHRResponse } from './getXHRResponse';
import { createErrorClass } from '../util/createErrorClass';

/**
 * A normalized AJAX error.
 *
 * @see {@link ajax}
 */
export interface AjaxError extends Error {
  /**
   * The XHR instance associated with the error.
   */
  xhr: XMLHttpRequest;

  /**
   * The AjaxRequest associated with the error.
   */
  request: AjaxRequest;

  /**
   * The HTTP status code, if the request has completed. If not,
   * it is set to `0`.
   */
  status: number;

  /**
   * The responseType (e.g. 'json', 'arraybuffer', or 'xml').
   */
  responseType: XMLHttpRequestResponseType;

  /**
   * The response data.
   */
  response: any;
}

export interface AjaxErrorCtor {
  /**
   * @deprecated Internal implementation detail. Do not construct error instances.
   * Cannot be tagged as internal: https://github.com/ReactiveX/rxjs/issues/6269
   */
  new (message: string, xhr: XMLHttpRequest, request: AjaxRequest): AjaxError;
}

/**
 * Thrown when an error occurs during an AJAX request.
 * This is only exported because it is useful for checking to see if an error
 * is an `instanceof AjaxError`. DO NOT create new instances of `AjaxError` with
 * the constructor.
 *
 * @see {@link ajax}
 */
export const AjaxError: AjaxErrorCtor = createErrorClass(
  (_super) =>
    function AjaxErrorImpl(this: any, message: string, xhr: XMLHttpRequest, request: AjaxRequest) {
      this.message = message;
      this.name = 'AjaxError';
      this.xhr = xhr;
      this.request = request;
      this.status = xhr.status;
      this.responseType = xhr.responseType;
      let response: any;
      try {
        // This can throw in IE, because we have to do a JSON.parse of
        // the response in some cases to get the expected response property.
        response = getXHRResponse(xhr);
      } catch (err) {
        response = xhr.responseText;
      }
      this.response = response;
    }
);

export interface AjaxTimeoutError extends AjaxError {}

export interface AjaxTimeoutErrorCtor {
  /**
   * @deprecated Internal implementation detail. Do not construct error instances.
   * Cannot be tagged as internal: https://github.com/ReactiveX/rxjs/issues/6269
   */
  new (xhr: XMLHttpRequest, request: AjaxRequest): AjaxTimeoutError;
}

/**
 * Thrown when an AJAX request times out. Not to be confused with {@link TimeoutError}.
 *
 * This is exported only because it is useful for checking to see if errors are an
 * `instanceof AjaxTimeoutError`. DO NOT use the constructor to create an instance of
 * this type.
 *
 * @see {@link ajax}
 */
export const AjaxTimeoutError: AjaxTimeoutErrorCtor = (() => {
  function AjaxTimeoutErrorImpl(this: any, xhr: XMLHttpRequest, request: AjaxRequest) {
    AjaxError.call(this, 'ajax timeout', xhr, request);
    this.name = 'AjaxTimeoutError';
    return this;
  }
  AjaxTimeoutErrorImpl.prototype = Object.create(AjaxError.prototype);
  return AjaxTimeoutErrorImpl;
})() as any;
@@ -0,0 +1,39 @@
/**
Check if [`argv`](https://nodejs.org/docs/latest/api/process.html#process_process_argv) has a specific flag.

@param flag - CLI flag to look for. The `--` prefix is optional.
@param argv - CLI arguments. Default: `process.argv`.
@returns Whether the flag exists.

@example
```
// $ ts-node foo.ts -f --unicorn --foo=bar -- --rainbow

// foo.ts
import hasFlag = require('has-flag');

hasFlag('unicorn');
//=> true

hasFlag('--unicorn');
//=> true

hasFlag('f');
//=> true

hasFlag('-f');
//=> true

hasFlag('foo=bar');
//=> true

hasFlag('foo');
//=> false

hasFlag('rainbow');
//=> false
```
*/
declare function hasFlag(flag: string, argv?: string[]): boolean;

export = hasFlag;
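The declaration above documents only the contract. A minimal sketch of the semantics its JSDoc examples describe follows; this is an illustrative assumption, not the package's actual source:

```javascript
// Sketch of the has-flag contract (an assumption based on the JSDoc above,
// not the package's real implementation). The dash prefix is optional, and
// anything after the "--" terminator is not considered a flag.
function hasFlag(flag, argv = process.argv) {
  const prefix = flag.startsWith("-") ? "" : flag.length === 1 ? "-" : "--";
  const position = argv.indexOf(prefix + flag);
  const terminatorPosition = argv.indexOf("--");
  return position !== -1 && (terminatorPosition === -1 || position < terminatorPosition);
}

// Mirrors the "$ ts-node foo.ts -f --unicorn --foo=bar -- --rainbow" example:
const argv = ["node", "foo.js", "-f", "--unicorn", "--foo=bar", "--", "--rainbow"];
console.log(hasFlag("unicorn", argv)); // true
console.log(hasFlag("rainbow", argv)); // false: it sits after the "--" terminator
```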
@@ -0,0 +1,25 @@
export * from "./bigint.cjs";
export * from "./binary.cjs";
export * from "./boolean.cjs";
export * from "./char.cjs";
export * from "./common.cjs";
export * from "./custom.cjs";
export * from "./date.cjs";
export * from "./datetime.cjs";
export * from "./decimal.cjs";
export * from "./double.cjs";
export * from "./enum.cjs";
export * from "./float.cjs";
export * from "./int.cjs";
export * from "./json.cjs";
export * from "./mediumint.cjs";
export * from "./real.cjs";
export * from "./serial.cjs";
export * from "./smallint.cjs";
export * from "./text.cjs";
export * from "./time.cjs";
export * from "./timestamp.cjs";
export * from "./tinyint.cjs";
export * from "./varbinary.cjs";
export * from "./varchar.cjs";
export * from "./year.cjs";
@@ -0,0 +1,61 @@
import { Observable } from '../Observable';
import { MonoTypeOperatorFunction, ObservableInput } from '../types';
/**
 * Returns an Observable that mirrors the source Observable with the exception of an `error`. If the source Observable
 * calls `error`, this method will emit the Throwable that caused the error to the `ObservableInput` returned from `notifier`.
 * If that Observable calls `complete` or `error` then this method will call `complete` or `error` on the child
 * subscription. Otherwise this method will resubscribe to the source Observable.
 *
 * Retry an observable sequence on error based on custom criteria.
 *
 * ## Example
 *
 * ```ts
 * import { interval, map, retryWhen, tap, delayWhen, timer } from 'rxjs';
 *
 * const source = interval(1000);
 * const result = source.pipe(
 *   map(value => {
 *     if (value > 5) {
 *       // error will be picked up by retryWhen
 *       throw value;
 *     }
 *     return value;
 *   }),
 *   retryWhen(errors =>
 *     errors.pipe(
 *       // log error message
 *       tap(value => console.log(`Value ${ value } was too high!`)),
 *       // restart in 5 seconds
 *       delayWhen(value => timer(value * 1000))
 *     )
 *   )
 * );
 *
 * result.subscribe(value => console.log(value));
 *
 * // results:
 * // 0
 * // 1
 * // 2
 * // 3
 * // 4
 * // 5
 * // 'Value 6 was too high!'
 * // - Wait 5 seconds then repeat
 * ```
 *
 * @see {@link retry}
 *
 * @param notifier Function that receives an Observable of notifications with which a
 * user can `complete` or `error`, aborting the retry.
 * @return A function that returns an Observable that mirrors the source
 * Observable with the exception of an `error`.
 * @deprecated Will be removed in v9 or v10. Use {@link retry}'s {@link RetryConfig#delay delay} option instead.
 * Instead of `retryWhen(() => notify$)`, use: `retry({ delay: () => notify$ })`.
 */
export declare function retryWhen<T>(notifier: (errors: Observable<any>) => ObservableInput<any>): MonoTypeOperatorFunction<T>;
//# sourceMappingURL=retryWhen.d.ts.map
@@ -0,0 +1,633 @@
"use strict";
var __create = Object.create;
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __getProtoOf = Object.getPrototypeOf;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
  // If the importer is in node compatibility mode or this is not an ESM
  // file that has been converted to a CommonJS file using a Babel-
  // compatible transform (i.e. "__esModule" has not been set), then set
  // "default" to the CommonJS "module.exports" for node compatibility.
  isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
  mod
));
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var base_exports = {};
__export(base_exports, {
  TerminalReporter: () => TerminalReporter,
  fitToWidth: () => fitToWidth,
  formatError: () => formatError,
  formatFailure: () => formatFailure,
  formatResultFailure: () => formatResultFailure,
  formatRetry: () => formatRetry,
  internalScreen: () => internalScreen,
  kOutputSymbol: () => kOutputSymbol,
  markErrorsAsReported: () => markErrorsAsReported,
  nonTerminalScreen: () => nonTerminalScreen,
  prepareErrorStack: () => prepareErrorStack,
  relativeFilePath: () => relativeFilePath,
  resolveOutputFile: () => resolveOutputFile,
  separator: () => separator,
  stepSuffix: () => stepSuffix,
  terminalScreen: () => terminalScreen
});
module.exports = __toCommonJS(base_exports);
var import_path = __toESM(require("path"));
var import_utils = require("playwright-core/lib/utils");
var import_utils2 = require("playwright-core/lib/utils");
var import_util = require("../util");
var import_utilsBundle = require("../utilsBundle");
const kOutputSymbol = Symbol("output");
const DEFAULT_TTY_WIDTH = 100;
const DEFAULT_TTY_HEIGHT = 40;
const originalProcessStdout = process.stdout;
const originalProcessStderr = process.stderr;
const terminalScreen = (() => {
  let isTTY = !!originalProcessStdout.isTTY;
  let ttyWidth = originalProcessStdout.columns || 0;
  let ttyHeight = originalProcessStdout.rows || 0;
  if (process.env.PLAYWRIGHT_FORCE_TTY === "false" || process.env.PLAYWRIGHT_FORCE_TTY === "0") {
    isTTY = false;
    ttyWidth = 0;
    ttyHeight = 0;
  } else if (process.env.PLAYWRIGHT_FORCE_TTY === "true" || process.env.PLAYWRIGHT_FORCE_TTY === "1") {
    isTTY = true;
    ttyWidth = originalProcessStdout.columns || DEFAULT_TTY_WIDTH;
    ttyHeight = originalProcessStdout.rows || DEFAULT_TTY_HEIGHT;
  } else if (process.env.PLAYWRIGHT_FORCE_TTY) {
    isTTY = true;
    const sizeMatch = process.env.PLAYWRIGHT_FORCE_TTY.match(/^(\d+)x(\d+)$/);
    if (sizeMatch) {
      ttyWidth = +sizeMatch[1];
      ttyHeight = +sizeMatch[2];
    } else {
      ttyWidth = +process.env.PLAYWRIGHT_FORCE_TTY;
      ttyHeight = DEFAULT_TTY_HEIGHT;
    }
    if (isNaN(ttyWidth))
      ttyWidth = DEFAULT_TTY_WIDTH;
    if (isNaN(ttyHeight))
      ttyHeight = DEFAULT_TTY_HEIGHT;
  }
  let useColors = isTTY;
  if (process.env.DEBUG_COLORS === "0" || process.env.DEBUG_COLORS === "false" || process.env.FORCE_COLOR === "0" || process.env.FORCE_COLOR === "false")
    useColors = false;
  else if (process.env.DEBUG_COLORS || process.env.FORCE_COLOR)
    useColors = true;
  const colors = useColors ? import_utils2.colors : import_utils2.noColors;
  return {
    resolveFiles: "cwd",
    isTTY,
    ttyWidth,
    ttyHeight,
    colors,
    stdout: originalProcessStdout,
    stderr: originalProcessStderr
  };
})();
const nonTerminalScreen = {
  colors: terminalScreen.colors,
  isTTY: false,
  ttyWidth: 0,
  ttyHeight: 0,
  resolveFiles: "rootDir"
};
const internalScreen = {
  colors: import_utils2.colors,
  isTTY: false,
  ttyWidth: 0,
  ttyHeight: 0,
  resolveFiles: "rootDir"
};
class TerminalReporter {
  constructor(options = {}) {
    this.totalTestCount = 0;
    this.fileDurations = /* @__PURE__ */ new Map();
    this._fatalErrors = [];
    this._failureCount = 0;
    this.screen = options.screen ?? terminalScreen;
    this._options = options;
  }
  version() {
    return "v2";
  }
  onConfigure(config) {
    this.config = config;
  }
  onBegin(suite) {
    this.suite = suite;
    this.totalTestCount = suite.allTests().length;
  }
  onStdOut(chunk, test, result) {
    this._appendOutput({ chunk, type: "stdout" }, result);
  }
  onStdErr(chunk, test, result) {
    this._appendOutput({ chunk, type: "stderr" }, result);
  }
  _appendOutput(output, result) {
    if (!result)
      return;
    result[kOutputSymbol] = result[kOutputSymbol] || [];
    result[kOutputSymbol].push(output);
  }
  onTestEnd(test, result) {
    if (result.status !== "skipped" && result.status !== test.expectedStatus)
      ++this._failureCount;
    const projectName = test.titlePath()[1];
    const relativePath = relativeTestPath(this.screen, this.config, test);
    const fileAndProject = (projectName ? `[${projectName}] \u203A ` : "") + relativePath;
    const entry = this.fileDurations.get(fileAndProject) || { duration: 0, workers: /* @__PURE__ */ new Set() };
    entry.duration += result.duration;
    entry.workers.add(result.workerIndex);
    this.fileDurations.set(fileAndProject, entry);
  }
  onError(error) {
    this._fatalErrors.push(error);
  }
  async onEnd(result) {
    this.result = result;
  }
  fitToScreen(line, prefix) {
    if (!this.screen.ttyWidth) {
      return line;
    }
    return fitToWidth(line, this.screen.ttyWidth, prefix);
  }
  generateStartingMessage() {
    const jobs = this.config.metadata.actualWorkers ?? this.config.workers;
    const shardDetails = this.config.shard ? `, shard ${this.config.shard.current} of ${this.config.shard.total}` : "";
    if (!this.totalTestCount)
      return "";
    return "\n" + this.screen.colors.dim("Running ") + this.totalTestCount + this.screen.colors.dim(` test${this.totalTestCount !== 1 ? "s" : ""} using `) + jobs + this.screen.colors.dim(` worker${jobs !== 1 ? "s" : ""}${shardDetails}`);
  }
  getSlowTests() {
    if (!this.config.reportSlowTests)
      return [];
    const fileDurations = [...this.fileDurations.entries()].filter(([key, value]) => value.workers.size === 1).map(([key, value]) => [key, value.duration]);
    fileDurations.sort((a, b) => b[1] - a[1]);
    const count = Math.min(fileDurations.length, this.config.reportSlowTests.max || Number.POSITIVE_INFINITY);
|
||||||
|
const threshold = this.config.reportSlowTests.threshold;
|
||||||
|
return fileDurations.filter(([, duration]) => duration > threshold).slice(0, count);
|
||||||
|
}
|
||||||
|
generateSummaryMessage({ didNotRun, skipped, expected, interrupted, unexpected, flaky, fatalErrors }) {
|
||||||
|
const tokens = [];
|
||||||
|
if (unexpected.length) {
|
||||||
|
tokens.push(this.screen.colors.red(` ${unexpected.length} failed`));
|
||||||
|
for (const test of unexpected)
|
||||||
|
tokens.push(this.screen.colors.red(this.formatTestHeader(test, { indent: " " })));
|
||||||
|
}
|
||||||
|
if (interrupted.length) {
|
||||||
|
tokens.push(this.screen.colors.yellow(` ${interrupted.length} interrupted`));
|
||||||
|
for (const test of interrupted)
|
||||||
|
tokens.push(this.screen.colors.yellow(this.formatTestHeader(test, { indent: " " })));
|
||||||
|
}
|
||||||
|
if (flaky.length) {
|
||||||
|
tokens.push(this.screen.colors.yellow(` ${flaky.length} flaky`));
|
||||||
|
for (const test of flaky)
|
||||||
|
tokens.push(this.screen.colors.yellow(this.formatTestHeader(test, { indent: " " })));
|
||||||
|
}
|
||||||
|
if (skipped)
|
||||||
|
tokens.push(this.screen.colors.yellow(` ${skipped} skipped`));
|
||||||
|
if (didNotRun)
|
||||||
|
tokens.push(this.screen.colors.yellow(` ${didNotRun} did not run`));
|
||||||
|
if (expected)
|
||||||
|
tokens.push(this.screen.colors.green(` ${expected} passed`) + this.screen.colors.dim(` (${(0, import_utils.msToString)(this.result.duration)})`));
|
||||||
|
if (fatalErrors.length && expected + unexpected.length + interrupted.length + flaky.length > 0)
|
||||||
|
tokens.push(this.screen.colors.red(` ${fatalErrors.length === 1 ? "1 error was not a part of any test" : fatalErrors.length + " errors were not a part of any test"}, see above for details`));
|
||||||
|
return tokens.join("\n");
|
||||||
|
}
|
||||||
|
generateSummary() {
|
||||||
|
let didNotRun = 0;
|
||||||
|
let skipped = 0;
|
||||||
|
let expected = 0;
|
||||||
|
const interrupted = [];
|
||||||
|
const interruptedToPrint = [];
|
||||||
|
const unexpected = [];
|
||||||
|
const flaky = [];
|
||||||
|
this.suite.allTests().forEach((test) => {
|
||||||
|
switch (test.outcome()) {
|
||||||
|
case "skipped": {
|
||||||
|
if (test.results.some((result) => result.status === "interrupted")) {
|
||||||
|
if (test.results.some((result) => !!result.error))
|
||||||
|
interruptedToPrint.push(test);
|
||||||
|
interrupted.push(test);
|
||||||
|
} else if (!test.results.length || test.expectedStatus !== "skipped") {
|
||||||
|
++didNotRun;
|
||||||
|
} else {
|
||||||
|
++skipped;
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
case "expected":
|
||||||
|
++expected;
|
||||||
|
break;
|
||||||
|
case "unexpected":
|
||||||
|
unexpected.push(test);
|
||||||
|
break;
|
||||||
|
case "flaky":
|
||||||
|
flaky.push(test);
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
});
|
||||||
|
const failuresToPrint = [...unexpected, ...flaky, ...interruptedToPrint];
|
||||||
|
return {
|
||||||
|
didNotRun,
|
||||||
|
skipped,
|
||||||
|
expected,
|
||||||
|
interrupted,
|
||||||
|
unexpected,
|
||||||
|
flaky,
|
||||||
|
failuresToPrint,
|
||||||
|
fatalErrors: this._fatalErrors
|
||||||
|
};
|
||||||
|
}
|
||||||
|
epilogue(full) {
|
||||||
|
const summary = this.generateSummary();
|
||||||
|
const summaryMessage = this.generateSummaryMessage(summary);
|
||||||
|
if (full && summary.failuresToPrint.length && !this._options.omitFailures)
|
||||||
|
this._printFailures(summary.failuresToPrint);
|
||||||
|
this._printSlowTests();
|
||||||
|
this._printSummary(summaryMessage);
|
||||||
|
}
|
||||||
|
_printFailures(failures) {
|
||||||
|
this.writeLine("");
|
||||||
|
failures.forEach((test, index) => {
|
||||||
|
this.writeLine(this.formatFailure(test, index + 1));
|
||||||
|
});
|
||||||
|
}
|
||||||
|
_printSlowTests() {
|
||||||
|
const slowTests = this.getSlowTests();
|
||||||
|
slowTests.forEach(([file, duration]) => {
|
||||||
|
this.writeLine(this.screen.colors.yellow(" Slow test file: ") + file + this.screen.colors.yellow(` (${(0, import_utils.msToString)(duration)})`));
|
||||||
|
});
|
||||||
|
if (slowTests.length)
|
||||||
|
this.writeLine(this.screen.colors.yellow(" Consider running tests from slow files in parallel. See: https://playwright.dev/docs/test-parallel"));
|
||||||
|
}
|
||||||
|
_printSummary(summary) {
|
||||||
|
if (summary.trim())
|
||||||
|
this.writeLine(summary);
|
||||||
|
}
|
||||||
|
willRetry(test) {
|
||||||
|
return test.outcome() === "unexpected" && test.results.length <= test.retries;
|
||||||
|
}
|
||||||
|
formatTestTitle(test, step) {
|
||||||
|
return formatTestTitle(this.screen, this.config, test, step, this._options);
|
||||||
|
}
|
||||||
|
formatTestHeader(test, options = {}) {
|
||||||
|
return formatTestHeader(this.screen, this.config, test, { ...options, includeTestId: this._options.includeTestId });
|
||||||
|
}
|
||||||
|
formatFailure(test, index) {
|
||||||
|
return formatFailure(this.screen, this.config, test, index, this._options);
|
||||||
|
}
|
||||||
|
formatError(error) {
|
||||||
|
return formatError(this.screen, error);
|
||||||
|
}
|
||||||
|
formatResultErrors(test, result) {
|
||||||
|
return formatResultErrors(this.screen, test, result);
|
||||||
|
}
|
||||||
|
writeLine(line) {
|
||||||
|
this.screen.stdout?.write(line ? line + "\n" : "\n");
|
||||||
|
}
|
||||||
|
}
|
||||||
|
function formatResultErrors(screen, test, result) {
|
||||||
|
const lines = [];
|
||||||
|
if (test.outcome() === "unexpected") {
|
||||||
|
const errorDetails = formatResultFailure(screen, test, result, " ");
|
||||||
|
if (errorDetails.length > 0)
|
||||||
|
lines.push("");
|
||||||
|
for (const error of errorDetails)
|
||||||
|
lines.push(error.message, "");
|
||||||
|
}
|
||||||
|
return lines.join("\n");
|
||||||
|
}
|
||||||
|
function formatFailure(screen, config, test, index, options) {
|
||||||
|
const lines = [];
|
||||||
|
let printedHeader = false;
|
||||||
|
for (const result of test.results) {
|
||||||
|
const resultLines = [];
|
||||||
|
const errors = formatResultFailure(screen, test, result, " ");
|
||||||
|
if (!errors.length)
|
||||||
|
continue;
|
||||||
|
if (!printedHeader) {
|
||||||
|
const header = formatTestHeader(screen, config, test, { indent: " ", index, mode: "error", includeTestId: options?.includeTestId });
|
||||||
|
lines.push(screen.colors.red(header));
|
||||||
|
printedHeader = true;
|
||||||
|
}
|
||||||
|
if (result.retry) {
|
||||||
|
resultLines.push("");
|
||||||
|
resultLines.push(screen.colors.gray(separator(screen, ` Retry #${result.retry}`)));
|
||||||
|
}
|
||||||
|
resultLines.push(...errors.map((error) => "\n" + error.message));
|
||||||
|
const attachmentGroups = groupAttachments(result.attachments);
|
||||||
|
for (let i = 0; i < attachmentGroups.length; ++i) {
|
||||||
|
const attachment = attachmentGroups[i];
|
||||||
|
if (attachment.name === "error-context" && attachment.path) {
|
||||||
|
resultLines.push("");
|
||||||
|
resultLines.push(screen.colors.dim(` Error Context: ${relativeFilePath(screen, config, attachment.path)}`));
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
if (attachment.name.startsWith("_"))
|
||||||
|
continue;
|
||||||
|
const hasPrintableContent = attachment.contentType.startsWith("text/");
|
||||||
|
if (!attachment.path && !hasPrintableContent)
|
||||||
|
continue;
|
||||||
|
resultLines.push("");
|
||||||
|
resultLines.push(screen.colors.dim(separator(screen, ` attachment #${i + 1}: ${screen.colors.bold(attachment.name)} (${attachment.contentType})`)));
|
||||||
|
if (attachment.actual?.path) {
|
||||||
|
if (attachment.expected?.path) {
|
||||||
|
const expectedPath = relativeFilePath(screen, config, attachment.expected.path);
|
||||||
|
resultLines.push(screen.colors.dim(` Expected: ${expectedPath}`));
|
||||||
|
}
|
||||||
|
const actualPath = relativeFilePath(screen, config, attachment.actual.path);
|
||||||
|
resultLines.push(screen.colors.dim(` Received: ${actualPath}`));
|
||||||
|
if (attachment.previous?.path) {
|
||||||
|
const previousPath = relativeFilePath(screen, config, attachment.previous.path);
|
||||||
|
resultLines.push(screen.colors.dim(` Previous: ${previousPath}`));
|
||||||
|
}
|
||||||
|
if (attachment.diff?.path) {
|
||||||
|
const diffPath = relativeFilePath(screen, config, attachment.diff.path);
|
||||||
|
resultLines.push(screen.colors.dim(` Diff: ${diffPath}`));
|
||||||
|
}
|
||||||
|
} else if (attachment.path) {
|
||||||
|
const relativePath = relativeFilePath(screen, config, attachment.path);
|
||||||
|
resultLines.push(screen.colors.dim(` ${relativePath}`));
|
||||||
|
if (attachment.name === "trace") {
|
||||||
|
const packageManagerCommand = (0, import_utils.getPackageManagerExecCommand)();
|
||||||
|
resultLines.push(screen.colors.dim(` Usage:`));
|
||||||
|
resultLines.push("");
|
||||||
|
resultLines.push(screen.colors.dim(` ${packageManagerCommand} playwright show-trace ${quotePathIfNeeded(relativePath)}`));
|
||||||
|
resultLines.push("");
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
if (attachment.contentType.startsWith("text/") && attachment.body) {
|
||||||
|
let text = attachment.body.toString();
|
||||||
|
if (text.length > 300)
|
||||||
|
text = text.slice(0, 300) + "...";
|
||||||
|
for (const line of text.split("\n"))
|
||||||
|
resultLines.push(screen.colors.dim(` ${line}`));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
resultLines.push(screen.colors.dim(separator(screen, " ")));
|
||||||
|
}
|
||||||
|
lines.push(...resultLines);
|
||||||
|
}
|
||||||
|
lines.push("");
|
||||||
|
return lines.join("\n");
|
||||||
|
}
|
||||||
|
function formatRetry(screen, result) {
|
||||||
|
const retryLines = [];
|
||||||
|
if (result.retry) {
|
||||||
|
retryLines.push("");
|
||||||
|
retryLines.push(screen.colors.gray(separator(screen, ` Retry #${result.retry}`)));
|
||||||
|
}
|
||||||
|
return retryLines;
|
||||||
|
}
|
||||||
|
function quotePathIfNeeded(path2) {
|
||||||
|
if (/\s/.test(path2))
|
||||||
|
return `"${path2}"`;
|
||||||
|
return path2;
|
||||||
|
}
|
||||||
|
const kReportedSymbol = Symbol("reported");
|
||||||
|
function markErrorsAsReported(result) {
|
||||||
|
result[kReportedSymbol] = result.errors.length;
|
||||||
|
}
|
||||||
|
function formatResultFailure(screen, test, result, initialIndent) {
|
||||||
|
const errorDetails = [];
|
||||||
|
if (result.status === "passed" && test.expectedStatus === "failed") {
|
||||||
|
errorDetails.push({
|
||||||
|
message: indent(screen.colors.red(`Expected to fail, but passed.`), initialIndent)
|
||||||
|
});
|
||||||
|
}
|
||||||
|
if (result.status === "interrupted") {
|
||||||
|
errorDetails.push({
|
||||||
|
message: indent(screen.colors.red(`Test was interrupted.`), initialIndent)
|
||||||
|
});
|
||||||
|
}
|
||||||
|
const reportedIndex = result[kReportedSymbol] || 0;
|
||||||
|
for (const error of result.errors.slice(reportedIndex)) {
|
||||||
|
const formattedError = formatError(screen, error);
|
||||||
|
errorDetails.push({
|
||||||
|
message: indent(formattedError.message, initialIndent),
|
||||||
|
location: formattedError.location
|
||||||
|
});
|
||||||
|
}
|
||||||
|
return errorDetails;
|
||||||
|
}
|
||||||
|
function relativeFilePath(screen, config, file) {
|
||||||
|
if (screen.resolveFiles === "cwd")
|
||||||
|
return import_path.default.relative(process.cwd(), file);
|
||||||
|
return import_path.default.relative(config.rootDir, file);
|
||||||
|
}
|
||||||
|
function relativeTestPath(screen, config, test) {
|
||||||
|
return relativeFilePath(screen, config, test.location.file);
|
||||||
|
}
|
||||||
|
function stepSuffix(step) {
|
||||||
|
const stepTitles = step ? step.titlePath() : [];
|
||||||
|
return stepTitles.map((t) => t.split("\n")[0]).map((t) => " \u203A " + t).join("");
|
||||||
|
}
|
||||||
|
function formatTestTitle(screen, config, test, step, options = {}) {
|
||||||
|
const [, projectName, , ...titles] = test.titlePath();
|
||||||
|
const location = `${relativeTestPath(screen, config, test)}:${test.location.line}:${test.location.column}`;
|
||||||
|
const testId = options.includeTestId ? `[id=${test.id}] ` : "";
|
||||||
|
const projectLabel = options.includeTestId ? `project=` : "";
|
||||||
|
const projectTitle = projectName ? `[${projectLabel}${projectName}] \u203A ` : "";
|
||||||
|
const testTitle = `${testId}${projectTitle}${location} \u203A ${titles.join(" \u203A ")}`;
|
||||||
|
const extraTags = test.tags.filter((t) => !testTitle.includes(t) && !config.tags.includes(t));
|
||||||
|
return `${testTitle}${stepSuffix(step)}${extraTags.length ? " " + extraTags.join(" ") : ""}`;
|
||||||
|
}
|
||||||
|
function formatTestHeader(screen, config, test, options = {}) {
|
||||||
|
const title = formatTestTitle(screen, config, test, void 0, options);
|
||||||
|
const header = `${options.indent || ""}${options.index ? options.index + ") " : ""}${title}`;
|
||||||
|
let fullHeader = header;
|
||||||
|
if (options.mode === "error") {
|
||||||
|
const stepPaths = /* @__PURE__ */ new Set();
|
||||||
|
for (const result of test.results.filter((r) => !!r.errors.length)) {
|
||||||
|
const stepPath = [];
|
||||||
|
const visit = (steps) => {
|
||||||
|
const errors = steps.filter((s) => s.error);
|
||||||
|
if (errors.length > 1)
|
||||||
|
return;
|
||||||
|
if (errors.length === 1 && errors[0].category === "test.step") {
|
||||||
|
stepPath.push(errors[0].title);
|
||||||
|
visit(errors[0].steps);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
visit(result.steps);
|
||||||
|
stepPaths.add(["", ...stepPath].join(" \u203A "));
|
||||||
|
}
|
||||||
|
fullHeader = header + (stepPaths.size === 1 ? stepPaths.values().next().value : "");
|
||||||
|
}
|
||||||
|
return separator(screen, fullHeader);
|
||||||
|
}
|
||||||
|
function formatError(screen, error) {
|
||||||
|
const message = error.message || error.value || "";
|
||||||
|
const stack = error.stack;
|
||||||
|
if (!stack && !error.location)
|
||||||
|
return { message };
|
||||||
|
const tokens = [];
|
||||||
|
const parsedStack = stack ? prepareErrorStack(stack) : void 0;
|
||||||
|
tokens.push(parsedStack?.message || message);
|
||||||
|
if (error.snippet) {
|
||||||
|
let snippet = error.snippet;
|
||||||
|
if (!screen.colors.enabled)
|
||||||
|
snippet = (0, import_util.stripAnsiEscapes)(snippet);
|
||||||
|
tokens.push("");
|
||||||
|
tokens.push(snippet);
|
||||||
|
}
|
||||||
|
if (parsedStack && parsedStack.stackLines.length)
|
||||||
|
tokens.push(screen.colors.dim(parsedStack.stackLines.join("\n")));
|
||||||
|
let location = error.location;
|
||||||
|
if (parsedStack && !location)
|
||||||
|
location = parsedStack.location;
|
||||||
|
if (error.cause)
|
||||||
|
tokens.push(screen.colors.dim("[cause]: ") + formatError(screen, error.cause).message);
|
||||||
|
return {
|
||||||
|
location,
|
||||||
|
message: tokens.join("\n")
|
||||||
|
};
|
||||||
|
}
|
||||||
|
function separator(screen, text = "") {
|
||||||
|
if (text)
|
||||||
|
text += " ";
|
||||||
|
const columns = Math.min(100, screen.ttyWidth || 100);
|
||||||
|
return text + screen.colors.dim("\u2500".repeat(Math.max(0, columns - (0, import_util.stripAnsiEscapes)(text).length)));
|
||||||
|
}
|
||||||
|
function indent(lines, tab) {
|
||||||
|
return lines.replace(/^(?=.+$)/gm, tab);
|
||||||
|
}
|
||||||
|
function prepareErrorStack(stack) {
|
||||||
|
return (0, import_utils.parseErrorStack)(stack, import_path.default.sep, !!process.env.PWDEBUGIMPL);
|
||||||
|
}
|
||||||
|
function characterWidth(c) {
|
||||||
|
return import_utilsBundle.getEastAsianWidth.eastAsianWidth(c.codePointAt(0));
|
||||||
|
}
|
||||||
|
function stringWidth(v) {
|
||||||
|
let width = 0;
|
||||||
|
for (const { segment } of new Intl.Segmenter(void 0, { granularity: "grapheme" }).segment(v))
|
||||||
|
width += characterWidth(segment);
|
||||||
|
return width;
|
||||||
|
}
|
||||||
|
function suffixOfWidth(v, width) {
|
||||||
|
const segments = [...new Intl.Segmenter(void 0, { granularity: "grapheme" }).segment(v)];
|
||||||
|
let suffixBegin = v.length;
|
||||||
|
for (const { segment, index } of segments.reverse()) {
|
||||||
|
const segmentWidth = stringWidth(segment);
|
||||||
|
if (segmentWidth > width)
|
||||||
|
break;
|
||||||
|
width -= segmentWidth;
|
||||||
|
suffixBegin = index;
|
||||||
|
}
|
||||||
|
return v.substring(suffixBegin);
|
||||||
|
}
|
||||||
|
function fitToWidth(line, width, prefix) {
|
||||||
|
const prefixLength = prefix ? (0, import_util.stripAnsiEscapes)(prefix).length : 0;
|
||||||
|
width -= prefixLength;
|
||||||
|
if (stringWidth(line) <= width)
|
||||||
|
return line;
|
||||||
|
const parts = line.split(import_util.ansiRegex);
|
||||||
|
const taken = [];
|
||||||
|
for (let i = parts.length - 1; i >= 0; i--) {
|
||||||
|
if (i % 2) {
|
||||||
|
taken.push(parts[i]);
|
||||||
|
} else {
|
||||||
|
let part = suffixOfWidth(parts[i], width);
|
||||||
|
const wasTruncated = part.length < parts[i].length;
|
||||||
|
if (wasTruncated && parts[i].length > 0) {
|
||||||
|
part = "\u2026" + suffixOfWidth(parts[i], width - 1);
|
||||||
|
}
|
||||||
|
taken.push(part);
|
||||||
|
width -= stringWidth(part);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return taken.reverse().join("");
|
||||||
|
}
|
||||||
|
function resolveFromEnv(name) {
|
||||||
|
const value = process.env[name];
|
||||||
|
if (value)
|
||||||
|
return import_path.default.resolve(process.cwd(), value);
|
||||||
|
return void 0;
|
||||||
|
}
|
||||||
|
function resolveOutputFile(reporterName, options) {
|
||||||
|
const name = reporterName.toUpperCase();
|
||||||
|
let outputFile = resolveFromEnv(`PLAYWRIGHT_${name}_OUTPUT_FILE`);
|
||||||
|
if (!outputFile && options.outputFile)
|
||||||
|
outputFile = import_path.default.resolve(options.configDir, options.outputFile);
|
||||||
|
if (outputFile)
|
||||||
|
return { outputFile };
|
||||||
|
let outputDir = resolveFromEnv(`PLAYWRIGHT_${name}_OUTPUT_DIR`);
|
||||||
|
if (!outputDir && options.outputDir)
|
||||||
|
outputDir = import_path.default.resolve(options.configDir, options.outputDir);
|
||||||
|
if (!outputDir && options.default)
|
||||||
|
outputDir = (0, import_util.resolveReporterOutputPath)(options.default.outputDir, options.configDir, void 0);
|
||||||
|
if (!outputDir)
|
||||||
|
outputDir = options.configDir;
|
||||||
|
const reportName = process.env[`PLAYWRIGHT_${name}_OUTPUT_NAME`] ?? options.fileName ?? options.default?.fileName;
|
||||||
|
if (!reportName)
|
||||||
|
return void 0;
|
||||||
|
outputFile = import_path.default.resolve(outputDir, reportName);
|
||||||
|
return { outputFile, outputDir };
|
||||||
|
}
|
||||||
|
function groupAttachments(attachments) {
|
||||||
|
const result = [];
|
||||||
|
const attachmentsByPrefix = /* @__PURE__ */ new Map();
|
||||||
|
for (const attachment of attachments) {
|
||||||
|
if (!attachment.path) {
|
||||||
|
result.push(attachment);
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
const match = attachment.name.match(/^(.*)-(expected|actual|diff|previous)(\.[^.]+)?$/);
|
||||||
|
if (!match) {
|
||||||
|
result.push(attachment);
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
const [, name, category] = match;
|
||||||
|
let group = attachmentsByPrefix.get(name);
|
||||||
|
if (!group) {
|
||||||
|
group = { ...attachment, name };
|
||||||
|
attachmentsByPrefix.set(name, group);
|
||||||
|
result.push(group);
|
||||||
|
}
|
||||||
|
if (category === "expected")
|
||||||
|
group.expected = attachment;
|
||||||
|
else if (category === "actual")
|
||||||
|
group.actual = attachment;
|
||||||
|
else if (category === "diff")
|
||||||
|
group.diff = attachment;
|
||||||
|
else if (category === "previous")
|
||||||
|
group.previous = attachment;
|
||||||
|
}
|
||||||
|
return result;
|
||||||
|
}
|
||||||
|
// Annotate the CommonJS export names for ESM import in node:
|
||||||
|
0 && (module.exports = {
|
||||||
|
TerminalReporter,
|
||||||
|
fitToWidth,
|
||||||
|
formatError,
|
||||||
|
formatFailure,
|
||||||
|
formatResultFailure,
|
||||||
|
formatRetry,
|
||||||
|
internalScreen,
|
||||||
|
kOutputSymbol,
|
||||||
|
markErrorsAsReported,
|
||||||
|
nonTerminalScreen,
|
||||||
|
prepareErrorStack,
|
||||||
|
relativeFilePath,
|
||||||
|
resolveOutputFile,
|
||||||
|
separator,
|
||||||
|
stepSuffix,
|
||||||
|
terminalScreen
|
||||||
|
});
|
||||||
@ -0,0 +1 @@
{"version":3,"sources":["../../../src/mysql-core/columns/serial.ts"],"sourcesContent":["import type {\n\tColumnBuilderBaseConfig,\n\tColumnBuilderRuntimeConfig,\n\tHasDefault,\n\tIsAutoincrement,\n\tIsPrimaryKey,\n\tMakeColumnConfig,\n\tNotNull,\n} from '~/column-builder.ts';\nimport type { ColumnBaseConfig } from '~/column.ts';\nimport { entityKind } from '~/entity.ts';\nimport type { AnyMySqlTable } from '~/mysql-core/table.ts';\nimport { MySqlColumnBuilderWithAutoIncrement, MySqlColumnWithAutoIncrement } from './common.ts';\n\nexport type MySqlSerialBuilderInitial<TName extends string> = IsAutoincrement<\n\tIsPrimaryKey<\n\t\tNotNull<\n\t\t\tHasDefault<\n\t\t\t\tMySqlSerialBuilder<{\n\t\t\t\t\tname: TName;\n\t\t\t\t\tdataType: 'number';\n\t\t\t\t\tcolumnType: 'MySqlSerial';\n\t\t\t\t\tdata: number;\n\t\t\t\t\tdriverParam: number;\n\t\t\t\t\tenumValues: undefined;\n\t\t\t\t}>\n\t\t\t>\n\t\t>\n\t>\n>;\n\nexport class MySqlSerialBuilder<T extends ColumnBuilderBaseConfig<'number', 'MySqlSerial'>>\n\textends MySqlColumnBuilderWithAutoIncrement<T>\n{\n\tstatic override readonly [entityKind]: string = 'MySqlSerialBuilder';\n\n\tconstructor(name: T['name']) {\n\t\tsuper(name, 'number', 'MySqlSerial');\n\t\tthis.config.hasDefault = true;\n\t\tthis.config.autoIncrement = true;\n\t}\n\n\t/** @internal */\n\toverride build<TTableName extends string>(\n\t\ttable: AnyMySqlTable<{ name: TTableName }>,\n\t): MySqlSerial<MakeColumnConfig<T, TTableName>> {\n\t\treturn new MySqlSerial<MakeColumnConfig<T, TTableName>>(table, this.config as ColumnBuilderRuntimeConfig<any, any>);\n\t}\n}\n\nexport class MySqlSerial<\n\tT extends ColumnBaseConfig<'number', 'MySqlSerial'>,\n> extends MySqlColumnWithAutoIncrement<T> {\n\tstatic override readonly [entityKind]: string = 'MySqlSerial';\n\n\tgetSQLType(): string {\n\t\treturn 'serial';\n\t}\n\n\toverride mapFromDriverValue(value: number | string): number {\n\t\tif (typeof value === 'string') {\n\t\t\treturn Number(value);\n\t\t}\n\t\treturn 
value;\n\t}\n}\n\nexport function serial(): MySqlSerialBuilderInitial<''>;\nexport function serial<TName extends string>(name: TName): MySqlSerialBuilderInitial<TName>;\nexport function serial(name?: string) {\n\treturn new MySqlSerialBuilder(name ?? '');\n}\n"],"mappings":"AAUA,SAAS,kBAAkB;AAE3B,SAAS,qCAAqC,oCAAoC;AAmB3E,MAAM,2BACJ,oCACT;AAAA,EACC,QAA0B,UAAU,IAAY;AAAA,EAEhD,YAAY,MAAiB;AAC5B,UAAM,MAAM,UAAU,aAAa;AACnC,SAAK,OAAO,aAAa;AACzB,SAAK,OAAO,gBAAgB;AAAA,EAC7B;AAAA;AAAA,EAGS,MACR,OAC+C;AAC/C,WAAO,IAAI,YAA6C,OAAO,KAAK,MAA8C;AAAA,EACnH;AACD;AAEO,MAAM,oBAEH,6BAAgC;AAAA,EACzC,QAA0B,UAAU,IAAY;AAAA,EAEhD,aAAqB;AACpB,WAAO;AAAA,EACR;AAAA,EAES,mBAAmB,OAAgC;AAC3D,QAAI,OAAO,UAAU,UAAU;AAC9B,aAAO,OAAO,KAAK;AAAA,IACpB;AACA,WAAO;AAAA,EACR;AACD;AAIO,SAAS,OAAO,MAAe;AACrC,SAAO,IAAI,mBAAmB,QAAQ,EAAE;AACzC;","names":[]}
@ -0,0 +1,796 @@
import type { CacheConfig, WithCacheConfig } from "../../cache/core/types.cjs";
|
||||||
|
import { entityKind } from "../../entity.cjs";
|
||||||
|
import type { PgColumn } from "../columns/index.cjs";
|
||||||
|
import type { PgDialect } from "../dialect.cjs";
|
||||||
|
import type { PgSession } from "../session.cjs";
|
||||||
|
import type { SubqueryWithSelection } from "../subquery.cjs";
|
||||||
|
import type { PgTable } from "../table.cjs";
|
||||||
|
import { PgViewBase } from "../view-base.cjs";
|
||||||
|
import { TypedQueryBuilder } from "../../query-builders/query-builder.cjs";
|
||||||
|
import type { BuildSubquerySelection, GetSelectTableName, GetSelectTableSelection, JoinNullability, SelectMode, SelectResult } from "../../query-builders/select.types.cjs";
|
||||||
|
import { QueryPromise } from "../../query-promise.cjs";
|
||||||
|
import type { RunnableQuery } from "../../runnable-query.cjs";
|
||||||
|
import { SQL } from "../../sql/sql.cjs";
|
||||||
|
import type { ColumnsSelection, Placeholder, Query, SQLWrapper } from "../../sql/sql.cjs";
|
||||||
|
import { Subquery } from "../../subquery.cjs";
|
||||||
|
import { type DrizzleTypeError, type ValueOrArray } from "../../utils.cjs";
|
||||||
|
import type { CreatePgSelectFromBuilderMode, GetPgSetOperators, LockConfig, LockStrength, PgCreateSetOperatorFn, PgSelectConfig, PgSelectCrossJoinFn, PgSelectDynamic, PgSelectHKT, PgSelectHKTBase, PgSelectJoinFn, PgSelectPrepare, PgSelectWithout, PgSetOperatorExcludedMethods, PgSetOperatorWithResult, SelectedFields, SetOperatorRightSelect, TableLikeHasEmptySelection } from "./select.types.cjs";
|
||||||
|
export declare class PgSelectBuilder<TSelection extends SelectedFields | undefined, TBuilderMode extends 'db' | 'qb' = 'db'> {
|
||||||
|
static readonly [entityKind]: string;
|
||||||
|
private fields;
|
||||||
|
private session;
|
||||||
|
private dialect;
|
||||||
|
private withList;
|
||||||
|
private distinct;
|
||||||
|
constructor(config: {
|
||||||
|
fields: TSelection;
|
||||||
|
session: PgSession | undefined;
|
||||||
|
dialect: PgDialect;
|
||||||
|
withList?: Subquery[];
|
||||||
|
distinct?: boolean | {
|
||||||
|
on: (PgColumn | SQLWrapper)[];
|
||||||
|
};
|
||||||
|
});
|
||||||
|
private authToken?;
|
||||||
|
/**
|
||||||
|
* Specify the table, subquery, or other target that you're
|
||||||
|
* building a select query against.
|
||||||
|
*
|
||||||
|
* {@link https://www.postgresql.org/docs/current/sql-select.html#SQL-FROM | Postgres from documentation}
|
||||||
|
*/
|
||||||
|
from<TFrom extends PgTable | Subquery | PgViewBase | SQL>(source: TableLikeHasEmptySelection<TFrom> extends true ? DrizzleTypeError<"Cannot reference a data-modifying statement subquery if it doesn't contain a `returning` clause"> : TFrom): CreatePgSelectFromBuilderMode<TBuilderMode, GetSelectTableName<TFrom>, TSelection extends undefined ? GetSelectTableSelection<TFrom> : TSelection, TSelection extends undefined ? 'single' : 'partial'>;
|
||||||
|
}
|
||||||
|
export declare abstract class PgSelectQueryBuilderBase<THKT extends PgSelectHKTBase, TTableName extends string | undefined, TSelection extends ColumnsSelection, TSelectMode extends SelectMode, TNullabilityMap extends Record<string, JoinNullability> = TTableName extends string ? Record<TTableName, 'not-null'> : {}, TDynamic extends boolean = false, TExcludedMethods extends string = never, TResult extends any[] = SelectResult<TSelection, TSelectMode, TNullabilityMap>[], TSelectedFields extends ColumnsSelection = BuildSubquerySelection<TSelection, TNullabilityMap>> extends TypedQueryBuilder<TSelectedFields, TResult> {
|
||||||
|
static readonly [entityKind]: string;
|
||||||
|
readonly _: {
|
||||||
|
readonly dialect: 'pg';
|
||||||
|
readonly hkt: THKT;
|
||||||
|
readonly tableName: TTableName;
|
||||||
|
readonly selection: TSelection;
|
||||||
|
readonly selectMode: TSelectMode;
|
||||||
|
readonly nullabilityMap: TNullabilityMap;
|
||||||
|
readonly dynamic: TDynamic;
|
||||||
|
readonly excludedMethods: TExcludedMethods;
|
||||||
|
readonly result: TResult;
|
||||||
|
readonly selectedFields: TSelectedFields;
|
||||||
|
readonly config: PgSelectConfig;
|
||||||
|
};
|
||||||
|
protected config: PgSelectConfig;
|
||||||
|
protected joinsNotNullableMap: Record<string, boolean>;
|
||||||
|
protected tableName: string | undefined;
|
||||||
|
private isPartialSelect;
|
||||||
|
protected session: PgSession | undefined;
|
||||||
|
protected dialect: PgDialect;
|
||||||
|
protected cacheConfig?: WithCacheConfig;
|
||||||
|
protected usedTables: Set<string>;
|
||||||
|
constructor({ table, fields, isPartialSelect, session, dialect, withList, distinct }: {
|
||||||
|
table: PgSelectConfig['table'];
|
||||||
|
fields: PgSelectConfig['fields'];
|
||||||
|
isPartialSelect: boolean;
|
||||||
|
session: PgSession | undefined;
|
||||||
|
dialect: PgDialect;
|
||||||
|
withList: Subquery[];
|
||||||
|
distinct: boolean | {
|
||||||
|
on: (PgColumn | SQLWrapper)[];
|
||||||
|
} | undefined;
|
||||||
|
});
|
||||||
|
private createJoin;
|
||||||
|
/**
|
||||||
|
* Executes a `left join` operation by adding another table to the current query.
|
||||||
|
*
|
||||||
|
* Calling this method associates each row of the table with the corresponding row from the joined table, if a match is found. If no matching row exists, it sets all columns of the joined table to null.
|
||||||
|
*
|
||||||
|
* See docs: {@link https://orm.drizzle.team/docs/joins#left-join}
|
||||||
|
*
|
||||||
|
* @param table the table to join.
|
||||||
|
* @param on the `on` clause.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
*
|
||||||
|
* ```ts
|
||||||
|
* // Select all users and their pets
|
||||||
|
* const usersWithPets: { user: User; pets: Pet | null; }[] = await db.select()
|
||||||
|
* .from(users)
|
||||||
|
* .leftJoin(pets, eq(users.id, pets.ownerId))
|
||||||
|
*
|
||||||
|
* // Select userId and petId
|
||||||
|
* const usersIdsAndPetIds: { userId: number; petId: number | null; }[] = await db.select({
|
||||||
|
* userId: users.id,
|
||||||
|
* petId: pets.id,
|
||||||
|
* })
|
||||||
|
* .from(users)
|
||||||
|
* .leftJoin(pets, eq(users.id, pets.ownerId))
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
leftJoin: PgSelectJoinFn<this, TDynamic, "left", false>;
|
||||||
|
/**
 * Executes a `left join lateral` operation by adding a subquery to the current query.
 *
 * A `lateral` join allows the right-hand expression to refer to columns from the left-hand side.
 *
 * Calling this method associates each row of the table with the corresponding row from the joined table, if a match is found. If no matching row exists, it sets all columns of the joined table to null.
 *
 * See docs: {@link https://orm.drizzle.team/docs/joins#left-join-lateral}
 *
 * @param table the subquery to join.
 * @param on the `on` clause.
 */
leftJoinLateral: PgSelectJoinFn<this, TDynamic, "left", true>;
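/*
 * Illustrative sketch (not from the library docs; `users`, `posts`, and their
 * columns are hypothetical): `leftJoinLateral` with a correlated subquery that
 * picks each user's top post. `sql`true`` is a common `on` clause for lateral
 * joins, since the correlation lives inside the subquery itself.
 *
 * ```ts
 * const topPost = db.select().from(posts)
 *   .where(eq(posts.authorId, users.id))
 *   .orderBy(desc(posts.likes))
 *   .limit(1)
 *   .as('topPost');
 *
 * await db.select().from(users).leftJoinLateral(topPost, sql`true`);
 * ```
 */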
/**
 * Executes a `right join` operation by adding another table to the current query.
 *
 * Calling this method associates each row of the joined table with the corresponding row from the main table, if a match is found. If no matching row exists, it sets all columns of the main table to null.
 *
 * See docs: {@link https://orm.drizzle.team/docs/joins#right-join}
 *
 * @param table the table to join.
 * @param on the `on` clause.
 *
 * @example
 *
 * ```ts
 * // Select all users and their pets
 * const usersWithPets: { user: User | null; pets: Pet; }[] = await db.select()
 *   .from(users)
 *   .rightJoin(pets, eq(users.id, pets.ownerId))
 *
 * // Select userId and petId
 * const usersIdsAndPetIds: { userId: number | null; petId: number; }[] = await db.select({
 *   userId: users.id,
 *   petId: pets.id,
 * })
 *   .from(users)
 *   .rightJoin(pets, eq(users.id, pets.ownerId))
 * ```
 */
rightJoin: PgSelectJoinFn<this, TDynamic, "right", false>;
/**
 * Executes an `inner join` operation, creating a new table by combining rows from two tables that have matching values.
 *
 * Calling this method retrieves rows that have corresponding entries in both joined tables. Rows without matching entries in either table are excluded, resulting in a table that includes only matching pairs.
 *
 * See docs: {@link https://orm.drizzle.team/docs/joins#inner-join}
 *
 * @param table the table to join.
 * @param on the `on` clause.
 *
 * @example
 *
 * ```ts
 * // Select all users and their pets
 * const usersWithPets: { user: User; pets: Pet; }[] = await db.select()
 *   .from(users)
 *   .innerJoin(pets, eq(users.id, pets.ownerId))
 *
 * // Select userId and petId
 * const usersIdsAndPetIds: { userId: number; petId: number; }[] = await db.select({
 *   userId: users.id,
 *   petId: pets.id,
 * })
 *   .from(users)
 *   .innerJoin(pets, eq(users.id, pets.ownerId))
 * ```
 */
innerJoin: PgSelectJoinFn<this, TDynamic, "inner", false>;
/**
 * Executes an `inner join lateral` operation, creating a new table by combining rows from two queries that have matching values.
 *
 * A `lateral` join allows the right-hand expression to refer to columns from the left-hand side.
 *
 * Calling this method retrieves rows that have corresponding entries in both joined tables. Rows without matching entries in either table are excluded, resulting in a table that includes only matching pairs.
 *
 * See docs: {@link https://orm.drizzle.team/docs/joins#inner-join-lateral}
 *
 * @param table the subquery to join.
 * @param on the `on` clause.
 */
innerJoinLateral: PgSelectJoinFn<this, TDynamic, "inner", true>;
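/*
 * Illustrative sketch (hypothetical `users`/`orders` identifiers): an
 * `inner join lateral` keeps only users that have at least one row produced
 * by the correlated subquery, here each user's most recent order.
 *
 * ```ts
 * const lastOrder = db.select().from(orders)
 *   .where(eq(orders.userId, users.id))
 *   .orderBy(desc(orders.createdAt))
 *   .limit(1)
 *   .as('lastOrder');
 *
 * await db.select().from(users).innerJoinLateral(lastOrder, sql`true`);
 * ```
 */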
/**
 * Executes a `full join` operation by combining rows from two tables into a new table.
 *
 * Calling this method retrieves all rows from both main and joined tables, merging rows with matching values and filling in `null` for non-matching columns.
 *
 * See docs: {@link https://orm.drizzle.team/docs/joins#full-join}
 *
 * @param table the table to join.
 * @param on the `on` clause.
 *
 * @example
 *
 * ```ts
 * // Select all users and their pets
 * const usersWithPets: { user: User | null; pets: Pet | null; }[] = await db.select()
 *   .from(users)
 *   .fullJoin(pets, eq(users.id, pets.ownerId))
 *
 * // Select userId and petId
 * const usersIdsAndPetIds: { userId: number | null; petId: number | null; }[] = await db.select({
 *   userId: users.id,
 *   petId: pets.id,
 * })
 *   .from(users)
 *   .fullJoin(pets, eq(users.id, pets.ownerId))
 * ```
 */
fullJoin: PgSelectJoinFn<this, TDynamic, "full", false>;
/**
 * Executes a `cross join` operation by combining rows from two tables into a new table.
 *
 * Calling this method retrieves all rows from both main and joined tables, merging all rows from each table.
 *
 * See docs: {@link https://orm.drizzle.team/docs/joins#cross-join}
 *
 * @param table the table to join.
 *
 * @example
 *
 * ```ts
 * // Select all users, each user with every pet
 * const usersWithPets: { user: User; pets: Pet; }[] = await db.select()
 *   .from(users)
 *   .crossJoin(pets)
 *
 * // Select userId and petId
 * const usersIdsAndPetIds: { userId: number; petId: number; }[] = await db.select({
 *   userId: users.id,
 *   petId: pets.id,
 * })
 *   .from(users)
 *   .crossJoin(pets)
 * ```
 */
crossJoin: PgSelectCrossJoinFn<this, TDynamic, false>;
/**
 * Executes a `cross join lateral` operation by combining rows from two queries into a new table.
 *
 * A `lateral` join allows the right-hand expression to refer to columns from the left-hand side.
 *
 * Calling this method retrieves all rows from both main and joined queries, merging all rows from each query.
 *
 * See docs: {@link https://orm.drizzle.team/docs/joins#cross-join-lateral}
 *
 * @param table the query to join.
 */
crossJoinLateral: PgSelectCrossJoinFn<this, TDynamic, true>;
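/*
 * Illustrative sketch (assumed `users`/`orders` tables): `crossJoinLateral`
 * pairs each outer row with every row of a correlated subquery, and takes no
 * `on` clause.
 *
 * ```ts
 * const recentOrders = db.select().from(orders)
 *   .where(eq(orders.userId, users.id))
 *   .orderBy(desc(orders.createdAt))
 *   .limit(3)
 *   .as('recentOrders');
 *
 * await db.select().from(users).crossJoinLateral(recentOrders);
 * ```
 */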
private createSetOperator;
/**
 * Adds `union` set operator to the query.
 *
 * Calling this method will combine the result sets of the `select` statements and remove any duplicate rows that appear across them.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#union}
 *
 * @example
 *
 * ```ts
 * // Select all unique names from customers and users tables
 * await db.select({ name: users.name })
 *   .from(users)
 *   .union(
 *     db.select({ name: customers.name }).from(customers)
 *   );
 * // or
 * import { union } from 'drizzle-orm/pg-core'
 *
 * await union(
 *   db.select({ name: users.name }).from(users),
 *   db.select({ name: customers.name }).from(customers)
 * );
 * ```
 */
union: <TValue extends PgSetOperatorWithResult<TResult>>(rightSelection: ((setOperators: GetPgSetOperators) => SetOperatorRightSelect<TValue, TResult>) | SetOperatorRightSelect<TValue, TResult>) => PgSelectWithout<this, TDynamic, PgSetOperatorExcludedMethods, true>;
/**
 * Adds `union all` set operator to the query.
 *
 * Calling this method will combine the result-set of the `select` statements and keep all duplicate rows that appear across them.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#union-all}
 *
 * @example
 *
 * ```ts
 * // Select all transaction ids from both online and in-store sales
 * await db.select({ transaction: onlineSales.transactionId })
 *   .from(onlineSales)
 *   .unionAll(
 *     db.select({ transaction: inStoreSales.transactionId }).from(inStoreSales)
 *   );
 * // or
 * import { unionAll } from 'drizzle-orm/pg-core'
 *
 * await unionAll(
 *   db.select({ transaction: onlineSales.transactionId }).from(onlineSales),
 *   db.select({ transaction: inStoreSales.transactionId }).from(inStoreSales)
 * );
 * ```
 */
unionAll: <TValue extends PgSetOperatorWithResult<TResult>>(rightSelection: ((setOperators: GetPgSetOperators) => SetOperatorRightSelect<TValue, TResult>) | SetOperatorRightSelect<TValue, TResult>) => PgSelectWithout<this, TDynamic, PgSetOperatorExcludedMethods, true>;
/**
 * Adds `intersect` set operator to the query.
 *
 * Calling this method will retain only the rows that are present in both result sets and eliminate duplicates.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#intersect}
 *
 * @example
 *
 * ```ts
 * // Select course names that are offered in both departments A and B
 * await db.select({ courseName: depA.courseName })
 *   .from(depA)
 *   .intersect(
 *     db.select({ courseName: depB.courseName }).from(depB)
 *   );
 * // or
 * import { intersect } from 'drizzle-orm/pg-core'
 *
 * await intersect(
 *   db.select({ courseName: depA.courseName }).from(depA),
 *   db.select({ courseName: depB.courseName }).from(depB)
 * );
 * ```
 */
intersect: <TValue extends PgSetOperatorWithResult<TResult>>(rightSelection: ((setOperators: GetPgSetOperators) => SetOperatorRightSelect<TValue, TResult>) | SetOperatorRightSelect<TValue, TResult>) => PgSelectWithout<this, TDynamic, PgSetOperatorExcludedMethods, true>;
/**
 * Adds `intersect all` set operator to the query.
 *
 * Calling this method will retain only the rows that are present in both result sets including all duplicates.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#intersect-all}
 *
 * @example
 *
 * ```ts
 * // Select all products and quantities that are ordered by both regular and VIP customers
 * await db.select({
 *   productId: regularCustomerOrders.productId,
 *   quantityOrdered: regularCustomerOrders.quantityOrdered
 * })
 *   .from(regularCustomerOrders)
 *   .intersectAll(
 *     db.select({
 *       productId: vipCustomerOrders.productId,
 *       quantityOrdered: vipCustomerOrders.quantityOrdered
 *     })
 *       .from(vipCustomerOrders)
 *   );
 * // or
 * import { intersectAll } from 'drizzle-orm/pg-core'
 *
 * await intersectAll(
 *   db.select({
 *     productId: regularCustomerOrders.productId,
 *     quantityOrdered: regularCustomerOrders.quantityOrdered
 *   })
 *     .from(regularCustomerOrders),
 *   db.select({
 *     productId: vipCustomerOrders.productId,
 *     quantityOrdered: vipCustomerOrders.quantityOrdered
 *   })
 *     .from(vipCustomerOrders)
 * );
 * ```
 */
intersectAll: <TValue extends PgSetOperatorWithResult<TResult>>(rightSelection: ((setOperators: GetPgSetOperators) => SetOperatorRightSelect<TValue, TResult>) | SetOperatorRightSelect<TValue, TResult>) => PgSelectWithout<this, TDynamic, PgSetOperatorExcludedMethods, true>;
/**
 * Adds `except` set operator to the query.
 *
 * Calling this method will retrieve all unique rows from the left query, except for the rows that are present in the result set of the right query.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#except}
 *
 * @example
 *
 * ```ts
 * // Select all courses offered in department A but not in department B
 * await db.select({ courseName: depA.courseName })
 *   .from(depA)
 *   .except(
 *     db.select({ courseName: depB.courseName }).from(depB)
 *   );
 * // or
 * import { except } from 'drizzle-orm/pg-core'
 *
 * await except(
 *   db.select({ courseName: depA.courseName }).from(depA),
 *   db.select({ courseName: depB.courseName }).from(depB)
 * );
 * ```
 */
except: <TValue extends PgSetOperatorWithResult<TResult>>(rightSelection: ((setOperators: GetPgSetOperators) => SetOperatorRightSelect<TValue, TResult>) | SetOperatorRightSelect<TValue, TResult>) => PgSelectWithout<this, TDynamic, PgSetOperatorExcludedMethods, true>;
/**
 * Adds `except all` set operator to the query.
 *
 * Calling this method will retrieve all rows from the left query, except for the rows that are present in the result set of the right query.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#except-all}
 *
 * @example
 *
 * ```ts
 * // Select all products that are ordered by regular customers but not by VIP customers
 * await db.select({
 *   productId: regularCustomerOrders.productId,
 *   quantityOrdered: regularCustomerOrders.quantityOrdered,
 * })
 *   .from(regularCustomerOrders)
 *   .exceptAll(
 *     db.select({
 *       productId: vipCustomerOrders.productId,
 *       quantityOrdered: vipCustomerOrders.quantityOrdered,
 *     })
 *       .from(vipCustomerOrders)
 *   );
 * // or
 * import { exceptAll } from 'drizzle-orm/pg-core'
 *
 * await exceptAll(
 *   db.select({
 *     productId: regularCustomerOrders.productId,
 *     quantityOrdered: regularCustomerOrders.quantityOrdered
 *   })
 *     .from(regularCustomerOrders),
 *   db.select({
 *     productId: vipCustomerOrders.productId,
 *     quantityOrdered: vipCustomerOrders.quantityOrdered
 *   })
 *     .from(vipCustomerOrders)
 * );
 * ```
 */
exceptAll: <TValue extends PgSetOperatorWithResult<TResult>>(rightSelection: ((setOperators: GetPgSetOperators) => SetOperatorRightSelect<TValue, TResult>) | SetOperatorRightSelect<TValue, TResult>) => PgSelectWithout<this, TDynamic, PgSetOperatorExcludedMethods, true>;
/**
 * Adds a `where` clause to the query.
 *
 * Calling this method will select only those rows that fulfill a specified condition.
 *
 * See docs: {@link https://orm.drizzle.team/docs/select#filtering}
 *
 * @param where the `where` clause.
 *
 * @example
 * You can use conditional operators and the `sql` function to filter the rows to be selected.
 *
 * ```ts
 * // Select all cars with green color
 * await db.select().from(cars).where(eq(cars.color, 'green'));
 * // or
 * await db.select().from(cars).where(sql`${cars.color} = 'green'`)
 * ```
 *
 * You can logically combine conditional operators with `and()` and `or()` operators:
 *
 * ```ts
 * // Select all BMW cars with a green color
 * await db.select().from(cars).where(and(eq(cars.color, 'green'), eq(cars.brand, 'BMW')));
 *
 * // Select all cars with green or blue color
 * await db.select().from(cars).where(or(eq(cars.color, 'green'), eq(cars.color, 'blue')));
 * ```
 */
where(where: ((aliases: this['_']['selection']) => SQL | undefined) | SQL | undefined): PgSelectWithout<this, TDynamic, 'where'>;
/**
 * Adds a `having` clause to the query.
 *
 * Calling this method will select only those rows that fulfill a specified condition. It is typically used with aggregate functions to filter the aggregated data based on a specified condition.
 *
 * See docs: {@link https://orm.drizzle.team/docs/select#aggregations}
 *
 * @param having the `having` clause.
 *
 * @example
 *
 * ```ts
 * // Select all brands with more than one car
 * await db.select({
 *   brand: cars.brand,
 *   count: sql<number>`cast(count(${cars.id}) as int)`,
 * })
 *   .from(cars)
 *   .groupBy(cars.brand)
 *   .having(({ count }) => gt(count, 1));
 * ```
 */
having(having: ((aliases: this['_']['selection']) => SQL | undefined) | SQL | undefined): PgSelectWithout<this, TDynamic, 'having'>;
/**
 * Adds a `group by` clause to the query.
 *
 * Calling this method will group rows that have the same values into summary rows, often used for aggregation purposes.
 *
 * See docs: {@link https://orm.drizzle.team/docs/select#aggregations}
 *
 * @example
 *
 * ```ts
 * // Group and count people by their last names
 * await db.select({
 *   lastName: people.lastName,
 *   count: sql<number>`cast(count(*) as int)`
 * })
 *   .from(people)
 *   .groupBy(people.lastName);
 * ```
 */
groupBy(builder: (aliases: this['_']['selection']) => ValueOrArray<PgColumn | SQL | SQL.Aliased>): PgSelectWithout<this, TDynamic, 'groupBy'>;
groupBy(...columns: (PgColumn | SQL | SQL.Aliased)[]): PgSelectWithout<this, TDynamic, 'groupBy'>;
/**
 * Adds an `order by` clause to the query.
 *
 * Calling this method will sort the result-set in ascending or descending order. By default, the sort order is ascending.
 *
 * See docs: {@link https://orm.drizzle.team/docs/select#order-by}
 *
 * @example
 *
 * ```ts
 * // Select cars ordered by year
 * await db.select().from(cars).orderBy(cars.year);
 * ```
 *
 * You can specify whether results are in ascending or descending order with the `asc()` and `desc()` operators.
 *
 * ```ts
 * // Select cars ordered by year in descending order
 * await db.select().from(cars).orderBy(desc(cars.year));
 *
 * // Select cars ordered by year and price
 * await db.select().from(cars).orderBy(asc(cars.year), desc(cars.price));
 * ```
 */
orderBy(builder: (aliases: this['_']['selection']) => ValueOrArray<PgColumn | SQL | SQL.Aliased>): PgSelectWithout<this, TDynamic, 'orderBy'>;
orderBy(...columns: (PgColumn | SQL | SQL.Aliased)[]): PgSelectWithout<this, TDynamic, 'orderBy'>;
/**
 * Adds a `limit` clause to the query.
 *
 * Calling this method will set the maximum number of rows that will be returned by this query.
 *
 * See docs: {@link https://orm.drizzle.team/docs/select#limit--offset}
 *
 * @param limit the `limit` clause.
 *
 * @example
 *
 * ```ts
 * // Get the first 10 people from this query.
 * await db.select().from(people).limit(10);
 * ```
 */
limit(limit: number | Placeholder): PgSelectWithout<this, TDynamic, 'limit'>;
/**
 * Adds an `offset` clause to the query.
 *
 * Calling this method will skip a number of rows when returning results from this query.
 *
 * See docs: {@link https://orm.drizzle.team/docs/select#limit--offset}
 *
 * @param offset the `offset` clause.
 *
 * @example
 *
 * ```ts
 * // Get people 11 through 20 from this query.
 * await db.select().from(people).offset(10).limit(10);
 * ```
 */
offset(offset: number | Placeholder): PgSelectWithout<this, TDynamic, 'offset'>;
/**
 * Adds a `for` clause to the query.
 *
 * Calling this method will specify a lock strength for this query that controls how strictly it acquires exclusive access to the rows being queried.
 *
 * See docs: {@link https://www.postgresql.org/docs/current/sql-select.html#SQL-FOR-UPDATE-SHARE}
 *
 * @param strength the lock strength.
 * @param config the lock configuration.
 */
for(strength: LockStrength, config?: LockConfig): PgSelectWithout<this, TDynamic, 'for'>;
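/*
 * Illustrative sketch (hypothetical `users` table and `status` column):
 * acquiring row locks with `for`, producing SQL along the lines of
 * `SELECT ... FOR UPDATE SKIP LOCKED`.
 *
 * ```ts
 * await db.select().from(users)
 *   .where(eq(users.status, 'pending'))
 *   .for('update', { skipLocked: true });
 * ```
 */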
toSQL(): Query;
as<TAlias extends string>(alias: TAlias): SubqueryWithSelection<this['_']['selectedFields'], TAlias>;
$dynamic(): PgSelectDynamic<this>;
$withCache(config?: {
    config?: CacheConfig;
    tag?: string;
    autoInvalidate?: boolean;
} | false): this;
}
export interface PgSelectBase<TTableName extends string | undefined, TSelection extends ColumnsSelection, TSelectMode extends SelectMode, TNullabilityMap extends Record<string, JoinNullability> = TTableName extends string ? Record<TTableName, 'not-null'> : {}, TDynamic extends boolean = false, TExcludedMethods extends string = never, TResult extends any[] = SelectResult<TSelection, TSelectMode, TNullabilityMap>[], TSelectedFields extends ColumnsSelection = BuildSubquerySelection<TSelection, TNullabilityMap>> extends PgSelectQueryBuilderBase<PgSelectHKT, TTableName, TSelection, TSelectMode, TNullabilityMap, TDynamic, TExcludedMethods, TResult, TSelectedFields>, QueryPromise<TResult>, SQLWrapper {
}
export declare class PgSelectBase<TTableName extends string | undefined, TSelection extends ColumnsSelection, TSelectMode extends SelectMode, TNullabilityMap extends Record<string, JoinNullability> = TTableName extends string ? Record<TTableName, 'not-null'> : {}, TDynamic extends boolean = false, TExcludedMethods extends string = never, TResult = SelectResult<TSelection, TSelectMode, TNullabilityMap>[], TSelectedFields = BuildSubquerySelection<TSelection, TNullabilityMap>> extends PgSelectQueryBuilderBase<PgSelectHKT, TTableName, TSelection, TSelectMode, TNullabilityMap, TDynamic, TExcludedMethods, TResult, TSelectedFields> implements RunnableQuery<TResult, 'pg'>, SQLWrapper {
static readonly [entityKind]: string;
/**
 * Create a prepared statement for this query. This allows
 * the database to remember this query for the given session
 * and call it by name, rather than specifying the full query.
 *
 * {@link https://www.postgresql.org/docs/current/sql-prepare.html | Postgres prepare documentation}
 */
prepare(name: string): PgSelectPrepare<this>;
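/*
 * Illustrative sketch (hypothetical `users` table): preparing a query once
 * with a named placeholder, then executing it with concrete values.
 *
 * ```ts
 * const byId = db.select().from(users)
 *   .where(eq(users.id, sql.placeholder('id')))
 *   .prepare('user_by_id');
 *
 * const rows = await byId.execute({ id: 42 });
 * ```
 */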
private authToken?;
execute: ReturnType<this['prepare']>['execute'];
}
/**
 * Adds `union` set operator to the query.
 *
 * Calling this method will combine the result sets of the `select` statements and remove any duplicate rows that appear across them.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#union}
 *
 * @example
 *
 * ```ts
 * // Select all unique names from customers and users tables
 * import { union } from 'drizzle-orm/pg-core'
 *
 * await union(
 *   db.select({ name: users.name }).from(users),
 *   db.select({ name: customers.name }).from(customers)
 * );
 * // or
 * await db.select({ name: users.name })
 *   .from(users)
 *   .union(
 *     db.select({ name: customers.name }).from(customers)
 *   );
 * ```
 */
export declare const union: PgCreateSetOperatorFn;
/**
 * Adds `union all` set operator to the query.
 *
 * Calling this method will combine the result-set of the `select` statements and keep all duplicate rows that appear across them.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#union-all}
 *
 * @example
 *
 * ```ts
 * // Select all transaction ids from both online and in-store sales
 * import { unionAll } from 'drizzle-orm/pg-core'
 *
 * await unionAll(
 *   db.select({ transaction: onlineSales.transactionId }).from(onlineSales),
 *   db.select({ transaction: inStoreSales.transactionId }).from(inStoreSales)
 * );
 * // or
 * await db.select({ transaction: onlineSales.transactionId })
 *   .from(onlineSales)
 *   .unionAll(
 *     db.select({ transaction: inStoreSales.transactionId }).from(inStoreSales)
 *   );
 * ```
 */
export declare const unionAll: PgCreateSetOperatorFn;
/**
 * Adds `intersect` set operator to the query.
 *
 * Calling this method will retain only the rows that are present in both result sets and eliminate duplicates.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#intersect}
 *
 * @example
 *
 * ```ts
 * // Select course names that are offered in both departments A and B
 * import { intersect } from 'drizzle-orm/pg-core'
 *
 * await intersect(
 *   db.select({ courseName: depA.courseName }).from(depA),
 *   db.select({ courseName: depB.courseName }).from(depB)
 * );
 * // or
 * await db.select({ courseName: depA.courseName })
 *   .from(depA)
 *   .intersect(
 *     db.select({ courseName: depB.courseName }).from(depB)
 *   );
 * ```
 */
export declare const intersect: PgCreateSetOperatorFn;
/**
 * Adds `intersect all` set operator to the query.
 *
 * Calling this method will retain only the rows that are present in both result sets including all duplicates.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#intersect-all}
 *
 * @example
 *
 * ```ts
 * // Select all products and quantities that are ordered by both regular and VIP customers
 * import { intersectAll } from 'drizzle-orm/pg-core'
 *
 * await intersectAll(
 *   db.select({
 *     productId: regularCustomerOrders.productId,
 *     quantityOrdered: regularCustomerOrders.quantityOrdered
 *   })
 *     .from(regularCustomerOrders),
 *   db.select({
 *     productId: vipCustomerOrders.productId,
 *     quantityOrdered: vipCustomerOrders.quantityOrdered
 *   })
 *     .from(vipCustomerOrders)
 * );
 * // or
 * await db.select({
 *   productId: regularCustomerOrders.productId,
 *   quantityOrdered: regularCustomerOrders.quantityOrdered
 * })
 *   .from(regularCustomerOrders)
 *   .intersectAll(
 *     db.select({
 *       productId: vipCustomerOrders.productId,
 *       quantityOrdered: vipCustomerOrders.quantityOrdered
 *     })
 *       .from(vipCustomerOrders)
 *   );
 * ```
 */
export declare const intersectAll: PgCreateSetOperatorFn;
/**
 * Adds `except` set operator to the query.
 *
 * Calling this method will retrieve all unique rows from the left query, except for the rows that are present in the result set of the right query.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#except}
 *
 * @example
 *
 * ```ts
 * // Select all courses offered in department A but not in department B
 * import { except } from 'drizzle-orm/pg-core'
 *
 * await except(
 *   db.select({ courseName: depA.courseName }).from(depA),
 *   db.select({ courseName: depB.courseName }).from(depB)
 * );
 * // or
 * await db.select({ courseName: depA.courseName })
 *   .from(depA)
 *   .except(
 *     db.select({ courseName: depB.courseName }).from(depB)
 *   );
 * ```
 */
export declare const except: PgCreateSetOperatorFn;
/**
 * Adds `except all` set operator to the query.
 *
 * Calling this method will retrieve all rows from the left query, except for the rows that are present in the result set of the right query.
 *
 * See docs: {@link https://orm.drizzle.team/docs/set-operations#except-all}
 *
 * @example
 *
 * ```ts
 * // Select all products that are ordered by regular customers but not by VIP customers
 * import { exceptAll } from 'drizzle-orm/pg-core'
 *
 * await exceptAll(
 *   db.select({
 *     productId: regularCustomerOrders.productId,
 *     quantityOrdered: regularCustomerOrders.quantityOrdered
 *   })
 *     .from(regularCustomerOrders),
 *   db.select({
 *     productId: vipCustomerOrders.productId,
 *     quantityOrdered: vipCustomerOrders.quantityOrdered
 *   })
 *     .from(vipCustomerOrders)
 * );
 * // or
 * await db.select({
 *   productId: regularCustomerOrders.productId,
 *   quantityOrdered: regularCustomerOrders.quantityOrdered,
 * })
 *   .from(regularCustomerOrders)
 *   .exceptAll(
 *     db.select({
 *       productId: vipCustomerOrders.productId,
 *       quantityOrdered: vipCustomerOrders.quantityOrdered,
 *     })
 *       .from(vipCustomerOrders)
 *   );
 * ```
 */
export declare const exceptAll: PgCreateSetOperatorFn;
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var indexes_exports = {};
__export(indexes_exports, {
  Index: () => Index,
  IndexBuilder: () => IndexBuilder,
  IndexBuilderOn: () => IndexBuilderOn,
  index: () => index,
  uniqueIndex: () => uniqueIndex
});
module.exports = __toCommonJS(indexes_exports);
var import_sql = require("../sql/sql.cjs");
var import_entity = require("../entity.cjs");
var import_columns = require("./columns/index.cjs");
class IndexBuilderOn {
  constructor(unique, name) {
    this.unique = unique;
    this.name = name;
  }
  static [import_entity.entityKind] = "GelIndexBuilderOn";
  on(...columns) {
    return new IndexBuilder(
      columns.map((it) => {
        if ((0, import_entity.is)(it, import_sql.SQL)) {
          return it;
        }
        it = it;
        const clonedIndexedColumn = new import_columns.IndexedColumn(it.name, !!it.keyAsName, it.columnType, it.indexConfig);
        it.indexConfig = JSON.parse(JSON.stringify(it.defaultConfig));
        return clonedIndexedColumn;
      }),
      this.unique,
      false,
      this.name
    );
  }
  onOnly(...columns) {
    return new IndexBuilder(
      columns.map((it) => {
        if ((0, import_entity.is)(it, import_sql.SQL)) {
          return it;
        }
        it = it;
        const clonedIndexedColumn = new import_columns.IndexedColumn(it.name, !!it.keyAsName, it.columnType, it.indexConfig);
        it.indexConfig = it.defaultConfig;
        return clonedIndexedColumn;
      }),
      this.unique,
      true,
      this.name
    );
  }
  /**
|
||||||
|
* Specify what index method to use. Choices are `btree`, `hash`, `gist`, `sGelist`, `gin`, `brin`, or user-installed access methods like `bloom`. The default method is `btree.
|
||||||
|
*
|
||||||
|
* If you have the `Gel_vector` extension installed in your database, you can use the `hnsw` and `ivfflat` options, which are predefined types.
|
||||||
|
*
|
||||||
|
* **You can always specify any string you want in the method, in case Drizzle doesn't have it natively in its types**
|
||||||
|
*
|
||||||
|
* @param method The name of the index method to be used
|
||||||
|
* @param columns
|
||||||
|
* @returns
|
||||||
|
*/
|
||||||
|
using(method, ...columns) {
|
||||||
|
return new IndexBuilder(
|
||||||
|
columns.map((it) => {
|
||||||
|
if ((0, import_entity.is)(it, import_sql.SQL)) {
|
||||||
|
return it;
|
||||||
|
}
|
||||||
|
it = it;
|
||||||
|
const clonedIndexedColumn = new import_columns.IndexedColumn(it.name, !!it.keyAsName, it.columnType, it.indexConfig);
|
||||||
|
it.indexConfig = JSON.parse(JSON.stringify(it.defaultConfig));
|
||||||
|
return clonedIndexedColumn;
|
||||||
|
}),
|
||||||
|
this.unique,
|
||||||
|
true,
|
||||||
|
this.name,
|
||||||
|
method
|
||||||
|
);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
class IndexBuilder {
|
||||||
|
static [import_entity.entityKind] = "GelIndexBuilder";
|
||||||
|
/** @internal */
|
||||||
|
config;
|
||||||
|
constructor(columns, unique, only, name, method = "btree") {
|
||||||
|
this.config = {
|
||||||
|
name,
|
||||||
|
columns,
|
||||||
|
unique,
|
||||||
|
only,
|
||||||
|
method
|
||||||
|
};
|
||||||
|
}
|
||||||
|
concurrently() {
|
||||||
|
this.config.concurrently = true;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
with(obj) {
|
||||||
|
this.config.with = obj;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
where(condition) {
|
||||||
|
this.config.where = condition;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
/** @internal */
|
||||||
|
build(table) {
|
||||||
|
return new Index(this.config, table);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
class Index {
|
||||||
|
static [import_entity.entityKind] = "GelIndex";
|
||||||
|
config;
|
||||||
|
constructor(config, table) {
|
||||||
|
this.config = { ...config, table };
|
||||||
|
}
|
||||||
|
}
|
||||||
|
function index(name) {
|
||||||
|
return new IndexBuilderOn(false, name);
|
||||||
|
}
|
||||||
|
function uniqueIndex(name) {
|
||||||
|
return new IndexBuilderOn(true, name);
|
||||||
|
}
|
||||||
|
// Annotate the CommonJS export names for ESM import in node:
|
||||||
|
0 && (module.exports = {
|
||||||
|
Index,
|
||||||
|
IndexBuilder,
|
||||||
|
IndexBuilderOn,
|
||||||
|
index,
|
||||||
|
uniqueIndex
|
||||||
|
});
|
||||||
|
//# sourceMappingURL=indexes.cjs.map
|
||||||
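The `__export`/`__copyProps`/`__toCommonJS` preamble in the file above is esbuild's CommonJS interop shim: exports are installed as getters so the exported bindings stay live even if the module reassigns them later. A minimal, self-contained sketch of that mechanism (variable names here are illustrative, not part of the module):

```javascript
// Sketch of the esbuild-style export helper: exports are defined as
// enumerable getters, so later reassignments in the module body remain
// visible to consumers of the exports object.
var __defProp = Object.defineProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};

let counter = 0;
const exportsObj = {};
// Define the export while `counter` is still 0...
__export(exportsObj, { counter: () => counter });

counter = 42; // ...then reassign the binding.
// The getter reads the current value, so exportsObj.counter is 42.
const current = exportsObj.counter;
```

This is why the generated files export `{ get: all[name] }` thunks instead of copying values once at module-evaluation time.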
@@ -0,0 +1 @@
{"version":3,"sources":["../../../src/singlestore-core/columns/year.ts"],"sourcesContent":["import type { ColumnBuilderBaseConfig, ColumnBuilderRuntimeConfig, MakeColumnConfig } from '~/column-builder.ts';\nimport type { ColumnBaseConfig } from '~/column.ts';\nimport { entityKind } from '~/entity.ts';\nimport type { AnySingleStoreTable } from '~/singlestore-core/table.ts';\nimport { SingleStoreColumn, SingleStoreColumnBuilder } from './common.ts';\n\nexport type SingleStoreYearBuilderInitial<TName extends string> = SingleStoreYearBuilder<{\n\tname: TName;\n\tdataType: 'number';\n\tcolumnType: 'SingleStoreYear';\n\tdata: number;\n\tdriverParam: number;\n\tenumValues: undefined;\n\tgenerated: undefined;\n}>;\n\nexport class SingleStoreYearBuilder<T extends ColumnBuilderBaseConfig<'number', 'SingleStoreYear'>>\n\textends SingleStoreColumnBuilder<T>\n{\n\tstatic override readonly [entityKind]: string = 'SingleStoreYearBuilder';\n\n\tconstructor(name: T['name']) {\n\t\tsuper(name, 'number', 'SingleStoreYear');\n\t}\n\n\t/** @internal */\n\toverride build<TTableName extends string>(\n\t\ttable: AnySingleStoreTable<{ name: TTableName }>,\n\t): SingleStoreYear<MakeColumnConfig<T, TTableName>> {\n\t\treturn new SingleStoreYear<MakeColumnConfig<T, TTableName>>(\n\t\t\ttable,\n\t\t\tthis.config as ColumnBuilderRuntimeConfig<any, any>,\n\t\t);\n\t}\n}\n\nexport class SingleStoreYear<\n\tT extends ColumnBaseConfig<'number', 'SingleStoreYear'>,\n> extends SingleStoreColumn<T> {\n\tstatic override readonly [entityKind]: string = 'SingleStoreYear';\n\n\tgetSQLType(): string {\n\t\treturn `year`;\n\t}\n}\n\nexport function year(): SingleStoreYearBuilderInitial<''>;\nexport function year<TName extends string>(name: TName): SingleStoreYearBuilderInitial<TName>;\nexport function year(name?: string) {\n\treturn new SingleStoreYearBuilder(name ?? 
'');\n}\n"],"mappings":";;;;;;;;;;;;;;;;;;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAEA,oBAA2B;AAE3B,oBAA4D;AAYrD,MAAM,+BACJ,uCACT;AAAA,EACC,QAA0B,wBAAU,IAAY;AAAA,EAEhD,YAAY,MAAiB;AAC5B,UAAM,MAAM,UAAU,iBAAiB;AAAA,EACxC;AAAA;AAAA,EAGS,MACR,OACmD;AACnD,WAAO,IAAI;AAAA,MACV;AAAA,MACA,KAAK;AAAA,IACN;AAAA,EACD;AACD;AAEO,MAAM,wBAEH,gCAAqB;AAAA,EAC9B,QAA0B,wBAAU,IAAY;AAAA,EAEhD,aAAqB;AACpB,WAAO;AAAA,EACR;AACD;AAIO,SAAS,KAAK,MAAe;AACnC,SAAO,IAAI,uBAAuB,QAAQ,EAAE;AAC7C;","names":[]}
File diff suppressed because one or more lines are too long
@@ -0,0 +1 @@
{"version":3,"file":"concatMap.js","sourceRoot":"","sources":["../../../../src/internal/operators/concatMap.ts"],"names":[],"mappings":";;;AAAA,uCAAsC;AAEtC,iDAAgD;AA2EhD,SAAgB,SAAS,CACvB,OAAuC,EACvC,cAA6G;IAE7G,OAAO,uBAAU,CAAC,cAAc,CAAC,CAAC,CAAC,CAAC,mBAAQ,CAAC,OAAO,EAAE,cAAc,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC,mBAAQ,CAAC,OAAO,EAAE,CAAC,CAAC,CAAC;AAClG,CAAC;AALD,8BAKC"}
@@ -0,0 +1,5 @@
'use strict'

module.exports.isClean = Symbol('isClean')

module.exports.my = Symbol('my')
@@ -0,0 +1,34 @@
var defer = require('./defer.js');

// API
module.exports = async;

/**
 * Runs provided callback asynchronously
 * even if callback itself is not
 *
 * @param {function} callback - callback to invoke
 * @returns {function} - augmented callback
 */
function async(callback)
{
  var isAsync = false;

  // check if async happened
  defer(function() { isAsync = true; });

  return function async_callback(err, result)
  {
    if (isAsync)
    {
      callback(err, result);
    }
    else
    {
      defer(function nextTick_callback()
      {
        callback(err, result);
      });
    }
  };
}
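The wrapper above guarantees a callback is always invoked asynchronously, even when the caller fires it synchronously (e.g. a cache hit), which keeps control flow predictable. A dependency-free sketch of the same idea (`makeAsync` is a hypothetical name, using `setImmediate` in place of the module's `defer`):

```javascript
// Hypothetical re-sketch of the async-wrapping pattern: if the wrapper
// is invoked before one tick has elapsed, the callback is deferred to
// the next tick; afterwards it is called through directly.
function makeAsync(callback) {
  let isAsync = false;
  // One tick after creation, direct invocation becomes safe.
  setImmediate(() => { isAsync = true; });
  return function (err, result) {
    if (isAsync) {
      callback(err, result);
    } else {
      // Too early: defer so the caller's code after the call runs first.
      setImmediate(() => callback(err, result));
    }
  };
}

const order = [];
const wrapped = makeAsync(() => order.push('callback'));
wrapped(null, 'value');   // called synchronously here...
order.push('after-call'); // ...yet 'after-call' is recorded first
```

This avoids the "released Zalgo" problem where a callback sometimes runs before and sometimes after the statements following the call.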
@@ -0,0 +1 @@
{"version":3,"sources":["../../../src/gel-core/columns/localdate.ts"],"sourcesContent":["import type { LocalDate } from 'gel';\nimport type { ColumnBuilderBaseConfig, ColumnBuilderRuntimeConfig, MakeColumnConfig } from '~/column-builder.ts';\nimport type { ColumnBaseConfig } from '~/column.ts';\nimport { entityKind } from '~/entity.ts';\nimport type { AnyGelTable } from '~/gel-core/table.ts';\nimport { GelColumn } from './common.ts';\nimport { GelLocalDateColumnBaseBuilder } from './date.common.ts';\n\nexport type GelLocalDateStringBuilderInitial<TName extends string> = GelLocalDateStringBuilder<{\n\tname: TName;\n\tdataType: 'localDate';\n\tcolumnType: 'GelLocalDateString';\n\tdata: LocalDate;\n\tdriverParam: LocalDate;\n\tenumValues: undefined;\n}>;\n\nexport class GelLocalDateStringBuilder<T extends ColumnBuilderBaseConfig<'localDate', 'GelLocalDateString'>>\n\textends GelLocalDateColumnBaseBuilder<T>\n{\n\tstatic override readonly [entityKind]: string = 'GelLocalDateStringBuilder';\n\n\tconstructor(name: T['name']) {\n\t\tsuper(name, 'localDate', 'GelLocalDateString');\n\t}\n\n\t/** @internal */\n\toverride build<TTableName extends string>(\n\t\ttable: AnyGelTable<{ name: TTableName }>,\n\t): GelLocalDateString<MakeColumnConfig<T, TTableName>> {\n\t\treturn new GelLocalDateString<MakeColumnConfig<T, TTableName>>(\n\t\t\ttable,\n\t\t\tthis.config as ColumnBuilderRuntimeConfig<any, any>,\n\t\t);\n\t}\n}\n\nexport class GelLocalDateString<T extends ColumnBaseConfig<'localDate', 'GelLocalDateString'>> extends GelColumn<T> {\n\tstatic override readonly [entityKind]: string = 'GelLocalDateString';\n\n\tgetSQLType(): string {\n\t\treturn 'cal::local_date';\n\t}\n}\n\nexport function localDate(): GelLocalDateStringBuilderInitial<''>;\nexport function localDate<TName extends string>(name: TName): GelLocalDateStringBuilderInitial<TName>;\nexport function localDate(name?: string) {\n\treturn new GelLocalDateStringBuilder(name ?? 
'');\n}\n"],"mappings":"AAGA,SAAS,kBAAkB;AAE3B,SAAS,iBAAiB;AAC1B,SAAS,qCAAqC;AAWvC,MAAM,kCACJ,8BACT;AAAA,EACC,QAA0B,UAAU,IAAY;AAAA,EAEhD,YAAY,MAAiB;AAC5B,UAAM,MAAM,aAAa,oBAAoB;AAAA,EAC9C;AAAA;AAAA,EAGS,MACR,OACsD;AACtD,WAAO,IAAI;AAAA,MACV;AAAA,MACA,KAAK;AAAA,IACN;AAAA,EACD;AACD;AAEO,MAAM,2BAA0F,UAAa;AAAA,EACnH,QAA0B,UAAU,IAAY;AAAA,EAEhD,aAAqB;AACpB,WAAO;AAAA,EACR;AACD;AAIO,SAAS,UAAU,MAAe;AACxC,SAAO,IAAI,0BAA0B,QAAQ,EAAE;AAChD;","names":[]}
@@ -0,0 +1 @@
{"version":3,"file":"refCount.js","sourceRoot":"","sources":["../../../../src/internal/operators/refCount.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,OAAO,EAAE,MAAM,cAAc,CAAC;AACvC,OAAO,EAAE,wBAAwB,EAAE,MAAM,sBAAsB,CAAC;AA4DhE,MAAM,UAAU,QAAQ;IACtB,OAAO,OAAO,CAAC,UAAC,MAAM,EAAE,UAAU;QAChC,IAAI,UAAU,GAAwB,IAAI,CAAC;QAE1C,MAAc,CAAC,SAAS,EAAE,CAAC;QAE5B,IAAM,UAAU,GAAG,wBAAwB,CAAC,UAAU,EAAE,SAAS,EAAE,SAAS,EAAE,SAAS,EAAE;YACvF,IAAI,CAAC,MAAM,IAAK,MAAc,CAAC,SAAS,IAAI,CAAC,IAAI,CAAC,GAAG,EAAG,MAAc,CAAC,SAAS,EAAE;gBAChF,UAAU,GAAG,IAAI,CAAC;gBAClB,OAAO;aACR;YA2BD,IAAM,gBAAgB,GAAI,MAAc,CAAC,WAAW,CAAC;YACrD,IAAM,IAAI,GAAG,UAAU,CAAC;YACxB,UAAU,GAAG,IAAI,CAAC;YAElB,IAAI,gBAAgB,IAAI,CAAC,CAAC,IAAI,IAAI,gBAAgB,KAAK,IAAI,CAAC,EAAE;gBAC5D,gBAAgB,CAAC,WAAW,EAAE,CAAC;aAChC;YAED,UAAU,CAAC,WAAW,EAAE,CAAC;QAC3B,CAAC,CAAC,CAAC;QAEH,MAAM,CAAC,SAAS,CAAC,UAAU,CAAC,CAAC;QAE7B,IAAI,CAAC,UAAU,CAAC,MAAM,EAAE;YACtB,UAAU,GAAI,MAAmC,CAAC,OAAO,EAAE,CAAC;SAC7D;IACH,CAAC,CAAC,CAAC;AACL,CAAC"}
@@ -0,0 +1,138 @@
# combined-stream

A stream that emits multiple other streams one after another.

**NB** Currently `combined-stream` works with streams version 1 only. There is ongoing effort to switch this library to streams version 2. Any help is welcome. :) Meanwhile you can explore other libraries that provide streams2 support with more or less compatibility with `combined-stream`.

- [combined-stream2](https://www.npmjs.com/package/combined-stream2): A drop-in streams2-compatible replacement for the combined-stream module.

- [multistream](https://www.npmjs.com/package/multistream): A stream that emits multiple other streams one after another.

## Installation

``` bash
npm install combined-stream
```

## Usage

Here is a simple example that shows how you can use combined-stream to combine
two files into one:

``` javascript
var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));

combinedStream.pipe(fs.createWriteStream('combined.txt'));
```

While the example above works great, it will pause all source streams until
they are needed. If you don't want that to happen, you can set `pauseStreams`
to `false`:

``` javascript
var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create({pauseStreams: false});
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));

combinedStream.pipe(fs.createWriteStream('combined.txt'));
```

However, what if you don't have all the source streams yet, or you don't want
to allocate the resources (file descriptors, memory, etc.) for them right away?
Well, in that case you can simply provide a callback that supplies the stream
by calling a `next()` function:

``` javascript
var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();
combinedStream.append(function(next) {
  next(fs.createReadStream('file1.txt'));
});
combinedStream.append(function(next) {
  next(fs.createReadStream('file2.txt'));
});

combinedStream.pipe(fs.createWriteStream('combined.txt'));
```
## API

### CombinedStream.create([options])

Returns a new combined stream object. Available options are:

* `maxDataSize`
* `pauseStreams`

The effect of these options is described below.

### combinedStream.pauseStreams = `true`

Whether to apply back pressure to the underlying streams. If set to `false`,
the underlying streams will never be paused. If set to `true`, the
underlying streams will be paused right after being appended, as well as when
`delayedStream.pipe()` wants to throttle.

### combinedStream.maxDataSize = `2 * 1024 * 1024`

The maximum number of bytes (or characters) to buffer for all source streams.
If this value is exceeded, `combinedStream` emits an `'error'` event.

### combinedStream.dataSize = `0`

The number of bytes (or characters) currently buffered by `combinedStream`.

### combinedStream.append(stream)

Appends the given `stream` to the combinedStream object. If `pauseStreams` is
set to `true`, this stream will also be paused right away.

`stream` can also be a function that takes one parameter called `next`. `next`
is a function that must be invoked in order to provide the next stream, see
the example above.

Regardless of how the `stream` is appended, combined-stream always attaches an
`'error'` listener to it, so you don't have to do that manually.

Special case: `stream` can also be a String or Buffer.

### combinedStream.write(data)

You should not call this; `combinedStream` takes care of piping the appended
streams into itself for you.

### combinedStream.resume()

Causes `combinedStream` to start draining the streams it manages. The function
is idempotent, and also emits a `'resume'` event each time, which usually goes
to the stream that is currently being drained.

### combinedStream.pause();

If `combinedStream.pauseStreams` is set to `false`, this does nothing.
Otherwise a `'pause'` event is emitted; this goes to the stream that is
currently being drained, so you can use it to apply back pressure.

### combinedStream.end();

Sets `combinedStream.writable` to `false`, emits an `'end'` event, and removes
all streams from the queue.

### combinedStream.destroy();

Same as `combinedStream.end()`, except it emits a `'close'` event instead of
`'end'`.

## License

combined-stream is licensed under the MIT license.
@@ -0,0 +1,14 @@
import { AsyncAction } from './AsyncAction';
import { Subscription } from '../Subscription';
import { QueueScheduler } from './QueueScheduler';
import { SchedulerAction } from '../types';
import { TimerHandle } from './timerHandle';
export declare class QueueAction<T> extends AsyncAction<T> {
    protected scheduler: QueueScheduler;
    protected work: (this: SchedulerAction<T>, state?: T) => void;
    constructor(scheduler: QueueScheduler, work: (this: SchedulerAction<T>, state?: T) => void);
    schedule(state?: T, delay?: number): Subscription;
    execute(state: T, delay: number): any;
    protected requestAsyncId(scheduler: QueueScheduler, id?: TimerHandle, delay?: number): TimerHandle;
}
//# sourceMappingURL=QueueAction.d.ts.map
@@ -0,0 +1,714 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except2, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except2)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var select_exports = {};
__export(select_exports, {
  SQLiteSelectBase: () => SQLiteSelectBase,
  SQLiteSelectBuilder: () => SQLiteSelectBuilder,
  SQLiteSelectQueryBuilderBase: () => SQLiteSelectQueryBuilderBase,
  except: () => except,
  intersect: () => intersect,
  union: () => union,
  unionAll: () => unionAll
});
module.exports = __toCommonJS(select_exports);
var import_entity = require("../../entity.cjs");
var import_query_builder = require("../../query-builders/query-builder.cjs");
var import_query_promise = require("../../query-promise.cjs");
var import_selection_proxy = require("../../selection-proxy.cjs");
var import_sql = require("../../sql/sql.cjs");
var import_subquery = require("../../subquery.cjs");
var import_table = require("../../table.cjs");
var import_utils = require("../../utils.cjs");
var import_view_common = require("../../view-common.cjs");
var import_utils2 = require("../utils.cjs");
var import_view_base = require("../view-base.cjs");
class SQLiteSelectBuilder {
  static [import_entity.entityKind] = "SQLiteSelectBuilder";
  fields;
  session;
  dialect;
  withList;
  distinct;
  constructor(config) {
    this.fields = config.fields;
    this.session = config.session;
    this.dialect = config.dialect;
    this.withList = config.withList;
    this.distinct = config.distinct;
  }
  from(source) {
    const isPartialSelect = !!this.fields;
    let fields;
    if (this.fields) {
      fields = this.fields;
    } else if ((0, import_entity.is)(source, import_subquery.Subquery)) {
      fields = Object.fromEntries(
        Object.keys(source._.selectedFields).map((key) => [key, source[key]])
      );
    } else if ((0, import_entity.is)(source, import_view_base.SQLiteViewBase)) {
      fields = source[import_view_common.ViewBaseConfig].selectedFields;
    } else if ((0, import_entity.is)(source, import_sql.SQL)) {
      fields = {};
    } else {
      fields = (0, import_utils.getTableColumns)(source);
    }
    return new SQLiteSelectBase({
      table: source,
      fields,
      isPartialSelect,
      session: this.session,
      dialect: this.dialect,
      withList: this.withList,
      distinct: this.distinct
    });
  }
}
class SQLiteSelectQueryBuilderBase extends import_query_builder.TypedQueryBuilder {
  static [import_entity.entityKind] = "SQLiteSelectQueryBuilder";
  _;
  /** @internal */
  config;
  joinsNotNullableMap;
  tableName;
  isPartialSelect;
  session;
  dialect;
  cacheConfig = void 0;
  usedTables = /* @__PURE__ */ new Set();
  constructor({ table, fields, isPartialSelect, session, dialect, withList, distinct }) {
    super();
    this.config = {
      withList,
      table,
      fields: { ...fields },
      distinct,
      setOperators: []
    };
    this.isPartialSelect = isPartialSelect;
    this.session = session;
    this.dialect = dialect;
    this._ = {
      selectedFields: fields,
      config: this.config
    };
    this.tableName = (0, import_utils.getTableLikeName)(table);
    this.joinsNotNullableMap = typeof this.tableName === "string" ? { [this.tableName]: true } : {};
    for (const item of (0, import_utils2.extractUsedTable)(table)) this.usedTables.add(item);
  }
  /** @internal */
  getUsedTables() {
    return [...this.usedTables];
  }
  createJoin(joinType) {
    return (table, on) => {
      const baseTableName = this.tableName;
      const tableName = (0, import_utils.getTableLikeName)(table);
      for (const item of (0, import_utils2.extractUsedTable)(table)) this.usedTables.add(item);
      if (typeof tableName === "string" && this.config.joins?.some((join) => join.alias === tableName)) {
        throw new Error(`Alias "${tableName}" is already used in this query`);
      }
      if (!this.isPartialSelect) {
        if (Object.keys(this.joinsNotNullableMap).length === 1 && typeof baseTableName === "string") {
          this.config.fields = {
            [baseTableName]: this.config.fields
          };
        }
        if (typeof tableName === "string" && !(0, import_entity.is)(table, import_sql.SQL)) {
          const selection = (0, import_entity.is)(table, import_subquery.Subquery) ? table._.selectedFields : (0, import_entity.is)(table, import_sql.View) ? table[import_view_common.ViewBaseConfig].selectedFields : table[import_table.Table.Symbol.Columns];
          this.config.fields[tableName] = selection;
        }
      }
      if (typeof on === "function") {
        on = on(
          new Proxy(
            this.config.fields,
            new import_selection_proxy.SelectionProxyHandler({ sqlAliasedBehavior: "sql", sqlBehavior: "sql" })
          )
        );
      }
      if (!this.config.joins) {
        this.config.joins = [];
      }
      this.config.joins.push({ on, table, joinType, alias: tableName });
      if (typeof tableName === "string") {
        switch (joinType) {
          case "left": {
            this.joinsNotNullableMap[tableName] = false;
            break;
          }
          case "right": {
            this.joinsNotNullableMap = Object.fromEntries(
              Object.entries(this.joinsNotNullableMap).map(([key]) => [key, false])
            );
            this.joinsNotNullableMap[tableName] = true;
            break;
          }
          case "cross":
          case "inner": {
            this.joinsNotNullableMap[tableName] = true;
            break;
          }
          case "full": {
            this.joinsNotNullableMap = Object.fromEntries(
              Object.entries(this.joinsNotNullableMap).map(([key]) => [key, false])
            );
            this.joinsNotNullableMap[tableName] = false;
            break;
          }
        }
      }
      return this;
    };
  }
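The `joinsNotNullableMap` bookkeeping in `createJoin` above is what lets the ORM type joined columns as nullable or not. The switch can be illustrated standalone (`applyJoin` is a hypothetical helper mirroring the switch, not part of the library):

```javascript
// Standalone sketch of the join-nullability rules above: a `false`
// entry means "columns of this table may be null in the joined result".
function applyJoin(map, table, joinType) {
  const next = { ...map };
  switch (joinType) {
    case "left":
      next[table] = false; // joined table becomes nullable
      break;
    case "right":
      // all previously joined tables become nullable...
      for (const key of Object.keys(next)) next[key] = false;
      next[table] = true; // ...only the joined table is guaranteed
      break;
    case "cross":
    case "inner":
      next[table] = true; // both sides guaranteed to match
      break;
    case "full":
      // nothing is guaranteed on either side
      for (const key of Object.keys(next)) next[key] = false;
      next[table] = false;
      break;
  }
  return next;
}

let map = { users: true };
map = applyJoin(map, "pets", "left");    // pets may be null
map = applyJoin(map, "orders", "right"); // now users and pets may be null
```

The real implementation threads this map into the result type so that, e.g., a `leftJoin` produces `Pet | null` in the examples below.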
  /**
   * Executes a `left join` operation by adding another table to the current query.
   *
   * Calling this method associates each row of the table with the corresponding row from the joined table, if a match is found. If no matching row exists, it sets all columns of the joined table to null.
   *
   * See docs: {@link https://orm.drizzle.team/docs/joins#left-join}
   *
   * @param table the table to join.
   * @param on the `on` clause.
   *
   * @example
   *
   * ```ts
   * // Select all users and their pets
   * const usersWithPets: { user: User; pets: Pet | null; }[] = await db.select()
   *   .from(users)
   *   .leftJoin(pets, eq(users.id, pets.ownerId))
   *
   * // Select userId and petId
   * const usersIdsAndPetIds: { userId: number; petId: number | null; }[] = await db.select({
   *   userId: users.id,
   *   petId: pets.id,
   * })
   *   .from(users)
   *   .leftJoin(pets, eq(users.id, pets.ownerId))
   * ```
   */
  leftJoin = this.createJoin("left");
  /**
   * Executes a `right join` operation by adding another table to the current query.
   *
   * Calling this method associates each row of the joined table with the corresponding row from the main table, if a match is found. If no matching row exists, it sets all columns of the main table to null.
   *
   * See docs: {@link https://orm.drizzle.team/docs/joins#right-join}
   *
   * @param table the table to join.
   * @param on the `on` clause.
   *
   * @example
   *
   * ```ts
   * // Select all users and their pets
   * const usersWithPets: { user: User | null; pets: Pet; }[] = await db.select()
   *   .from(users)
   *   .rightJoin(pets, eq(users.id, pets.ownerId))
   *
   * // Select userId and petId
   * const usersIdsAndPetIds: { userId: number | null; petId: number; }[] = await db.select({
   *   userId: users.id,
   *   petId: pets.id,
   * })
   *   .from(users)
   *   .rightJoin(pets, eq(users.id, pets.ownerId))
   * ```
   */
  rightJoin = this.createJoin("right");
  /**
   * Executes an `inner join` operation, creating a new table by combining rows from two tables that have matching values.
   *
   * Calling this method retrieves rows that have corresponding entries in both joined tables. Rows without matching entries in either table are excluded, resulting in a table that includes only matching pairs.
   *
   * See docs: {@link https://orm.drizzle.team/docs/joins#inner-join}
   *
   * @param table the table to join.
   * @param on the `on` clause.
   *
   * @example
   *
   * ```ts
   * // Select all users and their pets
   * const usersWithPets: { user: User; pets: Pet; }[] = await db.select()
   *   .from(users)
   *   .innerJoin(pets, eq(users.id, pets.ownerId))
   *
   * // Select userId and petId
   * const usersIdsAndPetIds: { userId: number; petId: number; }[] = await db.select({
   *   userId: users.id,
   *   petId: pets.id,
   * })
   *   .from(users)
   *   .innerJoin(pets, eq(users.id, pets.ownerId))
   * ```
   */
  innerJoin = this.createJoin("inner");
  /**
   * Executes a `full join` operation by combining rows from two tables into a new table.
   *
   * Calling this method retrieves all rows from both main and joined tables, merging rows with matching values and filling in `null` for non-matching columns.
   *
   * See docs: {@link https://orm.drizzle.team/docs/joins#full-join}
   *
   * @param table the table to join.
   * @param on the `on` clause.
   *
   * @example
   *
   * ```ts
   * // Select all users and their pets
   * const usersWithPets: { user: User | null; pets: Pet | null; }[] = await db.select()
   *   .from(users)
   *   .fullJoin(pets, eq(users.id, pets.ownerId))
   *
   * // Select userId and petId
   * const usersIdsAndPetIds: { userId: number | null; petId: number | null; }[] = await db.select({
   *   userId: users.id,
   *   petId: pets.id,
   * })
   *   .from(users)
   *   .fullJoin(pets, eq(users.id, pets.ownerId))
   * ```
   */
  fullJoin = this.createJoin("full");
  /**
   * Executes a `cross join` operation by combining rows from two tables into a new table.
   *
   * Calling this method retrieves all rows from both main and joined tables, merging all rows from each table.
   *
   * See docs: {@link https://orm.drizzle.team/docs/joins#cross-join}
   *
   * @param table the table to join.
   *
   * @example
   *
   * ```ts
   * // Select all users, each user with every pet
   * const usersWithPets: { user: User; pets: Pet; }[] = await db.select()
   *   .from(users)
   *   .crossJoin(pets)
   *
   * // Select userId and petId
   * const usersIdsAndPetIds: { userId: number; petId: number; }[] = await db.select({
   *   userId: users.id,
   *   petId: pets.id,
   * })
   *   .from(users)
   *   .crossJoin(pets)
   * ```
   */
  crossJoin = this.createJoin("cross");
  createSetOperator(type, isAll) {
    return (rightSelection) => {
      const rightSelect = typeof rightSelection === "function" ? rightSelection(getSQLiteSetOperators()) : rightSelection;
      if (!(0, import_utils.haveSameKeys)(this.getSelectedFields(), rightSelect.getSelectedFields())) {
        throw new Error(
          "Set operator error (union / intersect / except): selected fields are not the same or are in a different order"
        );
      }
      this.config.setOperators.push({ type, isAll, rightSelect });
      return this;
    };
  }
  /**
   * Adds `union` set operator to the query.
   *
   * Calling this method will combine the result sets of the `select` statements and remove any duplicate rows that appear across them.
   *
   * See docs: {@link https://orm.drizzle.team/docs/set-operations#union}
   *
   * @example
   *
   * ```ts
   * // Select all unique names from customers and users tables
   * await db.select({ name: users.name })
   *   .from(users)
   *   .union(
   *     db.select({ name: customers.name }).from(customers)
   *   );
   * // or
   * import { union } from 'drizzle-orm/sqlite-core'
   *
   * await union(
   *   db.select({ name: users.name }).from(users),
   *   db.select({ name: customers.name }).from(customers
|
||||||
|
* );
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
union = this.createSetOperator("union", false);
|
||||||
|
/**
|
||||||
|
* Adds `union all` set operator to the query.
|
||||||
|
*
|
||||||
|
* Calling this method will combine the result-set of the `select` statements and keep all duplicate rows that appear across them.
|
||||||
|
*
|
||||||
|
* See docs: {@link https://orm.drizzle.team/docs/set-operations#union-all}
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
*
|
||||||
|
* ```ts
|
||||||
|
* // Select all transaction ids from both online and in-store sales
|
||||||
|
* await db.select({ transaction: onlineSales.transactionId })
|
||||||
|
* .from(onlineSales)
|
||||||
|
* .unionAll(
|
||||||
|
* db.select({ transaction: inStoreSales.transactionId }).from(inStoreSales)
|
||||||
|
* );
|
||||||
|
* // or
|
||||||
|
* import { unionAll } from 'drizzle-orm/sqlite-core'
|
||||||
|
*
|
||||||
|
* await unionAll(
|
||||||
|
* db.select({ transaction: onlineSales.transactionId }).from(onlineSales),
|
||||||
|
* db.select({ transaction: inStoreSales.transactionId }).from(inStoreSales)
|
||||||
|
* );
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
unionAll = this.createSetOperator("union", true);
|
||||||
|
/**
|
||||||
|
* Adds `intersect` set operator to the query.
|
||||||
|
*
|
||||||
|
* Calling this method will retain only the rows that are present in both result sets and eliminate duplicates.
|
||||||
|
*
|
||||||
|
* See docs: {@link https://orm.drizzle.team/docs/set-operations#intersect}
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
*
|
||||||
|
* ```ts
|
||||||
|
* // Select course names that are offered in both departments A and B
|
||||||
|
* await db.select({ courseName: depA.courseName })
|
||||||
|
* .from(depA)
|
||||||
|
* .intersect(
|
||||||
|
* db.select({ courseName: depB.courseName }).from(depB)
|
||||||
|
* );
|
||||||
|
* // or
|
||||||
|
* import { intersect } from 'drizzle-orm/sqlite-core'
|
||||||
|
*
|
||||||
|
* await intersect(
|
||||||
|
* db.select({ courseName: depA.courseName }).from(depA),
|
||||||
|
* db.select({ courseName: depB.courseName }).from(depB)
|
||||||
|
* );
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
intersect = this.createSetOperator("intersect", false);
|
||||||
|
/**
|
||||||
|
* Adds `except` set operator to the query.
|
||||||
|
*
|
||||||
|
* Calling this method will retrieve all unique rows from the left query, except for the rows that are present in the result set of the right query.
|
||||||
|
*
|
||||||
|
* See docs: {@link https://orm.drizzle.team/docs/set-operations#except}
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
*
|
||||||
|
* ```ts
|
||||||
|
* // Select all courses offered in department A but not in department B
|
||||||
|
* await db.select({ courseName: depA.courseName })
|
||||||
|
* .from(depA)
|
||||||
|
* .except(
|
||||||
|
* db.select({ courseName: depB.courseName }).from(depB)
|
||||||
|
* );
|
||||||
|
* // or
|
||||||
|
* import { except } from 'drizzle-orm/sqlite-core'
|
||||||
|
*
|
||||||
|
* await except(
|
||||||
|
* db.select({ courseName: depA.courseName }).from(depA),
|
||||||
|
* db.select({ courseName: depB.courseName }).from(depB)
|
||||||
|
* );
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
except = this.createSetOperator("except", false);
|
||||||
|
/** @internal */
|
||||||
|
addSetOperators(setOperators) {
|
||||||
|
this.config.setOperators.push(...setOperators);
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
/**
|
||||||
|
* Adds a `where` clause to the query.
|
||||||
|
*
|
||||||
|
* Calling this method will select only those rows that fulfill a specified condition.
|
||||||
|
*
|
||||||
|
* See docs: {@link https://orm.drizzle.team/docs/select#filtering}
|
||||||
|
*
|
||||||
|
* @param where the `where` clause.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* You can use conditional operators and `sql function` to filter the rows to be selected.
|
||||||
|
*
|
||||||
|
* ```ts
|
||||||
|
* // Select all cars with green color
|
||||||
|
* await db.select().from(cars).where(eq(cars.color, 'green'));
|
||||||
|
* // or
|
||||||
|
* await db.select().from(cars).where(sql`${cars.color} = 'green'`)
|
||||||
|
* ```
|
||||||
|
*
|
||||||
|
* You can logically combine conditional operators with `and()` and `or()` operators:
|
||||||
|
*
|
||||||
|
* ```ts
|
||||||
|
* // Select all BMW cars with a green color
|
||||||
|
* await db.select().from(cars).where(and(eq(cars.color, 'green'), eq(cars.brand, 'BMW')));
|
||||||
|
*
|
||||||
|
* // Select all cars with the green or blue color
|
||||||
|
* await db.select().from(cars).where(or(eq(cars.color, 'green'), eq(cars.color, 'blue')));
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
where(where) {
|
||||||
|
if (typeof where === "function") {
|
||||||
|
where = where(
|
||||||
|
new Proxy(
|
||||||
|
this.config.fields,
|
||||||
|
new import_selection_proxy.SelectionProxyHandler({ sqlAliasedBehavior: "sql", sqlBehavior: "sql" })
|
||||||
|
)
|
||||||
|
);
|
||||||
|
}
|
||||||
|
this.config.where = where;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
/**
|
||||||
|
* Adds a `having` clause to the query.
|
||||||
|
*
|
||||||
|
* Calling this method will select only those rows that fulfill a specified condition. It is typically used with aggregate functions to filter the aggregated data based on a specified condition.
|
||||||
|
*
|
||||||
|
* See docs: {@link https://orm.drizzle.team/docs/select#aggregations}
|
||||||
|
*
|
||||||
|
* @param having the `having` clause.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
*
|
||||||
|
* ```ts
|
||||||
|
* // Select all brands with more than one car
|
||||||
|
* await db.select({
|
||||||
|
* brand: cars.brand,
|
||||||
|
* count: sql<number>`cast(count(${cars.id}) as int)`,
|
||||||
|
* })
|
||||||
|
* .from(cars)
|
||||||
|
* .groupBy(cars.brand)
|
||||||
|
* .having(({ count }) => gt(count, 1));
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
having(having) {
|
||||||
|
if (typeof having === "function") {
|
||||||
|
having = having(
|
||||||
|
new Proxy(
|
||||||
|
this.config.fields,
|
||||||
|
new import_selection_proxy.SelectionProxyHandler({ sqlAliasedBehavior: "sql", sqlBehavior: "sql" })
|
||||||
|
)
|
||||||
|
);
|
||||||
|
}
|
||||||
|
this.config.having = having;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
groupBy(...columns) {
|
||||||
|
if (typeof columns[0] === "function") {
|
||||||
|
const groupBy = columns[0](
|
||||||
|
new Proxy(
|
||||||
|
this.config.fields,
|
||||||
|
new import_selection_proxy.SelectionProxyHandler({ sqlAliasedBehavior: "alias", sqlBehavior: "sql" })
|
||||||
|
)
|
||||||
|
);
|
||||||
|
this.config.groupBy = Array.isArray(groupBy) ? groupBy : [groupBy];
|
||||||
|
} else {
|
||||||
|
this.config.groupBy = columns;
|
||||||
|
}
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
orderBy(...columns) {
|
||||||
|
if (typeof columns[0] === "function") {
|
||||||
|
const orderBy = columns[0](
|
||||||
|
new Proxy(
|
||||||
|
this.config.fields,
|
||||||
|
new import_selection_proxy.SelectionProxyHandler({ sqlAliasedBehavior: "alias", sqlBehavior: "sql" })
|
||||||
|
)
|
||||||
|
);
|
||||||
|
const orderByArray = Array.isArray(orderBy) ? orderBy : [orderBy];
|
||||||
|
if (this.config.setOperators.length > 0) {
|
||||||
|
this.config.setOperators.at(-1).orderBy = orderByArray;
|
||||||
|
} else {
|
||||||
|
this.config.orderBy = orderByArray;
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
const orderByArray = columns;
|
||||||
|
if (this.config.setOperators.length > 0) {
|
||||||
|
this.config.setOperators.at(-1).orderBy = orderByArray;
|
||||||
|
} else {
|
||||||
|
this.config.orderBy = orderByArray;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
/**
|
||||||
|
* Adds a `limit` clause to the query.
|
||||||
|
*
|
||||||
|
* Calling this method will set the maximum number of rows that will be returned by this query.
|
||||||
|
*
|
||||||
|
* See docs: {@link https://orm.drizzle.team/docs/select#limit--offset}
|
||||||
|
*
|
||||||
|
* @param limit the `limit` clause.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
*
|
||||||
|
* ```ts
|
||||||
|
* // Get the first 10 people from this query.
|
||||||
|
* await db.select().from(people).limit(10);
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
limit(limit) {
|
||||||
|
if (this.config.setOperators.length > 0) {
|
||||||
|
this.config.setOperators.at(-1).limit = limit;
|
||||||
|
} else {
|
||||||
|
this.config.limit = limit;
|
||||||
|
}
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
/**
|
||||||
|
* Adds an `offset` clause to the query.
|
||||||
|
*
|
||||||
|
* Calling this method will skip a number of rows when returning results from this query.
|
||||||
|
*
|
||||||
|
* See docs: {@link https://orm.drizzle.team/docs/select#limit--offset}
|
||||||
|
*
|
||||||
|
* @param offset the `offset` clause.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
*
|
||||||
|
* ```ts
|
||||||
|
* // Get the 10th-20th people from this query.
|
||||||
|
* await db.select().from(people).offset(10).limit(10);
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
offset(offset) {
|
||||||
|
if (this.config.setOperators.length > 0) {
|
||||||
|
this.config.setOperators.at(-1).offset = offset;
|
||||||
|
} else {
|
||||||
|
this.config.offset = offset;
|
||||||
|
}
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
/** @internal */
|
||||||
|
getSQL() {
|
||||||
|
return this.dialect.buildSelectQuery(this.config);
|
||||||
|
}
|
||||||
|
toSQL() {
|
||||||
|
const { typings: _typings, ...rest } = this.dialect.sqlToQuery(this.getSQL());
|
||||||
|
return rest;
|
||||||
|
}
|
||||||
|
as(alias) {
|
||||||
|
const usedTables = [];
|
||||||
|
usedTables.push(...(0, import_utils2.extractUsedTable)(this.config.table));
|
||||||
|
if (this.config.joins) {
|
||||||
|
for (const it of this.config.joins) usedTables.push(...(0, import_utils2.extractUsedTable)(it.table));
|
||||||
|
}
|
||||||
|
return new Proxy(
|
||||||
|
new import_subquery.Subquery(this.getSQL(), this.config.fields, alias, false, [...new Set(usedTables)]),
|
||||||
|
new import_selection_proxy.SelectionProxyHandler({ alias, sqlAliasedBehavior: "alias", sqlBehavior: "error" })
|
||||||
|
);
|
||||||
|
}
|
||||||
|
/** @internal */
|
||||||
|
getSelectedFields() {
|
||||||
|
return new Proxy(
|
||||||
|
this.config.fields,
|
||||||
|
new import_selection_proxy.SelectionProxyHandler({ alias: this.tableName, sqlAliasedBehavior: "alias", sqlBehavior: "error" })
|
||||||
|
);
|
||||||
|
}
|
||||||
|
$dynamic() {
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
class SQLiteSelectBase extends SQLiteSelectQueryBuilderBase {
|
||||||
|
static [import_entity.entityKind] = "SQLiteSelect";
|
||||||
|
/** @internal */
|
||||||
|
_prepare(isOneTimeQuery = true) {
|
||||||
|
if (!this.session) {
|
||||||
|
throw new Error("Cannot execute a query on a query builder. Please use a database instance instead.");
|
||||||
|
}
|
||||||
|
const fieldsList = (0, import_utils.orderSelectedFields)(this.config.fields);
|
||||||
|
const query = this.session[isOneTimeQuery ? "prepareOneTimeQuery" : "prepareQuery"](
|
||||||
|
this.dialect.sqlToQuery(this.getSQL()),
|
||||||
|
fieldsList,
|
||||||
|
"all",
|
||||||
|
true,
|
||||||
|
void 0,
|
||||||
|
{
|
||||||
|
type: "select",
|
||||||
|
tables: [...this.usedTables]
|
||||||
|
},
|
||||||
|
this.cacheConfig
|
||||||
|
);
|
||||||
|
query.joinsNotNullableMap = this.joinsNotNullableMap;
|
||||||
|
return query;
|
||||||
|
}
|
||||||
|
$withCache(config) {
|
||||||
|
this.cacheConfig = config === void 0 ? { config: {}, enable: true, autoInvalidate: true } : config === false ? { enable: false } : { enable: true, autoInvalidate: true, ...config };
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
prepare() {
|
||||||
|
return this._prepare(false);
|
||||||
|
}
|
||||||
|
run = (placeholderValues) => {
|
||||||
|
return this._prepare().run(placeholderValues);
|
||||||
|
};
|
||||||
|
all = (placeholderValues) => {
|
||||||
|
return this._prepare().all(placeholderValues);
|
||||||
|
};
|
||||||
|
get = (placeholderValues) => {
|
||||||
|
return this._prepare().get(placeholderValues);
|
||||||
|
};
|
||||||
|
values = (placeholderValues) => {
|
||||||
|
return this._prepare().values(placeholderValues);
|
||||||
|
};
|
||||||
|
async execute() {
|
||||||
|
return this.all();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
(0, import_utils.applyMixins)(SQLiteSelectBase, [import_query_promise.QueryPromise]);
|
||||||
|
function createSetOperator(type, isAll) {
|
||||||
|
return (leftSelect, rightSelect, ...restSelects) => {
|
||||||
|
const setOperators = [rightSelect, ...restSelects].map((select) => ({
|
||||||
|
type,
|
||||||
|
isAll,
|
||||||
|
rightSelect: select
|
||||||
|
}));
|
||||||
|
for (const setOperator of setOperators) {
|
||||||
|
if (!(0, import_utils.haveSameKeys)(leftSelect.getSelectedFields(), setOperator.rightSelect.getSelectedFields())) {
|
||||||
|
throw new Error(
|
||||||
|
"Set operator error (union / intersect / except): selected fields are not the same or are in a different order"
|
||||||
|
);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return leftSelect.addSetOperators(setOperators);
|
||||||
|
};
|
||||||
|
}
|
||||||
|
const getSQLiteSetOperators = () => ({
|
||||||
|
union,
|
||||||
|
unionAll,
|
||||||
|
intersect,
|
||||||
|
except
|
||||||
|
});
|
||||||
|
const union = createSetOperator("union", false);
|
||||||
|
const unionAll = createSetOperator("union", true);
|
||||||
|
const intersect = createSetOperator("intersect", false);
|
||||||
|
const except = createSetOperator("except", false);
|
||||||
|
// Annotate the CommonJS export names for ESM import in node:
|
||||||
|
0 && (module.exports = {
|
||||||
|
SQLiteSelectBase,
|
||||||
|
SQLiteSelectBuilder,
|
||||||
|
SQLiteSelectQueryBuilderBase,
|
||||||
|
except,
|
||||||
|
intersect,
|
||||||
|
union,
|
||||||
|
unionAll
|
||||||
|
});
|
||||||
|
//# sourceMappingURL=select.cjs.map
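Both `createSetOperator` paths above reject operands whose selected fields differ in names or order, via `haveSameKeys`. A minimal sketch of that check (a hypothetical helper for illustration, not drizzle-orm's actual `haveSameKeys` source) makes the "same order" requirement concrete:

```javascript
// Sketch of the same-keys check the set operators rely on.
// Hypothetical helper: drizzle-orm's real haveSameKeys lives in its utils module.
function haveSameKeys(left, right) {
  const leftKeys = Object.keys(left);
  const rightKeys = Object.keys(right);
  if (leftKeys.length !== rightKeys.length) return false;
  // Order matters: UNION / INTERSECT / EXCEPT match result columns positionally.
  return leftKeys.every((key, index) => key === rightKeys[index]);
}
```

This is why selecting `{ id, name }` on one side and `{ name, id }` on the other fails with the "selected fields are not the same or are in a different order" error even though the key sets are equal.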
@ -0,0 +1 @@
{"version":3,"file":"innerFrom.js","sourceRoot":"","sources":["../../../../src/internal/observable/innerFrom.ts"],"names":[],"mappings":";AAAA,OAAO,EAAE,WAAW,EAAE,MAAM,qBAAqB,CAAC;AAClD,OAAO,EAAE,SAAS,EAAE,MAAM,mBAAmB,CAAC;AAC9C,OAAO,EAAE,UAAU,EAAE,MAAM,eAAe,CAAC;AAE3C,OAAO,EAAE,mBAAmB,EAAE,MAAM,6BAA6B,CAAC;AAClE,OAAO,EAAE,eAAe,EAAE,MAAM,yBAAyB,CAAC;AAC1D,OAAO,EAAE,gCAAgC,EAAE,MAAM,gCAAgC,CAAC;AAClF,OAAO,EAAE,UAAU,EAAE,MAAM,oBAAoB,CAAC;AAChD,OAAO,EAAE,oBAAoB,EAAE,kCAAkC,EAAE,MAAM,8BAA8B,CAAC;AAExG,OAAO,EAAE,UAAU,EAAE,MAAM,oBAAoB,CAAC;AAChD,OAAO,EAAE,oBAAoB,EAAE,MAAM,8BAA8B,CAAC;AACpE,OAAO,EAAE,UAAU,IAAI,iBAAiB,EAAE,MAAM,sBAAsB,CAAC;AAGvE,MAAM,UAAU,SAAS,CAAI,KAAyB;IACpD,IAAI,KAAK,YAAY,UAAU,EAAE;QAC/B,OAAO,KAAK,CAAC;KACd;IACD,IAAI,KAAK,IAAI,IAAI,EAAE;QACjB,IAAI,mBAAmB,CAAC,KAAK,CAAC,EAAE;YAC9B,OAAO,qBAAqB,CAAC,KAAK,CAAC,CAAC;SACrC;QACD,IAAI,WAAW,CAAC,KAAK,CAAC,EAAE;YACtB,OAAO,aAAa,CAAC,KAAK,CAAC,CAAC;SAC7B;QACD,IAAI,SAAS,CAAC,KAAK,CAAC,EAAE;YACpB,OAAO,WAAW,CAAC,KAAK,CAAC,CAAC;SAC3B;QACD,IAAI,eAAe,CAAC,KAAK,CAAC,EAAE;YAC1B,OAAO,iBAAiB,CAAC,KAAK,CAAC,CAAC;SACjC;QACD,IAAI,UAAU,CAAC,KAAK,CAAC,EAAE;YACrB,OAAO,YAAY,CAAC,KAAK,CAAC,CAAC;SAC5B;QACD,IAAI,oBAAoB,CAAC,KAAK,CAAC,EAAE;YAC/B,OAAO,sBAAsB,CAAC,KAAK,CAAC,CAAC;SACtC;KACF;IAED,MAAM,gCAAgC,CAAC,KAAK,CAAC,CAAC;AAChD,CAAC;AAMD,MAAM,UAAU,qBAAqB,CAAI,GAAQ;IAC/C,OAAO,IAAI,UAAU,CAAC,UAAC,UAAyB;QAC9C,IAAM,GAAG,GAAG,GAAG,CAAC,iBAAiB,CAAC,EAAE,CAAC;QACrC,IAAI,UAAU,CAAC,GAAG,CAAC,SAAS,CAAC,EAAE;YAC7B,OAAO,GAAG,CAAC,SAAS,CAAC,UAAU,CAAC,CAAC;SAClC;QAED,MAAM,IAAI,SAAS,CAAC,gEAAgE,CAAC,CAAC;IACxF,CAAC,CAAC,CAAC;AACL,CAAC;AASD,MAAM,UAAU,aAAa,CAAI,KAAmB;IAClD,OAAO,IAAI,UAAU,CAAC,UAAC,UAAyB;QAU9C,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,KAAK,CAAC,MAAM,IAAI,CAAC,UAAU,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;YAC3D,UAAU,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC;SAC3B;QACD,UAAU,CAAC,QAAQ,EAAE,CAAC;IACxB,CAAC,CAAC,CAAC;AACL,CAAC;AAED,MAAM,UAAU,WAAW,CAAI,OAAuB;IACpD,OAAO,IAAI,UAAU,CAAC,UAAC,UAAyB;QAC9C,OAAO;aACJ,IAAI,CACH,UAAC,KAAK;YACJ,IAAI,CAAC,UAAU,C
AAC,MAAM,EAAE;gBACtB,UAAU,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;gBACvB,UAAU,CAAC,QAAQ,EAAE,CAAC;aACvB;QACH,CAAC,EACD,UAAC,GAAQ,IAAK,OAAA,UAAU,CAAC,KAAK,CAAC,GAAG,CAAC,EAArB,CAAqB,CACpC;aACA,IAAI,CAAC,IAAI,EAAE,oBAAoB,CAAC,CAAC;IACtC,CAAC,CAAC,CAAC;AACL,CAAC;AAED,MAAM,UAAU,YAAY,CAAI,QAAqB;IACnD,OAAO,IAAI,UAAU,CAAC,UAAC,UAAyB;;;YAC9C,KAAoB,IAAA,aAAA,SAAA,QAAQ,CAAA,kCAAA,wDAAE;gBAAzB,IAAM,KAAK,qBAAA;gBACd,UAAU,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;gBACvB,IAAI,UAAU,CAAC,MAAM,EAAE;oBACrB,OAAO;iBACR;aACF;;;;;;;;;QACD,UAAU,CAAC,QAAQ,EAAE,CAAC;IACxB,CAAC,CAAC,CAAC;AACL,CAAC;AAED,MAAM,UAAU,iBAAiB,CAAI,aAA+B;IAClE,OAAO,IAAI,UAAU,CAAC,UAAC,UAAyB;QAC9C,OAAO,CAAC,aAAa,EAAE,UAAU,CAAC,CAAC,KAAK,CAAC,UAAC,GAAG,IAAK,OAAA,UAAU,CAAC,KAAK,CAAC,GAAG,CAAC,EAArB,CAAqB,CAAC,CAAC;IAC3E,CAAC,CAAC,CAAC;AACL,CAAC;AAED,MAAM,UAAU,sBAAsB,CAAI,cAAqC;IAC7E,OAAO,iBAAiB,CAAC,kCAAkC,CAAC,cAAc,CAAC,CAAC,CAAC;AAC/E,CAAC;AAED,SAAe,OAAO,CAAI,aAA+B,EAAE,UAAyB;;;;;;;;;oBACxD,kBAAA,cAAA,aAAa,CAAA;;;;;oBAAtB,KAAK,0BAAA,CAAA;oBACpB,UAAU,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;oBAGvB,IAAI,UAAU,CAAC,MAAM,EAAE;wBACrB,WAAO;qBACR;;;;;;;;;;;;;;;;;;;;;oBAEH,UAAU,CAAC,QAAQ,EAAE,CAAC;;;;;CACvB"}
@ -0,0 +1,83 @@
import { entityKind } from "../../entity.cjs";
import { QueryPromise } from "../../query-promise.cjs";
import type { SingleStoreDialect } from "../dialect.cjs";
import type { AnySingleStoreQueryResultHKT, PreparedQueryHKTBase, PreparedQueryKind, SingleStorePreparedQueryConfig, SingleStoreQueryResultHKT, SingleStoreQueryResultKind, SingleStoreSession } from "../session.cjs";
import type { SingleStoreTable } from "../table.cjs";
import type { Placeholder, Query, SQL, SQLWrapper } from "../../sql/sql.cjs";
import type { Subquery } from "../../subquery.cjs";
import type { ValueOrArray } from "../../utils.cjs";
import type { SingleStoreColumn } from "../columns/common.cjs";
import type { SelectedFieldsOrdered } from "./select.types.cjs";
export type SingleStoreDeleteWithout<T extends AnySingleStoreDeleteBase, TDynamic extends boolean, K extends keyof T & string> = TDynamic extends true ? T : Omit<SingleStoreDeleteBase<T['_']['table'], T['_']['queryResult'], T['_']['preparedQueryHKT'], TDynamic, T['_']['excludedMethods'] | K>, T['_']['excludedMethods'] | K>;
export type SingleStoreDelete<TTable extends SingleStoreTable = SingleStoreTable, TQueryResult extends SingleStoreQueryResultHKT = AnySingleStoreQueryResultHKT, TPreparedQueryHKT extends PreparedQueryHKTBase = PreparedQueryHKTBase> = SingleStoreDeleteBase<TTable, TQueryResult, TPreparedQueryHKT, true, never>;
export interface SingleStoreDeleteConfig {
where?: SQL | undefined;
limit?: number | Placeholder;
orderBy?: (SingleStoreColumn | SQL | SQL.Aliased)[];
table: SingleStoreTable;
returning?: SelectedFieldsOrdered;
withList?: Subquery[];
}
export type SingleStoreDeletePrepare<T extends AnySingleStoreDeleteBase> = PreparedQueryKind<T['_']['preparedQueryHKT'], SingleStorePreparedQueryConfig & {
execute: SingleStoreQueryResultKind<T['_']['queryResult'], never>;
iterator: never;
}, true>;
type SingleStoreDeleteDynamic<T extends AnySingleStoreDeleteBase> = SingleStoreDelete<T['_']['table'], T['_']['queryResult'], T['_']['preparedQueryHKT']>;
type AnySingleStoreDeleteBase = SingleStoreDeleteBase<any, any, any, any, any>;
export interface SingleStoreDeleteBase<TTable extends SingleStoreTable, TQueryResult extends SingleStoreQueryResultHKT, TPreparedQueryHKT extends PreparedQueryHKTBase, TDynamic extends boolean = false, TExcludedMethods extends string = never> extends QueryPromise<SingleStoreQueryResultKind<TQueryResult, never>> {
readonly _: {
readonly table: TTable;
readonly queryResult: TQueryResult;
readonly preparedQueryHKT: TPreparedQueryHKT;
readonly dynamic: TDynamic;
readonly excludedMethods: TExcludedMethods;
};
}
export declare class SingleStoreDeleteBase<TTable extends SingleStoreTable, TQueryResult extends SingleStoreQueryResultHKT, TPreparedQueryHKT extends PreparedQueryHKTBase, TDynamic extends boolean = false, TExcludedMethods extends string = never> extends QueryPromise<SingleStoreQueryResultKind<TQueryResult, never>> implements SQLWrapper {
private table;
private session;
private dialect;
static readonly [entityKind]: string;
private config;
constructor(table: TTable, session: SingleStoreSession, dialect: SingleStoreDialect, withList?: Subquery[]);
/**
* Adds a `where` clause to the query.
*
* Calling this method will delete only those rows that fulfill a specified condition.
*
* See docs: {@link https://orm.drizzle.team/docs/delete}
*
* @param where the `where` clause.
*
* @example
* You can use conditional operators and `sql function` to filter the rows to be deleted.
*
* ```ts
* // Delete all cars with green color
* db.delete(cars).where(eq(cars.color, 'green'));
* // or
* db.delete(cars).where(sql`${cars.color} = 'green'`)
* ```
*
* You can logically combine conditional operators with `and()` and `or()` operators:
*
* ```ts
* // Delete all BMW cars with a green color
* db.delete(cars).where(and(eq(cars.color, 'green'), eq(cars.brand, 'BMW')));
*
* // Delete all cars with the green or blue color
* db.delete(cars).where(or(eq(cars.color, 'green'), eq(cars.color, 'blue')));
* ```
*/
where(where: SQL | undefined): SingleStoreDeleteWithout<this, TDynamic, 'where'>;
orderBy(builder: (deleteTable: TTable) => ValueOrArray<SingleStoreColumn | SQL | SQL.Aliased>): SingleStoreDeleteWithout<this, TDynamic, 'orderBy'>;
orderBy(...columns: (SingleStoreColumn | SQL | SQL.Aliased)[]): SingleStoreDeleteWithout<this, TDynamic, 'orderBy'>;
limit(limit: number | Placeholder): SingleStoreDeleteWithout<this, TDynamic, 'limit'>;
toSQL(): Query;
prepare(): SingleStoreDeletePrepare<this>;
execute: ReturnType<this['prepare']>['execute'];
private createIterator;
iterator: ReturnType<this["prepare"]>["iterator"];
$dynamic(): SingleStoreDeleteDynamic<this>;
}
export {};
@ -0,0 +1,89 @@
# has-flag [![Build Status](https://travis-ci.org/sindresorhus/has-flag.svg?branch=master)](https://travis-ci.org/sindresorhus/has-flag)

> Check if [`argv`](https://nodejs.org/docs/latest/api/process.html#process_process_argv) has a specific flag

Correctly stops looking after an `--` argument terminator.

---

<div align="center">
<b>
<a href="https://tidelift.com/subscription/pkg/npm-has-flag?utm_source=npm-has-flag&utm_medium=referral&utm_campaign=readme">Get professional support for this package with a Tidelift subscription</a>
</b>
<br>
<sub>
Tidelift helps make open source sustainable for maintainers while giving companies<br>assurances about security, maintenance, and licensing for their dependencies.
</sub>
</div>

---


## Install

```
$ npm install has-flag
```


## Usage

```js
// foo.js
const hasFlag = require('has-flag');

hasFlag('unicorn');
//=> true

hasFlag('--unicorn');
//=> true

hasFlag('f');
//=> true

hasFlag('-f');
//=> true

hasFlag('foo=bar');
//=> true

hasFlag('foo');
//=> false

hasFlag('rainbow');
//=> false
```

```
$ node foo.js -f --unicorn --foo=bar -- --rainbow
```


## API

### hasFlag(flag, [argv])

Returns a boolean for whether the flag exists.

#### flag

Type: `string`

CLI flag to look for. The `--` prefix is optional.

#### argv

Type: `string[]`<br>
Default: `process.argv`

CLI arguments.


## Security

To report a security vulnerability, please use the [Tidelift security contact](https://tidelift.com/security). Tidelift will coordinate the fix and disclosure.


## License

MIT © [Sindre Sorhus](https://sindresorhus.com)
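The README above describes two behaviors worth making concrete: the `--`/`-` prefix is optional, and matching stops at the `--` terminator. A short sketch of that documented behavior (an illustration, not necessarily the package's exact source):

```javascript
// Sketch of has-flag's documented behavior (illustrative; may differ from the real module).
function hasFlag(flag, argv = process.argv) {
  // Add the conventional prefix unless the caller already supplied one.
  const prefix = flag.startsWith('-') ? '' : (flag.length === 1 ? '-' : '--');
  const position = argv.indexOf(prefix + flag);
  const terminatorPosition = argv.indexOf('--');
  // Flags appearing after the `--` terminator do not count.
  return position !== -1 && (terminatorPosition === -1 || position < terminatorPosition);
}
```

For example, `hasFlag('rainbow', ['--', '--rainbow'])` is `false` because the flag only appears after the terminator.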
@ -0,0 +1 @@
{"version":3,"sources":["../../../src/gel-core/query-builders/raw.ts"],"sourcesContent":["import { entityKind } from '~/entity.ts';\nimport { QueryPromise } from '~/query-promise.ts';\nimport type { RunnableQuery } from '~/runnable-query.ts';\nimport type { PreparedQuery } from '~/session.ts';\nimport type { Query, SQL, SQLWrapper } from '~/sql/sql.ts';\n\nexport interface GelRaw<TResult> extends QueryPromise<TResult>, RunnableQuery<TResult, 'gel'>, SQLWrapper {}\n\nexport class GelRaw<TResult> extends QueryPromise<TResult>\n\timplements RunnableQuery<TResult, 'gel'>, SQLWrapper, PreparedQuery\n{\n\tstatic override readonly [entityKind]: string = 'GelRaw';\n\n\tdeclare readonly _: {\n\t\treadonly dialect: 'gel';\n\t\treadonly result: TResult;\n\t};\n\n\tconstructor(\n\t\tpublic execute: () => Promise<TResult>,\n\t\tprivate sql: SQL,\n\t\tprivate query: Query,\n\t\tprivate mapBatchResult: (result: unknown) => unknown,\n\t) {\n\t\tsuper();\n\t}\n\n\t/** @internal */\n\tgetSQL() {\n\t\treturn this.sql;\n\t}\n\n\tgetQuery() {\n\t\treturn this.query;\n\t}\n\n\tmapResult(result: unknown, isFromBatch?: boolean) {\n\t\treturn isFromBatch ? this.mapBatchResult(result) : result;\n\t}\n\n\t_prepare(): PreparedQuery {\n\t\treturn this;\n\t}\n\n\t/** @internal */\n\tisResponseInArrayMode() {\n\t\treturn false;\n\t}\n}\n"],"mappings":"AAAA,SAAS,kBAAkB;AAC3B,SAAS,oBAAoB;AAOtB,MAAM,eAAwB,aAErC;AAAA,EAQC,YACQ,SACC,KACA,OACA,gBACP;AACD,UAAM;AALC;AACC;AACA;AACA;AAAA,EAGT;AAAA,EAdA,QAA0B,UAAU,IAAY;AAAA;AAAA,EAiBhD,SAAS;AACR,WAAO,KAAK;AAAA,EACb;AAAA,EAEA,WAAW;AACV,WAAO,KAAK;AAAA,EACb;AAAA,EAEA,UAAU,QAAiB,aAAuB;AACjD,WAAO,cAAc,KAAK,eAAe,MAAM,IAAI;AAAA,EACpD;AAAA,EAEA,WAA0B;AACzB,WAAO;AAAA,EACR;AAAA;AAAA,EAGA,wBAAwB;AACvB,WAAO;AAAA,EACR;AACD;","names":[]}
@ -0,0 +1 @@
{"version":3,"file":"webSocket.js","sourceRoot":"","sources":["../../../../../src/internal/observable/dom/webSocket.ts"],"names":[],"mappings":";;;AAAA,uDAA8E;AA8J9E,SAAgB,SAAS,CAAI,iBAAqD;IAChF,OAAO,IAAI,mCAAgB,CAAI,iBAAiB,CAAC,CAAC;AACpD,CAAC;AAFD,8BAEC"}
@ -0,0 +1,53 @@
import { AsyncScheduler } from './AsyncScheduler';
/**
*
* Async Scheduler
*
* <span class="informal">Schedule task as if you used setTimeout(task, duration)</span>
*
* `async` scheduler schedules tasks asynchronously, by putting them on the JavaScript
* event loop queue. It is best used to delay tasks in time or to schedule tasks repeating
* in intervals.
*
* If you just want to "defer" task, that is to perform it right after currently
* executing synchronous code ends (commonly achieved by `setTimeout(deferredTask, 0)`),
* better choice will be the {@link asapScheduler} scheduler.
*
* ## Examples
* Use async scheduler to delay task
* ```ts
* import { asyncScheduler } from 'rxjs';
*
* const task = () => console.log('it works!');
*
* asyncScheduler.schedule(task, 2000);
*
* // After 2 seconds logs:
* // "it works!"
* ```
*
* Use async scheduler to repeat task in intervals
* ```ts
* import { asyncScheduler } from 'rxjs';
*
* function task(state) {
* console.log(state);
* this.schedule(state + 1, 1000); // `this` references currently executing Action,
* // which we reschedule with new state and delay
* }
*
* asyncScheduler.schedule(task, 3000, 0);
*
* // Logs:
* // 0 after 3s
* // 1 after 4s
* // 2 after 5s
* // 3 after 6s
* ```
*/
export declare const asyncScheduler: AsyncScheduler;
/**
* @deprecated Renamed to {@link asyncScheduler}. Will be removed in v8.
*/
export declare const async: AsyncScheduler;
//# sourceMappingURL=async.d.ts.map
@ -0,0 +1,12 @@
"use strict";
|
||||||
|
Object.defineProperty(exports, "__esModule", { value: true });
|
||||||
|
exports.subscribeOn = void 0;
|
||||||
|
var lift_1 = require("../util/lift");
|
||||||
|
function subscribeOn(scheduler, delay) {
|
||||||
|
if (delay === void 0) { delay = 0; }
|
||||||
|
return lift_1.operate(function (source, subscriber) {
|
||||||
|
subscriber.add(scheduler.schedule(function () { return source.subscribe(subscriber); }, delay));
|
||||||
|
});
|
||||||
|
}
|
||||||
|
exports.subscribeOn = subscribeOn;
|
||||||
|
//# sourceMappingURL=subscribeOn.js.map
|
||||||
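The operator above hands the act of subscribing to a scheduler instead of subscribing synchronously. A minimal self-contained sketch of that idea, using hypothetical stand-ins (`fakeScheduler`, `fakeSource`) rather than RxJS's real classes:

```javascript
// Sketch of the subscribeOn idea above: the subscription itself is queued
// on a scheduler instead of happening synchronously.
// `fakeScheduler` and `fakeSource` are hypothetical stand-ins, not RxJS APIs.
const log = [];

const fakeScheduler = {
  // A synchronous "scheduler" that records when work is queued, then runs it.
  schedule(work, delay) {
    log.push(`scheduled (delay=${delay})`);
    work();
  },
};

const fakeSource = {
  subscribe(subscriber) {
    log.push("subscribed");
    subscriber.next("value");
  },
};

function subscribeOn(source, scheduler, delay = 0) {
  return {
    subscribe(subscriber) {
      // Nothing touches `source` until the scheduler runs the queued work.
      scheduler.schedule(() => source.subscribe(subscriber), delay);
    },
  };
}

subscribeOn(fakeSource, fakeScheduler).subscribe({
  next: (v) => log.push(`next: ${v}`),
});

console.log(log); // [ 'scheduled (delay=0)', 'subscribed', 'next: value' ]
```

The real operator does the same thing via `lift_1.operate`, additionally adding the scheduled work to the subscriber so it is cancelled on unsubscribe.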
@ -0,0 +1,8 @@
import { Observable } from '../Observable';
import { innerFrom } from './innerFrom';
export function defer(observableFactory) {
    return new Observable((subscriber) => {
        innerFrom(observableFactory()).subscribe(subscriber);
    });
}
//# sourceMappingURL=defer.js.map
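The key property of `defer` above is laziness: the factory runs once per subscription, not when the observable is created. A self-contained sketch of that pattern, using a hypothetical `MiniObservable` stand-in instead of RxJS's Observable:

```javascript
// Sketch of the deferred-factory pattern implemented by `defer` above.
// `MiniObservable` is a hypothetical minimal stand-in for RxJS's Observable.
class MiniObservable {
  constructor(subscribeFn) {
    this._subscribeFn = subscribeFn;
  }
  subscribe(observer) {
    this._subscribeFn(observer);
  }
}

function defer(factory) {
  // Creation is cheap: nothing runs until subscribe() is called.
  return new MiniObservable((observer) => {
    factory().subscribe(observer);
  });
}

let calls = 0;
const deferred = defer(() => {
  calls += 1; // side effect happens per subscription
  return new MiniObservable((observer) => observer.next(calls));
});

console.log(calls); // 0 — factory not invoked yet
const seen = [];
deferred.subscribe({ next: (v) => seen.push(v) });
deferred.subscribe({ next: (v) => seen.push(v) });
console.log(calls, seen); // 2 [ 1, 2 ]
```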
@@ -0,0 +1,261 @@
"use strict";
var __create = Object.create;
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __getProtoOf = Object.getPrototypeOf;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
  // If the importer is in node compatibility mode or this is not an ESM
  // file that has been converted to a CommonJS file using a Babel-
  // compatible transform (i.e. "__esModule" has not been set), then set
  // "default" to the CommonJS "module.exports" for node compatibility.
  isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
  mod
));
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var firefoxPrefs_exports = {};
__export(firefoxPrefs_exports, {
  createProfile: () => createProfile
});
module.exports = __toCommonJS(firefoxPrefs_exports);
var import_fs = __toESM(require("fs"));
var import_path = __toESM(require("path"));
/**
 * @license
 * Copyright 2023 Google Inc.
 * SPDX-License-Identifier: Apache-2.0
 */
async function createProfile(options) {
  if (!import_fs.default.existsSync(options.path)) {
    await import_fs.default.promises.mkdir(options.path, {
      recursive: true
    });
  }
  await writePreferences({
    preferences: {
      ...defaultProfilePreferences(options.preferences),
      ...options.preferences
    },
    path: options.path
  });
}
function defaultProfilePreferences(extraPrefs) {
  const server = "dummy.test";
  const defaultPrefs = {
    // Make sure Shield doesn't hit the network.
    "app.normandy.api_url": "",
    // Disable Firefox old build background check
    "app.update.checkInstallTime": false,
    // Disable automatically upgrading Firefox
    "app.update.disabledForTesting": true,
    // Increase the APZ content response timeout to 1 minute
    "apz.content_response_timeout": 6e4,
    // Prevent various error messages on the console
    // jest-puppeteer asserts that no error message is emitted by the console
    "browser.contentblocking.features.standard": "-tp,tpPrivate,cookieBehavior0,-cm,-fp",
    // Enable the dump function: which sends messages to the system
    // console
    // https://bugzilla.mozilla.org/show_bug.cgi?id=1543115
    "browser.dom.window.dump.enabled": true,
    // Make sure newtab weather doesn't hit the network to retrieve weather data.
    "browser.newtabpage.activity-stream.discoverystream.region-weather-config": "",
    // Make sure newtab wallpapers don't hit the network to retrieve wallpaper data.
    "browser.newtabpage.activity-stream.newtabWallpapers.enabled": false,
    "browser.newtabpage.activity-stream.newtabWallpapers.v2.enabled": false,
    // Make sure Topsites doesn't hit the network to retrieve sponsored tiles.
    "browser.newtabpage.activity-stream.showSponsoredTopSites": false,
    // Disable topstories
    "browser.newtabpage.activity-stream.feeds.system.topstories": false,
    // Always display a blank page
    "browser.newtabpage.enabled": false,
    // Background thumbnails in particular cause grief, and disabling
    // thumbnails in general cannot hurt
    "browser.pagethumbnails.capturing_disabled": true,
    // Disable safebrowsing components.
    "browser.safebrowsing.blockedURIs.enabled": false,
    "browser.safebrowsing.downloads.enabled": false,
    "browser.safebrowsing.malware.enabled": false,
    "browser.safebrowsing.phishing.enabled": false,
    // Disable updates to search engines.
    "browser.search.update": false,
    // Do not restore the last open set of tabs if the browser has crashed
    "browser.sessionstore.resume_from_crash": false,
    // Skip check for default browser on startup
    "browser.shell.checkDefaultBrowser": false,
    // Disable newtabpage
    "browser.startup.homepage": "about:blank",
    // Do not redirect user when a milestone upgrade of Firefox is detected
    "browser.startup.homepage_override.mstone": "ignore",
    // Start with a blank page about:blank
    "browser.startup.page": 0,
    // Do not allow background tabs to be zombified on Android, otherwise for
    // tests that open additional tabs, the test harness tab itself might get
    // unloaded
    "browser.tabs.disableBackgroundZombification": false,
    // Do not warn when closing all other open tabs
    "browser.tabs.warnOnCloseOtherTabs": false,
    // Do not warn when multiple tabs will be opened
    "browser.tabs.warnOnOpen": false,
    // Do not automatically offer translations, as tests do not expect this.
    "browser.translations.automaticallyPopup": false,
    // Disable the UI tour.
    "browser.uitour.enabled": false,
    // Turn off search suggestions in the location bar so as not to trigger
    // network connections.
    "browser.urlbar.suggest.searches": false,
    // Disable first run splash page on Windows 10
    "browser.usedOnWindows10.introURL": "",
    // Do not warn on quitting Firefox
    "browser.warnOnQuit": false,
    // Defensively disable data reporting systems
    "datareporting.healthreport.documentServerURI": `http://${server}/dummy/healthreport/`,
    "datareporting.healthreport.logging.consoleEnabled": false,
    "datareporting.healthreport.service.enabled": false,
    "datareporting.healthreport.service.firstRun": false,
    "datareporting.healthreport.uploadEnabled": false,
    // Do not show datareporting policy notifications which can interfere with tests
    "datareporting.policy.dataSubmissionEnabled": false,
    "datareporting.policy.dataSubmissionPolicyBypassNotification": true,
    // DevTools JSONViewer sometimes fails to load dependencies with its require.js.
    // This doesn't affect Puppeteer but spams console (Bug 1424372)
    "devtools.jsonview.enabled": false,
    // Disable popup-blocker
    "dom.disable_open_during_load": false,
    // Enable the support for File object creation in the content process
    // Required for |Page.setFileInputFiles| protocol method.
    "dom.file.createInChild": true,
    // Disable the ProcessHangMonitor
    "dom.ipc.reportProcessHangs": false,
    // Disable slow script dialogues
    "dom.max_chrome_script_run_time": 0,
    "dom.max_script_run_time": 0,
    // Disable background timer throttling to allow tests to run in parallel
    // without a decrease in performance.
    "dom.min_background_timeout_value": 0,
    "dom.min_background_timeout_value_without_budget_throttling": 0,
    "dom.timeout.enable_budget_timer_throttling": false,
    // Disable HTTPS-First upgrades
    "dom.security.https_first": false,
    // Only load extensions from the application and user profile
    // AddonManager.SCOPE_PROFILE + AddonManager.SCOPE_APPLICATION
    "extensions.autoDisableScopes": 0,
    "extensions.enabledScopes": 5,
    // Disable metadata caching for installed add-ons by default
    "extensions.getAddons.cache.enabled": false,
    // Disable installing any distribution extensions or add-ons.
    "extensions.installDistroAddons": false,
    // Disabled screenshots extension
    "extensions.screenshots.disabled": true,
    // Turn off extension updates so they do not bother tests
    "extensions.update.enabled": false,
    // Turn off extension updates so they do not bother tests
    "extensions.update.notifyUser": false,
    // Make sure opening about:addons will not hit the network
    "extensions.webservice.discoverURL": `http://${server}/dummy/discoveryURL`,
    // Allow the application to have focus even when it runs in the background
    "focusmanager.testmode": true,
    // Disable useragent updates
    "general.useragent.updates.enabled": false,
    // Always use network provider for geolocation tests so we bypass the
    // macOS dialog raised by the corelocation provider
    "geo.provider.testing": true,
    // Do not scan Wifi
    "geo.wifi.scan": false,
    // No hang monitor
    "hangmonitor.timeout": 0,
    // Show chrome errors and warnings in the error console
    "javascript.options.showInConsole": true,
    // Do not throttle rendering (requestAnimationFrame) in background tabs
    "layout.testing.top-level-always-active": true,
    // Disable download and usage of OpenH264, and Widevine plugins
    "media.gmp-manager.updateEnabled": false,
    // Disable the GFX sanity window
    "media.sanity-test.disabled": true,
    // Disable connectivity service pings
    "network.connectivity-service.enabled": false,
    // Disable experimental feature that is only available in Nightly
    "network.cookie.sameSite.laxByDefault": false,
    // Do not prompt for temporary redirects
    "network.http.prompt-temp-redirect": false,
    // Disable speculative connections so they are not reported as leaking
    // when they are hanging around
    "network.http.speculative-parallel-limit": 0,
    // Do not automatically switch between offline and online
    "network.manage-offline-status": false,
    // Make sure SNTP requests do not hit the network
    "network.sntp.pools": server,
    // Disable Flash.
    "plugin.state.flash": 0,
    "privacy.trackingprotection.enabled": false,
    // Can be removed once Firefox 89 is no longer supported
    // https://bugzilla.mozilla.org/show_bug.cgi?id=1710839
    "remote.enabled": true,
    // Don't do network connections for mitm priming
    "security.certerrors.mitm.priming.enabled": false,
    // Local documents have access to all other local documents,
    // including directory listings
    "security.fileuri.strict_origin_policy": false,
    // Do not wait for the notification button security delay
    "security.notification_enable_delay": 0,
    // Do not automatically fill sign-in forms with known usernames and
    // passwords
    "signon.autofillForms": false,
    // Disable password capture, so that tests that include forms are not
    // influenced by the presence of the persistent doorhanger notification
    "signon.rememberSignons": false,
    // Disable first-run welcome page
    "startup.homepage_welcome_url": "about:blank",
    // Disable first-run welcome page
    "startup.homepage_welcome_url.additional": "",
    // Disable browser animations (tabs, fullscreen, sliding alerts)
    "toolkit.cosmeticAnimations.enabled": false,
    // Prevent starting into safe mode after application crashes
    "toolkit.startup.max_resumed_crashes": -1,
    // Enable TestUtils
    "dom.testing.testutils.enabled": true
  };
  return Object.assign(defaultPrefs, extraPrefs);
}
async function writePreferences(options) {
  const prefsPath = import_path.default.join(options.path, "prefs.js");
  const lines = Object.entries(options.preferences).map(([key, value]) => {
    return `user_pref(${JSON.stringify(key)}, ${JSON.stringify(value)});`;
  });
  const result = await Promise.allSettled([
    import_fs.default.promises.writeFile(import_path.default.join(options.path, "user.js"), lines.join("\n")),
    // Create a backup of the preferences file if it already exists.
    import_fs.default.promises.access(prefsPath, import_fs.default.constants.F_OK).then(
      async () => {
        await import_fs.default.promises.copyFile(
          prefsPath,
          import_path.default.join(options.path, "prefs.js.playwright")
        );
      },
      // Swallow only if file does not exist
      () => {
      }
    )
  ]);
  for (const command of result) {
    if (command.status === "rejected") {
      throw command.reason;
    }
  }
}
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  createProfile
});
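`writePreferences` above serializes each preference into Firefox's `user.js` format, one `user_pref(...)` line per key/value pair. A self-contained sketch of just that serialization step (the sample preferences below are illustrative, not the full default set):

```javascript
// Sketch of the user.js serialization used by writePreferences above:
// each key/value pair becomes one `user_pref(...)` line.
// The sample preferences are illustrative only.
const preferences = {
  "browser.shell.checkDefaultBrowser": false,
  "browser.startup.homepage": "about:blank",
  "dom.max_script_run_time": 0,
};

const lines = Object.entries(preferences).map(
  ([key, value]) => `user_pref(${JSON.stringify(key)}, ${JSON.stringify(value)});`
);

console.log(lines.join("\n"));
// user_pref("browser.shell.checkDefaultBrowser", false);
// user_pref("browser.startup.homepage", "about:blank");
// user_pref("dom.max_script_run_time", 0);
```

`JSON.stringify` does double duty here: it quotes the key and renders booleans, numbers, and strings in the exact literal syntax Firefox expects.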
@@ -0,0 +1,5 @@
const MySqlViewConfig = Symbol.for("drizzle:MySqlViewConfig");
export {
  MySqlViewConfig
};
//# sourceMappingURL=view-common.js.map
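`Symbol.for` matters in the snippet above: it pulls from the runtime-wide symbol registry, so every module that asks for the key `"drizzle:MySqlViewConfig"` receives the identical symbol. A quick illustration:

```javascript
// Symbol.for looks up the runtime-wide symbol registry, so two calls with
// the same key return the very same symbol — even from different modules.
const a = Symbol.for("drizzle:MySqlViewConfig");
const b = Symbol.for("drizzle:MySqlViewConfig");
console.log(a === b); // true

// A plain Symbol() is unique every time, even with the same description.
console.log(Symbol("drizzle:MySqlViewConfig") === Symbol("drizzle:MySqlViewConfig")); // false
```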
@@ -0,0 +1,23 @@
import type { ColumnBuilderBaseConfig } from "../../column-builder.js";
import type { ColumnBaseConfig } from "../../column.js";
import { entityKind } from "../../entity.js";
import { GelColumn } from "./common.js";
import { GelIntColumnBaseBuilder } from "./int.common.js";
export type GelIntegerBuilderInitial<TName extends string> = GelIntegerBuilder<{
    name: TName;
    dataType: 'number';
    columnType: 'GelInteger';
    data: number;
    driverParam: number;
    enumValues: undefined;
}>;
export declare class GelIntegerBuilder<T extends ColumnBuilderBaseConfig<'number', 'GelInteger'>> extends GelIntColumnBaseBuilder<T> {
    static readonly [entityKind]: string;
    constructor(name: T['name']);
}
export declare class GelInteger<T extends ColumnBaseConfig<'number', 'GelInteger'>> extends GelColumn<T> {
    static readonly [entityKind]: string;
    getSQLType(): string;
}
export declare function integer(): GelIntegerBuilderInitial<''>;
export declare function integer<TName extends string>(name: TName): GelIntegerBuilderInitial<TName>;
@@ -0,0 +1,4 @@
import type { ReverseSegment, SourceMapSegment } from './sourcemap-segment.mts';
export type Source = ReverseSegment[][];
export default function buildBySources(decoded: readonly SourceMapSegment[][], memos: unknown[]): Source[];
//# sourceMappingURL=by-source.d.ts.map
@@ -0,0 +1,749 @@
<h1 align="center">Picomatch</h1>

<p align="center">
  <a href="https://npmjs.org/package/picomatch">
    <img src="https://img.shields.io/npm/v/picomatch.svg" alt="version">
  </a>
  <a href="https://github.com/micromatch/picomatch/actions?workflow=Tests">
    <img src="https://github.com/micromatch/picomatch/workflows/Tests/badge.svg" alt="test status">
  </a>
  <a href="https://coveralls.io/github/micromatch/picomatch">
    <img src="https://img.shields.io/coveralls/github/micromatch/picomatch/master.svg" alt="coverage status">
  </a>
  <a href="https://npmjs.org/package/picomatch">
    <img src="https://img.shields.io/npm/dm/picomatch.svg" alt="downloads">
  </a>
</p>

<br>
<br>

<p align="center">
  <strong>Blazing fast and accurate glob matcher written in JavaScript.</strong><br>
  <em>No dependencies and full support for standard and extended Bash glob features, including braces, extglobs, POSIX brackets, and regular expressions.</em>
</p>

<br>
<br>
## Why picomatch?

* **Lightweight** - No dependencies
* **Minimal** - Tiny API surface. Main export is a function that takes a glob pattern and returns a matcher function.
* **Fast** - Loads in about 2ms (that's several times faster than a [single frame of a HD movie](http://www.endmemo.com/sconvert/framespersecondframespermillisecond.php) at 60fps)
* **Performant** - Use the returned matcher function to speed up repeat matching (like when watching files)
* **Accurate matching** - Using wildcards (`*` and `?`), globstars (`**`) for nested directories, [advanced globbing](#advanced-globbing) with extglobs, braces, and POSIX brackets, and support for escaping special characters with `\` or quotes.
* **Well tested** - Thousands of unit tests

See the [library comparison](#library-comparisons) to other libraries.

<br>
<br>
## Table of Contents

<details><summary> Click to expand </summary>

- [Install](#install)
- [Usage](#usage)
- [API](#api)
  * [picomatch](#picomatch)
  * [.test](#test)
  * [.matchBase](#matchbase)
  * [.isMatch](#ismatch)
  * [.parse](#parse)
  * [.scan](#scan)
  * [.compileRe](#compilere)
  * [.makeRe](#makere)
  * [.toRegex](#toregex)
- [Options](#options)
  * [Picomatch options](#picomatch-options)
  * [Scan Options](#scan-options)
  * [Options Examples](#options-examples)
- [Globbing features](#globbing-features)
  * [Basic globbing](#basic-globbing)
  * [Advanced globbing](#advanced-globbing)
  * [Braces](#braces)
  * [Matching special characters as literals](#matching-special-characters-as-literals)
- [Library Comparisons](#library-comparisons)
- [Benchmarks](#benchmarks)
- [Philosophies](#philosophies)
- [About](#about)
  * [Author](#author)
  * [License](#license)

_(TOC generated by [verb](https://github.com/verbose/verb) using [markdown-toc](https://github.com/jonschlinkert/markdown-toc))_

</details>

<br>
<br>
## Install

Install with [npm](https://www.npmjs.com/):

```sh
npm install --save picomatch
```

<br>

## Usage

The main export is a function that takes a glob pattern and an options object and returns a function for matching strings.

```js
const pm = require('picomatch');
const isMatch = pm('*.js');

console.log(isMatch('abcd')); //=> false
console.log(isMatch('a.js')); //=> true
console.log(isMatch('a.md')); //=> false
console.log(isMatch('a/b.js')); //=> false
```

<br>
## API

### [picomatch](lib/picomatch.js#L31)

Creates a matcher function from one or more glob patterns. The returned function takes a string to match as its first argument, and returns true if the string is a match. The returned matcher function also takes a boolean as the second argument that, when true, returns an object with additional information.

**Params**

* `globs` **{String|Array}**: One or more glob patterns.
* `options` **{Object=}**
* `returns` **{Function=}**: Returns a matcher function.

**Example**

```js
const picomatch = require('picomatch');
// picomatch(glob[, options]);

const isMatch = picomatch('*.!(*a)');
console.log(isMatch('a.a')); //=> false
console.log(isMatch('a.b')); //=> true
```

**Example without node.js**

For environments without `node.js`, `picomatch/posix` provides you a dependency-free matcher, without automatic OS detection.

```js
const picomatch = require('picomatch/posix');
// the same API, defaulting to posix paths
const isMatch = picomatch('a/*');
console.log(isMatch('a\\b')); //=> false
console.log(isMatch('a/b')); //=> true

// you can still configure the matcher function to accept windows paths
const isMatchWindows = picomatch('a/*', { windows: true });
console.log(isMatchWindows('a\\b')); //=> true
console.log(isMatchWindows('a/b')); //=> true
```
### [.test](lib/picomatch.js#L116)

Test `input` with the given `regex`. This is used by the main `picomatch()` function to test the input string.

**Params**

* `input` **{String}**: String to test.
* `regex` **{RegExp}**
* `returns` **{Object}**: Returns an object with matching info.

**Example**

```js
const picomatch = require('picomatch');
// picomatch.test(input, regex[, options]);

console.log(picomatch.test('foo/bar', /^(?:([^/]*?)\/([^/]*?))$/));
// { isMatch: true, match: [ 'foo/', 'foo', 'bar' ], output: 'foo/bar' }
```

### [.matchBase](lib/picomatch.js#L160)

Match the basename of a filepath.

**Params**

* `input` **{String}**: String to test.
* `glob` **{RegExp|String}**: Glob pattern or regex created by [.makeRe](#makeRe).
* `returns` **{Boolean}**

**Example**

```js
const picomatch = require('picomatch');
// picomatch.matchBase(input, glob[, options]);
console.log(picomatch.matchBase('foo/bar.js', '*.js')); // true
```
### [.isMatch](lib/picomatch.js#L182)

Returns true if **any** of the given glob `patterns` match the specified `string`.

**Params**

* `str` **{String|Array}**: The string to test.
* `patterns` **{String|Array}**: One or more glob patterns to use for matching.
* `options` **{Object}**: See available [options](#options).
* `returns` **{Boolean}**: Returns true if any patterns match `str`

**Example**

```js
const picomatch = require('picomatch');
// picomatch.isMatch(string, patterns[, options]);

console.log(picomatch.isMatch('a.a', ['b.*', '*.a'])); //=> true
console.log(picomatch.isMatch('a.a', 'b.*')); //=> false
```

### [.parse](lib/picomatch.js#L198)

Parse a glob pattern to create the source string for a regular expression.

**Params**

* `pattern` **{String}**
* `options` **{Object}**
* `returns` **{Object}**: Returns an object with useful properties and output to be used as a regex source string.

**Example**

```js
const picomatch = require('picomatch');
const result = picomatch.parse(pattern[, options]);
```
### [.scan](lib/picomatch.js#L230)

Scan a glob pattern to separate the pattern into segments.

**Params**

* `input` **{String}**: Glob pattern to scan.
* `options` **{Object}**
* `returns` **{Object}**: Returns an object with details about the scanned pattern.

**Example**

```js
const picomatch = require('picomatch');
// picomatch.scan(input[, options]);

const result = picomatch.scan('!./foo/*.js');
console.log(result);
{ prefix: '!./',
  input: '!./foo/*.js',
  start: 3,
  base: 'foo',
  glob: '*.js',
  isBrace: false,
  isBracket: false,
  isGlob: true,
  isExtglob: false,
  isGlobstar: false,
  negated: true }
```
### [.compileRe](lib/picomatch.js#L244)

Compile a regular expression from the `state` object returned by the
[parse()](#parse) method.

**Params**

* `state` **{Object}**
* `options` **{Object}**
* `returnOutput` **{Boolean}**: Intended for implementors, this argument allows you to return the raw output from the parser.
* `returnState` **{Boolean}**: Adds the state to a `state` property on the returned regex. Useful for implementors and debugging.
* `returns` **{RegExp}**

**Example**

```js
const picomatch = require('picomatch');
const state = picomatch.parse('*.js');
// picomatch.compileRe(state[, options]);

console.log(picomatch.compileRe(state));
//=> /^(?:(?!\.)(?=.)[^/]*?\.js)$/
```
### [.makeRe](lib/picomatch.js#L285)

Create a regular expression from a parsed glob pattern.

**Params**

* `state` **{String}**: The object returned from the `.parse` method.
* `options` **{Object}**
* `returnOutput` **{Boolean}**: Implementors may use this argument to return the compiled output, instead of a regular expression. This is not exposed on the options to prevent end-users from mutating the result.
* `returnState` **{Boolean}**: Implementors may use this argument to return the state from the parsed glob with the returned regular expression.
* `returns` **{RegExp}**: Returns a regex created from the given pattern.

**Example**

```js
const picomatch = require('picomatch');
// picomatch.makeRe(state[, options]);

const result = picomatch.makeRe('*.js');
console.log(result);
//=> /^(?:(?!\.)(?=.)[^/]*?\.js)$/
```
### [.toRegex](lib/picomatch.js#L320)

Create a regular expression from the given regex source string.

**Params**

* `source` **{String}**: Regular expression source string.
* `options` **{Object}**
* `returns` **{RegExp}**

**Example**

```js
const picomatch = require('picomatch');
// picomatch.toRegex(source[, options]);

const { output } = picomatch.parse('*.js');
console.log(picomatch.toRegex(output));
//=> /^(?:(?!\.)(?=.)[^/]*?\.js)$/
```

<br>

## Options

### Picomatch options

The following options may be used with the main `picomatch()` function or any of the methods on the picomatch API.

| **Option** | **Type** | **Default value** | **Description** |
| --- | --- | --- | --- |
| `basename` | `boolean` | `false` | If set, then patterns without slashes will be matched against the basename of the path if it contains slashes. For example, `a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`. |
| `bash` | `boolean` | `false` | Follow bash matching rules more strictly - disallows backslashes as escape characters, and treats single stars as globstars (`**`). |
| `capture` | `boolean` | `undefined` | Return regex matches in supporting methods. |
| `contains` | `boolean` | `undefined` | Allows glob to match any part of the given string(s). |
| `debug` | `boolean` | `undefined` | Debug regular expressions when an error is thrown. |
| `dot` | `boolean` | `false` | Enable dotfile matching. By default, dotfiles are ignored unless a `.` is explicitly defined in the pattern, or `options.dot` is true |
| `expandRange` | `function` | `undefined` | Custom function for expanding ranges in brace patterns, such as `{a..z}`. The function receives the range values as two arguments, and it must return a string to be used in the generated regex. It's recommended that returned strings be wrapped in parentheses. |
| `fastpaths` | `boolean` | `true` | To speed up processing, full parsing is skipped for a handful common glob patterns. Disable this behavior by setting this option to `false`. |
| `flags` | `string` | `undefined` | Regex flags to use in the generated regex. If defined, the `nocase` option will be overridden. |
| [format](#optionsformat) | `function` | `undefined` | Custom function for formatting the returned string. This is useful for removing leading slashes, converting Windows paths to Posix paths, etc. |
| `ignore` | `array\|string` | `undefined` | One or more glob patterns for excluding strings that should not be matched from the result. |
| `keepQuotes` | `boolean` | `false` | Retain quotes in the generated regex, since quotes may also be used as an alternative to backslashes. |
| `literalBrackets` | `boolean` | `undefined` | When `true`, brackets in the glob pattern will be escaped so that only literal brackets will be matched. |
| `matchBase` | `boolean` | `false` | Alias for `basename` |
| `maxLength` | `number` | `65536` | Limit the max length of the input string. An error is thrown if the input string is longer than this value. |
| `maxExtglobRecursion` | `number\|boolean` | `0` | Limit nested quantified extglobs and other risky repeated extglob forms. When the limit is exceeded, the extglob is treated as a literal string instead of being compiled to regex. Set to `false` to disable this safeguard. |
| `nobrace` | `boolean` | `false` | Disable brace matching, so that `{a,b}` and `{1..3}` would be treated as literal characters. |
| `nobracket` | `boolean` | `undefined` | Disable matching with regex brackets. |
| `nocase` | `boolean` | `false` | Make matching case-insensitive. Equivalent to the regex `i` flag. Note that this option is overridden by the `flags` option. |
| `noext` | `boolean` | `false` | Alias for `noextglob` |
| `noextglob` | `boolean` | `false` | Disable support for matching with extglobs (like `+(a\|b)`) |
| `noglobstar` | `boolean` | `false` | Disable support for matching nested directories with globstars (`**`) |
| `nonegate` | `boolean` | `false` | Disable support for negating with leading `!` |
| [onIgnore](#optionsonIgnore) | `function` | `undefined` | Function to be called on ignored items. |
| [onMatch](#optionsonMatch) | `function` | `undefined` | Function to be called on matched items. |
| [onResult](#optionsonResult) | `function` | `undefined` | Function to be called on all items, regardless of whether or not they are matched or ignored. |
| `posix` | `boolean` | `false` | Support POSIX character classes ("posix brackets"). |
| `prepend` | `boolean` | `undefined` | String to prepend to the generated regex used for matching. |
| `regex` | `boolean` | `false` | Use regular expression rules for `+` (instead of matching literal `+`), and for stars that follow closing parentheses or brackets (as in `)*` and `]*`). |
| `strictBrackets` | `boolean` | `undefined` | Throw an error if brackets, braces, or parens are imbalanced. |
| `strictSlashes` | `boolean` | `undefined` | When true, picomatch won't match trailing slashes with single stars. |
| `unescape` | `boolean` | `undefined` | Remove backslashes preceding escaped characters in the glob pattern. By default, backslashes are retained. |
|
||||||
|
| `windows` | `boolean` | `false` | Also accept backslashes as the path separator. |
|
||||||
|
|
||||||
|
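
As a rough mental model for the `dot` option, `*` compiles to a regex with a leading "no dot" guard unless dotfile matching is enabled. The regexes below are an assumed simplification, not picomatch's literal output:

```javascript
// Sketch: the `dot` option controls whether `*` can match dotfiles.
// These regexes are assumed simplifications of picomatch's compiled output.
const star = (dot) => (dot ? /^[^/]*$/ : /^(?!\.)[^/]*$/);

console.log(star(false).test('.env'));   // false — dotfiles ignored by default
console.log(star(true).test('.env'));    // true  — `{ dot: true }` allows them
console.log(star(false).test('app.js')); // true
```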
### Scan Options

In addition to the main [picomatch options](#picomatch-options), the following options may also be used with the [.scan](#scan) method.

| **Option** | **Type** | **Default value** | **Description** |
| --- | --- | --- | --- |
| `tokens` | `boolean` | `false` | When `true`, the returned object will include an array of tokens (objects), representing each path "segment" in the scanned glob pattern. |
| `parts` | `boolean` | `false` | When `true`, the returned object will include an array of strings representing each path "segment" in the scanned glob pattern. This is automatically enabled when `options.tokens` is `true`. |

**Example**

```js
const picomatch = require('picomatch');
const result = picomatch.scan('!./foo/*.js', { tokens: true });
console.log(result);
// {
//   prefix: '!./',
//   input: '!./foo/*.js',
//   start: 3,
//   base: 'foo',
//   glob: '*.js',
//   isBrace: false,
//   isBracket: false,
//   isGlob: true,
//   isExtglob: false,
//   isGlobstar: false,
//   negated: true,
//   maxDepth: 2,
//   tokens: [
//     { value: '!./', depth: 0, isGlob: false, negated: true, isPrefix: true },
//     { value: 'foo', depth: 1, isGlob: false },
//     { value: '*.js', depth: 1, isGlob: true }
//   ],
//   slashes: [ 2, 6 ],
//   parts: [ 'foo', '*.js' ]
// }
```

<br>
### Options Examples

#### options.expandRange

**Type**: `function`

**Default**: `undefined`

Custom function for expanding ranges in brace patterns. The [fill-range](https://github.com/jonschlinkert/fill-range) library is ideal for this purpose, or you can use custom code to do whatever you need.

**Example**

The following example shows how to create a glob that matches a range of numbered folders, from `foo/01/bar` through `foo/25/bar`:

```js
const pm = require('picomatch');
const fill = require('fill-range');

const regex = pm.makeRe('foo/{01..25}/bar', {
  expandRange(a, b) {
    return `(${fill(a, b, { toRegex: true })})`;
  }
});

console.log(regex);
//=> /^(?:foo\/((?:0[1-9]|1[0-9]|2[0-5]))\/bar)$/

console.log(regex.test('foo/00/bar')) // false
console.log(regex.test('foo/01/bar')) // true
console.log(regex.test('foo/10/bar')) // true
console.log(regex.test('foo/22/bar')) // true
console.log(regex.test('foo/25/bar')) // true
console.log(regex.test('foo/26/bar')) // false
```
#### options.format

**Type**: `function`

**Default**: `undefined`

Custom function for formatting strings before they're matched.

**Example**

```js
// strip leading './' from strings
const format = str => str.replace(/^\.\//, '');
const isMatch = picomatch('foo/*.js', { format });
console.log(isMatch('./foo/bar.js')); //=> true
```

#### options.onMatch

Function to be called on each matched item.

```js
const onMatch = ({ glob, regex, input, output }) => {
  console.log({ glob, regex, input, output });
};

const isMatch = picomatch('*', { onMatch });
isMatch('foo');
isMatch('bar');
isMatch('baz');
```

#### options.onIgnore

Function to be called on each ignored item.

```js
const onIgnore = ({ glob, regex, input, output }) => {
  console.log({ glob, regex, input, output });
};

const isMatch = picomatch('*', { onIgnore, ignore: 'f*' });
isMatch('foo');
isMatch('bar');
isMatch('baz');
```

#### options.onResult

Function to be called on all items, regardless of whether they are matched or ignored.

```js
const onResult = ({ glob, regex, input, output }) => {
  console.log({ glob, regex, input, output });
};

const isMatch = picomatch('*', { onResult, ignore: 'f*' });
isMatch('foo');
isMatch('bar');
isMatch('baz');
```

<br>
<br>
## Globbing features

* [Basic globbing](#basic-globbing) (wildcard matching)
* [Advanced globbing](#advanced-globbing) (extglobs, POSIX brackets, brace matching)

### Basic globbing

| **Character** | **Description** |
| --- | --- |
| `*` | Matches any character zero or more times, excluding path separators. Does _not match_ path separators or hidden files or directories ("dotfiles"), unless explicitly enabled by setting the `dot` option to `true`. |
| `**` | Matches any character zero or more times, including path separators. Note that `**` will only match path separators (`/`, and `\\` with the `windows` option) when they are the only characters in a path segment. Thus, `foo**/bar` is equivalent to `foo*/bar`, `foo/a**b/bar` is equivalent to `foo/a*b/bar`, and _more than two_ consecutive stars in a glob path segment are regarded as _a single star_. Thus, `foo/***/bar` is equivalent to `foo/*/bar`. |
| `?` | Matches any character excluding path separators exactly one time. Does _not match_ path separators or leading dots. |
| `[abc]` | Matches any of the characters inside the brackets. For example, `[abc]` would match the characters `a`, `b` or `c`, and nothing else. |
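
As a rough mental model, each wildcard above corresponds to a regex fragment. These are assumed simplifications rather than picomatch's exact output:

```javascript
// Assumed regex simplifications of the basic wildcards.
const star = /^[^/]*$/;      // *  — any run of non-separator characters
const globstar = /^.*$/s;    // ** — may cross path separators
const qmark = /^[^/]$/;      // ?  — exactly one non-separator character
const klass = /^[abc]$/;     // [abc] — one character from the set

console.log(star.test('foo.js'));    // true
console.log(star.test('foo/bar'));   // false — `*` never crosses `/`
console.log(globstar.test('a/b/c')); // true
console.log(qmark.test('a'));        // true
console.log(klass.test('d'));        // false
```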
#### Matching behavior vs. Bash

Picomatch's matching features and expected results in unit tests are based on Bash's unit tests and the Bash 4.3 specification, with the following exceptions:

* Bash will match `foo/bar/baz` with `*`. Picomatch only matches nested directories with `**`.
* Bash greedily matches with negated extglobs. For example, Bash 4.3 says that `!(foo)*` should match `foo` and `foobar`, since the trailing `*` backtracks to match the preceding pattern. This is very memory-inefficient, and IMHO, also incorrect. Picomatch would return `false` for both `foo` and `foobar`.
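
Picomatch's stricter behavior for `!(foo)*` can be sketched with a leading negative lookahead (an assumed simplification, not picomatch's literal compiled output):

```javascript
// Sketch: treating `!(foo)*` roughly as "does not start with foo".
const re = /^(?!foo)[^/]*$/;

console.log(re.test('foo'));    // false
console.log(re.test('foobar')); // false (Bash 4.3 would match this)
console.log(re.test('bar'));    // true
```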
<br>
### Advanced globbing

* [extglobs](#extglobs)
* [POSIX brackets](#posix-brackets)
* [Braces](#braces)

#### Extglobs

| **Pattern** | **Description** |
| --- | --- |
| `@(pattern)` | Match _only one_ consecutive occurrence of `pattern` |
| `*(pattern)` | Match _zero or more_ consecutive occurrences of `pattern` |
| `+(pattern)` | Match _one or more_ consecutive occurrences of `pattern` |
| `?(pattern)` | Match _zero or one_ consecutive occurrence of `pattern` |
| `!(pattern)` | Match _anything but_ `pattern` |

**Examples**
```js
const pm = require('picomatch');

// *(pattern) matches ZERO or more of "pattern"
console.log(pm.isMatch('a', 'a*(z)')); // true
console.log(pm.isMatch('az', 'a*(z)')); // true
console.log(pm.isMatch('azzz', 'a*(z)')); // true

// +(pattern) matches ONE or more of "pattern"
console.log(pm.isMatch('a', 'a+(z)')); // false
console.log(pm.isMatch('az', 'a+(z)')); // true
console.log(pm.isMatch('azzz', 'a+(z)')); // true

// supports multiple extglobs
console.log(pm.isMatch('foo.bar', '!(foo).!(bar)')); // false

// supports nested extglobs
console.log(pm.isMatch('foo.bar', '!(!(foo)).!(!(bar))')); // true

// risky quantified extglobs are treated literally by default
console.log(pm.makeRe('+(a|aa)'));
//=> /^(?:\+\(a\|aa\))$/

// increase the limit to allow a small amount of nested quantified extglobs
console.log(pm.isMatch('aaa', '+(+(a))', { maxExtglobRecursion: 1 })); // true
```
#### POSIX brackets

POSIX character classes are disabled by default. Enable this feature by setting the `posix` option to `true`.

**Enable POSIX bracket support**

```js
console.log(pm.makeRe('[[:word:]]+', { posix: true }));
//=> /^(?:(?=.)[A-Za-z0-9_]+\/?)$/
```

**Supported POSIX classes**

The following named POSIX bracket expressions are supported:

* `[:alnum:]` - Alphanumeric characters, equivalent to `[a-zA-Z0-9]`.
* `[:alpha:]` - Alphabetical characters, equivalent to `[a-zA-Z]`.
* `[:ascii:]` - ASCII characters, equivalent to `[\\x00-\\x7F]`.
* `[:blank:]` - Space and tab characters, equivalent to `[ \\t]`.
* `[:cntrl:]` - Control characters, equivalent to `[\\x00-\\x1F\\x7F]`.
* `[:digit:]` - Numerical digits, equivalent to `[0-9]`.
* `[:graph:]` - Graph characters, equivalent to `[\\x21-\\x7E]`.
* `[:lower:]` - Lowercase letters, equivalent to `[a-z]`.
* `[:print:]` - Print characters, equivalent to `[\\x20-\\x7E ]`.
* `[:punct:]` - Punctuation and symbols, equivalent to `[\\-!"#$%&\'()\\*+,./:;<=>?@[\\]^_`{|}~]`.
* `[:space:]` - Extended space characters, equivalent to `[ \\t\\r\\n\\v\\f]`.
* `[:upper:]` - Uppercase letters, equivalent to `[A-Z]`.
* `[:word:]` - Word characters (letters, numbers and underscores), equivalent to `[A-Za-z0-9_]`.
* `[:xdigit:]` - Hexadecimal digits, equivalent to `[A-Fa-f0-9]`.

See the [Bash Reference Manual](https://www.gnu.org/software/bash/manual/html_node/Pattern-Matching.html) for more information.
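
Each class simply expands to the character set listed above. For example, a minimal standalone sketch of `[[:word:]]+` (not picomatch's full compiled output, which also handles path separators):

```javascript
// Sketch: per the table above, `[[:word:]]` expands to [A-Za-z0-9_].
const word = /^[A-Za-z0-9_]+$/;

console.log(word.test('foo_123')); // true
console.log(word.test('foo-bar')); // false — `-` is not a word character
```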
### Braces

Picomatch only does brace _matching_ of comma-delimited lists (e.g. `a/{b,c}/d`); it does not do full [brace expansion](https://www.gnu.org/software/bash/manual/html_node/Brace-Expansion.html). For advanced matching with braces, use [micromatch](https://github.com/micromatch/micromatch), which supports advanced syntax such as ranges (e.g. `{01..03}`) and increments (e.g. `{2..10..2}`).
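
A comma-delimited brace list behaves like regex alternation. A minimal sketch (an assumed simplification of the compiled regex):

```javascript
// Sketch: `a/{b,c}/d` matches like the alternation (b|c).
const re = /^a\/(?:b|c)\/d$/;

console.log(re.test('a/b/d')); // true
console.log(re.test('a/c/d')); // true
console.log(re.test('a/e/d')); // false
```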
### Matching special characters as literals

If you wish to match any of the following special characters in a filepath, they must be escaped with backslashes or quotes in your glob pattern.

**Special Characters**

Some characters that are used for matching in regular expressions are also regarded as valid file path characters on some platforms.

To match any of the following characters as literals, escape them in the pattern: `$^*+?()[]`

Examples:

```js
console.log(pm.makeRe('foo/bar \\(1\\)'));
```
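
The escaping step can be sketched in plain JavaScript (a simplification; picomatch handles this during pattern parsing):

```javascript
// Sketch: escape regex metacharacters so they match literally.
const escape = (str) => str.replace(/[$^*+?()[\]]/g, '\\$&');
const re = new RegExp(`^${escape('foo (1)')}$`);

console.log(re.test('foo (1)')); // true
console.log(re.test('foo 1'));   // false
```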
<br>
<br>

## Library Comparisons

The following table shows which features are supported by [minimatch](https://github.com/isaacs/minimatch), [micromatch](https://github.com/micromatch/micromatch), [picomatch](https://github.com/micromatch/picomatch), [nanomatch](https://github.com/micromatch/nanomatch), [extglob](https://github.com/micromatch/extglob), [braces](https://github.com/micromatch/braces), and [expand-brackets](https://github.com/micromatch/expand-brackets).

| **Feature** | `minimatch` | `micromatch` | `picomatch` | `nanomatch` | `extglob` | `braces` | `expand-brackets` |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Wildcard matching (`*?+`) | ✔ | ✔ | ✔ | ✔ | - | - | - |
| Advanced globbing | ✔ | ✔ | ✔ | - | - | - | - |
| Brace _matching_ | ✔ | ✔ | ✔ | - | - | ✔ | - |
| Brace _expansion_ | ✔ | ✔ | - | - | - | ✔ | - |
| Extglobs | partial | ✔ | ✔ | - | ✔ | - | - |
| Posix brackets | - | ✔ | ✔ | - | - | - | ✔ |
| Regular expression syntax | - | ✔ | ✔ | ✔ | ✔ | - | ✔ |
| File system operations | - | - | - | - | - | - | - |
<br>
<br>

## Benchmarks

Performance comparison of picomatch and minimatch.

```
# .makeRe star (*)
picomatch x 3,251,247 ops/sec ±0.25% (95 runs sampled)
minimatch x 497,224 ops/sec ±0.11% (100 runs sampled)

# .makeRe star; dot=true (*)
picomatch x 2,624,035 ops/sec ±0.16% (98 runs sampled)
minimatch x 446,244 ops/sec ±0.63% (99 runs sampled)

# .makeRe globstar (**)
picomatch x 2,524,465 ops/sec ±0.13% (99 runs sampled)
minimatch x 1,396,257 ops/sec ±0.58% (96 runs sampled)

# .makeRe globstars (**/**/**)
picomatch x 2,545,674 ops/sec ±0.10% (99 runs sampled)
minimatch x 1,196,835 ops/sec ±0.63% (98 runs sampled)

# .makeRe with leading star (*.txt)
picomatch x 2,537,708 ops/sec ±0.11% (100 runs sampled)
minimatch x 345,284 ops/sec ±0.64% (96 runs sampled)

# .makeRe - basic braces ({a,b,c}*.txt)
picomatch x 505,430 ops/sec ±1.04% (94 runs sampled)
minimatch x 107,991 ops/sec ±0.54% (99 runs sampled)

# .makeRe - short ranges ({a..z}*.txt)
picomatch x 371,179 ops/sec ±2.91% (77 runs sampled)
minimatch x 14,104 ops/sec ±0.61% (99 runs sampled)

# .makeRe - medium ranges ({1..100000}*.txt)
picomatch x 384,958 ops/sec ±1.70% (82 runs sampled)
minimatch x 2.55 ops/sec ±3.22% (11 runs sampled)

# .makeRe - long ranges ({1..10000000}*.txt)
picomatch x 382,552 ops/sec ±1.52% (71 runs sampled)
minimatch x 0.83 ops/sec ±5.67% (7 runs sampled)
```
<br>
<br>

## Philosophies

The goal of this library is to be blazing fast, without compromising on accuracy.

**Accuracy**

The number one goal of this library is accuracy. However, it's not unusual for different glob implementations to have different rules for matching behavior, even with simple wildcard matching. It gets increasingly more complicated when different features are combined, like when extglobs are combined with globstars, braces, slashes, and so on: `!(**/{a,b,*/c})`.

Thus, given that there is no canonical glob specification to use as a single source of truth when differences of opinion arise regarding behavior, sometimes we have to use our best judgement and rely on feedback from users to make improvements.

**Performance**

Although this library performs well in benchmarks, and in most cases it's faster than other popular libraries we benchmarked against, we will always choose accuracy over performance. It's not helpful to anyone if our library is faster at returning the wrong answer.
<br>
<br>

## About

<details>
<summary><strong>Contributing</strong></summary>

Pull requests and stars are always welcome. For bugs and feature requests, [please create an issue](../../issues/new).

Please read the [contributing guide](.github/contributing.md) for advice on opening issues, pull requests, and coding standards.

</details>

<details>
<summary><strong>Running Tests</strong></summary>

Running and reviewing unit tests is a great way to get familiarized with a library and its API. You can install dependencies and run tests with the following command:

```sh
npm install && npm test
```

</details>

<details>
<summary><strong>Building docs</strong></summary>

_(This project's readme.md is generated by [verb](https://github.com/verbose/verb-generate-readme), please don't edit the readme directly. Any changes to the readme must be made in the [.verb.md](.verb.md) readme template.)_

To generate the readme, run the following command:

```sh
npm install -g verbose/verb#dev verb-generate-readme && verb
```

</details>

### Author

**Jon Schlinkert**

* [GitHub Profile](https://github.com/jonschlinkert)
* [Twitter Profile](https://twitter.com/jonschlinkert)
* [LinkedIn Profile](https://linkedin.com/in/jonschlinkert)

### License

Copyright © 2017-present, [Jon Schlinkert](https://github.com/jonschlinkert).
Released under the [MIT License](LICENSE).
@@ -0,0 +1,49 @@
import { Observable } from '../Observable';
import { Unsubscribable, ObservableInput, ObservedValueOf } from '../types';
import { innerFrom } from './innerFrom';
import { EMPTY } from './empty';

/**
 * Creates an Observable that uses a resource which will be disposed at the same time as the Observable.
 *
 * <span class="informal">Use it when you catch yourself cleaning up after an Observable.</span>
 *
 * `using` is a factory operator, which accepts two functions. The first function returns a disposable resource.
 * It can be an arbitrary object that implements the `unsubscribe` method. The second function will be injected with
 * that object and should return an Observable. That Observable can use the resource object during its execution.
 * Both functions passed to `using` will be called every time someone subscribes - neither the Observable nor the
 * resource object will be shared in any way between subscriptions.
 *
 * When the Observable returned by `using` is subscribed, the Observable returned from the second function will be subscribed
 * as well. All its notifications (nexted values, completion and error events) will be emitted unchanged by the output
 * Observable. If however someone unsubscribes from the Observable, or the source Observable completes or errors by itself,
 * the `unsubscribe` method on the resource object will be called. This can be used to do any necessary clean up, which
 * otherwise would have to be handled by hand. Note that complete or error notifications are not emitted when someone
 * cancels subscription to an Observable via `unsubscribe`, so `using` can be used as a hook, allowing you to make
 * sure that all resources which need to exist during an Observable execution will be disposed at the appropriate time.
 *
 * @see {@link defer}
 *
 * @param resourceFactory A function which creates any resource object that implements the `unsubscribe` method.
 * @param observableFactory A function which creates an Observable, that can use the injected resource object.
 * @return An Observable that behaves the same as the Observable returned by `observableFactory`, but
 * which - when completed, errored or unsubscribed - will also call `unsubscribe` on the created resource object.
 */
export function using<T extends ObservableInput<any>>(
  resourceFactory: () => Unsubscribable | void,
  observableFactory: (resource: Unsubscribable | void) => T | void
): Observable<ObservedValueOf<T>> {
  return new Observable<ObservedValueOf<T>>((subscriber) => {
    const resource = resourceFactory();
    const result = observableFactory(resource);
    const source = result ? innerFrom(result) : EMPTY;
    source.subscribe(subscriber);
    return () => {
      // NOTE: Optional chaining did not work here.
      // Related TS Issue: https://github.com/microsoft/TypeScript/issues/40818
      if (resource) {
        resource.unsubscribe();
      }
    };
  });
}
@@ -0,0 +1,46 @@
import { PGlite, type PGliteOptions } from '@electric-sql/pglite';
import type { Cache } from "../cache/core/cache.cjs";
import { entityKind } from "../entity.cjs";
import type { Logger } from "../logger.cjs";
import { PgDatabase } from "../pg-core/db.cjs";
import { PgDialect } from "../pg-core/dialect.cjs";
import { type RelationalSchemaConfig, type TablesRelationalConfig } from "../relations.cjs";
import { type DrizzleConfig } from "../utils.cjs";
import type { PgliteClient, PgliteQueryResultHKT } from "./session.cjs";
import { PgliteSession } from "./session.cjs";
export interface PgDriverOptions {
    logger?: Logger;
    cache?: Cache;
}
export declare class PgliteDriver {
    private client;
    private dialect;
    private options;
    static readonly [entityKind]: string;
    constructor(client: PgliteClient, dialect: PgDialect, options?: PgDriverOptions);
    createSession(schema: RelationalSchemaConfig<TablesRelationalConfig> | undefined): PgliteSession<Record<string, unknown>, TablesRelationalConfig>;
}
export declare class PgliteDatabase<TSchema extends Record<string, unknown> = Record<string, never>> extends PgDatabase<PgliteQueryResultHKT, TSchema> {
    static readonly [entityKind]: string;
}
export declare function drizzle<TSchema extends Record<string, unknown> = Record<string, never>, TClient extends PGlite = PGlite>(...params: [] | [
    TClient | string
] | [
    TClient | string,
    DrizzleConfig<TSchema>
] | [
    (DrizzleConfig<TSchema> & ({
        connection?: (PGliteOptions & {
            dataDir?: string;
        }) | string;
    } | {
        client: TClient;
    }))
]): PgliteDatabase<TSchema> & {
    $client: TClient;
};
export declare namespace drizzle {
    function mock<TSchema extends Record<string, unknown> = Record<string, never>>(config?: DrizzleConfig<TSchema>): PgliteDatabase<TSchema> & {
        $client: '$client is not available on drizzle.mock()';
    };
}
@@ -0,0 +1,58 @@
import type { ColumnBuilderBaseConfig } from "../../../column-builder.js";
import type { ColumnBaseConfig } from "../../../column.js";
import { entityKind } from "../../../entity.js";
import { type Equal } from "../../../utils.js";
import { PgColumn, PgColumnBuilder } from "../common.js";
export type PgGeometryBuilderInitial<TName extends string> = PgGeometryBuilder<{
    name: TName;
    dataType: 'array';
    columnType: 'PgGeometry';
    data: [number, number];
    driverParam: string;
    enumValues: undefined;
}>;
export declare class PgGeometryBuilder<T extends ColumnBuilderBaseConfig<'array', 'PgGeometry'>> extends PgColumnBuilder<T> {
    static readonly [entityKind]: string;
    constructor(name: T['name']);
}
export declare class PgGeometry<T extends ColumnBaseConfig<'array', 'PgGeometry'>> extends PgColumn<T> {
    static readonly [entityKind]: string;
    getSQLType(): string;
    mapFromDriverValue(value: string): [number, number];
    mapToDriverValue(value: [number, number]): string;
}
export type PgGeometryObjectBuilderInitial<TName extends string> = PgGeometryObjectBuilder<{
    name: TName;
    dataType: 'json';
    columnType: 'PgGeometryObject';
    data: {
        x: number;
        y: number;
    };
    driverParam: string;
    enumValues: undefined;
}>;
export declare class PgGeometryObjectBuilder<T extends ColumnBuilderBaseConfig<'json', 'PgGeometryObject'>> extends PgColumnBuilder<T> {
    static readonly [entityKind]: string;
    constructor(name: T['name']);
}
export declare class PgGeometryObject<T extends ColumnBaseConfig<'json', 'PgGeometryObject'>> extends PgColumn<T> {
    static readonly [entityKind]: string;
    getSQLType(): string;
    mapFromDriverValue(value: string): {
        x: number;
        y: number;
    };
    mapToDriverValue(value: {
        x: number;
        y: number;
    }): string;
}
export interface PgGeometryConfig<T extends 'tuple' | 'xy' = 'tuple' | 'xy'> {
    mode?: T;
    type?: 'point' | (string & {});
    srid?: number;
}
export declare function geometry(): PgGeometryBuilderInitial<''>;
export declare function geometry<TMode extends PgGeometryConfig['mode'] & {}>(config?: PgGeometryConfig<TMode>): Equal<TMode, 'xy'> extends true ? PgGeometryObjectBuilderInitial<''> : PgGeometryBuilderInitial<''>;
export declare function geometry<TName extends string, TMode extends PgGeometryConfig['mode'] & {}>(name: TName, config?: PgGeometryConfig<TMode>): Equal<TMode, 'xy'> extends true ? PgGeometryObjectBuilderInitial<TName> : PgGeometryBuilderInitial<TName>;
@ -0,0 +1 @@
{"version":3,"sources":["../../src/tidb-serverless/migrator.ts"],"sourcesContent":["import type { MigrationConfig } from '~/migrator.ts';\nimport { readMigrationFiles } from '~/migrator.ts';\nimport type { TiDBServerlessDatabase } from './driver.ts';\n\nexport async function migrate<TSchema extends Record<string, unknown>>(\n\tdb: TiDBServerlessDatabase<TSchema>,\n\tconfig: MigrationConfig,\n) {\n\tconst migrations = readMigrationFiles(config);\n\tawait db.dialect.migrate(migrations, db.session, config);\n}\n"],"mappings":";;;;;;;;;;;;;;;;;;AAAA;AAAA;AAAA;AAAA;AAAA;AACA,sBAAmC;AAGnC,eAAsB,QACrB,IACA,QACC;AACD,QAAM,iBAAa,oCAAmB,MAAM;AAC5C,QAAM,GAAG,QAAQ,QAAQ,YAAY,GAAG,SAAS,MAAM;AACxD;","names":[]}
File diff suppressed because one or more lines are too long
@@ -0,0 +1,9 @@
import { entityKind } from "../entity.cjs";
import type { ColumnsSelection } from "../sql/sql.cjs";
import { View } from "../sql/sql.cjs";
export declare abstract class MySqlViewBase<TName extends string = string, TExisting extends boolean = boolean, TSelectedFields extends ColumnsSelection = ColumnsSelection> extends View<TName, TExisting, TSelectedFields> {
    static readonly [entityKind]: string;
    readonly _: View<TName, TExisting, TSelectedFields>['_'] & {
        readonly viewBrand: 'MySqlViewBase';
    };
}
@@ -0,0 +1,22 @@
import { entityKind } from "../entity.js";
import type { SQL } from "../sql/sql.js";
import type { SQLiteTable } from "./table.js";
export declare class CheckBuilder {
    name: string;
    value: SQL;
    static readonly [entityKind]: string;
    protected brand: 'SQLiteConstraintBuilder';
    constructor(name: string, value: SQL);
    build(table: SQLiteTable): Check;
}
export declare class Check {
    table: SQLiteTable;
    static readonly [entityKind]: string;
    _: {
        brand: 'SQLiteCheck';
    };
    readonly name: string;
    readonly value: SQL;
    constructor(table: SQLiteTable, builder: CheckBuilder);
}
export declare function check(name: string, value: SQL): CheckBuilder;
@ -0,0 +1,33 @@
type Key = string | number | symbol;
/**
 * SetArray acts like a `Set` (allowing only one occurrence of a string `key`), but provides the
 * index of the `key` in the backing array.
 *
 * This is designed to allow synchronizing a second array with the contents of the backing array,
 * like how in a sourcemap `sourcesContent[i]` is the source content associated with `source[i]`,
 * and there are never duplicates.
 */
export declare class SetArray<T extends Key = Key> {
    private _indexes;
    array: readonly T[];
    constructor();
}
/**
 * Gets the index associated with `key` in the backing array, if it is already present.
 */
export declare function get<T extends Key>(setarr: SetArray<T>, key: T): number | undefined;
/**
 * Puts `key` into the backing array, if it is not already present. Returns
 * the index of the `key` in the backing array.
 */
export declare function put<T extends Key>(setarr: SetArray<T>, key: T): number;
/**
 * Pops the last added item out of the SetArray.
 */
export declare function pop<T extends Key>(setarr: SetArray<T>): void;
/**
 * Removes the key, if it exists in the set.
 */
export declare function remove<T extends Key>(setarr: SetArray<T>, key: T): void;
export {};
//# sourceMappingURL=set-array.d.ts.map
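The declarations above describe a set that also exposes a stable index per key, so a parallel array (like `sourcesContent` alongside `sources`) can stay in sync. A minimal sketch of that contract, as my own illustrative implementation rather than the library's actual code:

```typescript
type Key = string | number | symbol;

// Illustrative implementation of the SetArray contract: a Map for O(1)
// key-to-index lookup, plus a backing array holding each key exactly once.
class SetArraySketch<T extends Key = Key> {
  indexes = new Map<T, number>();
  array: T[] = [];
}

function put<T extends Key>(s: SetArraySketch<T>, key: T): number {
  const existing = s.indexes.get(key);
  if (existing !== undefined) return existing; // only one occurrence per key
  const index = s.array.length;
  s.array.push(key);
  s.indexes.set(key, index);
  return index;
}

function get<T extends Key>(s: SetArraySketch<T>, key: T): number | undefined {
  return s.indexes.get(key);
}

const s = new SetArraySketch<string>();
put(s, "a.js"); // index 0
put(s, "b.js"); // index 1
```

Because `put` returns the existing index on a duplicate key, callers can index a second array by the returned value without ever creating duplicate entries.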
@ -0,0 +1 @@
{"version":3,"file":"bufferToggle.js","sourceRoot":"","sources":["../../../../src/internal/operators/bufferToggle.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,YAAY,EAAE,MAAM,iBAAiB,CAAC;AAE/C,OAAO,EAAE,OAAO,EAAE,MAAM,cAAc,CAAC;AACvC,OAAO,EAAE,SAAS,EAAE,MAAM,yBAAyB,CAAC;AACpD,OAAO,EAAE,wBAAwB,EAAE,MAAM,sBAAsB,CAAC;AAChE,OAAO,EAAE,IAAI,EAAE,MAAM,cAAc,CAAC;AACpC,OAAO,EAAE,SAAS,EAAE,MAAM,mBAAmB,CAAC;AA6C9C,MAAM,UAAU,YAAY,CAC1B,QAA4B,EAC5B,eAAmD;IAEnD,OAAO,OAAO,CAAC,CAAC,MAAM,EAAE,UAAU,EAAE,EAAE;QACpC,MAAM,OAAO,GAAU,EAAE,CAAC;QAG1B,SAAS,CAAC,QAAQ,CAAC,CAAC,SAAS,CAC3B,wBAAwB,CACtB,UAAU,EACV,CAAC,SAAS,EAAE,EAAE;YACZ,MAAM,MAAM,GAAQ,EAAE,CAAC;YACvB,OAAO,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;YAGrB,MAAM,mBAAmB,GAAG,IAAI,YAAY,EAAE,CAAC;YAE/C,MAAM,UAAU,GAAG,GAAG,EAAE;gBACtB,SAAS,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;gBAC3B,UAAU,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;gBACxB,mBAAmB,CAAC,WAAW,EAAE,CAAC;YACpC,CAAC,CAAC;YAGF,mBAAmB,CAAC,GAAG,CAAC,SAAS,CAAC,eAAe,CAAC,SAAS,CAAC,CAAC,CAAC,SAAS,CAAC,wBAAwB,CAAC,UAAU,EAAE,UAAU,EAAE,IAAI,CAAC,CAAC,CAAC,CAAC;QACnI,CAAC,EACD,IAAI,CACL,CACF,CAAC;QAEF,MAAM,CAAC,SAAS,CACd,wBAAwB,CACtB,UAAU,EACV,CAAC,KAAK,EAAE,EAAE;YAER,KAAK,MAAM,MAAM,IAAI,OAAO,EAAE;gBAC5B,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;aACpB;QACH,CAAC,EACD,GAAG,EAAE;YAEH,OAAO,OAAO,CAAC,MAAM,GAAG,CAAC,EAAE;gBACzB,UAAU,CAAC,IAAI,CAAC,OAAO,CAAC,KAAK,EAAG,CAAC,CAAC;aACnC;YACD,UAAU,CAAC,QAAQ,EAAE,CAAC;QACxB,CAAC,CACF,CACF,CAAC;IACJ,CAAC,CAAC,CAAC;AACL,CAAC"}
@ -0,0 +1,25 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __reExport = (target, mod, secondTarget) => (__copyProps(target, mod, "default"), secondTarget && __copyProps(secondTarget, mod, "default"));
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var sqlite_exports = {};
module.exports = __toCommonJS(sqlite_exports);
__reExport(sqlite_exports, require("./driver.cjs"), module.exports);
__reExport(sqlite_exports, require("./session.cjs"), module.exports);
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  ...require("./driver.cjs"),
  ...require("./session.cjs")
});
//# sourceMappingURL=index.cjs.map
@ -0,0 +1,4 @@
import index from './index.js';

const { transform, transformStyleAttribute, bundle, bundleAsync, browserslistToTargets, composeVisitors, Features } = index;
export { transform, transformStyleAttribute, bundle, bundleAsync, browserslistToTargets, composeVisitors, Features };
@ -0,0 +1,244 @@
"use strict";
var __create = Object.create;
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __getProtoOf = Object.getPrototypeOf;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
  // If the importer is in node compatibility mode or this is not an ESM
  // file that has been converted to a CommonJS file using a Babel-
  // compatible transform (i.e. "__esModule" has not been set), then set
  // "default" to the CommonJS "module.exports" for node compatibility.
  isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
  mod
));
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var traceViewer_exports = {};
__export(traceViewer_exports, {
  installRootRedirect: () => installRootRedirect,
  openTraceInBrowser: () => openTraceInBrowser,
  openTraceViewerApp: () => openTraceViewerApp,
  runTraceInBrowser: () => runTraceInBrowser,
  runTraceViewerApp: () => runTraceViewerApp,
  startTraceViewerServer: () => startTraceViewerServer
});
module.exports = __toCommonJS(traceViewer_exports);
var import_fs = __toESM(require("fs"));
var import_path = __toESM(require("path"));
var import_utils = require("../../../utils");
var import_utils2 = require("../../../utils");
var import_httpServer = require("../../utils/httpServer");
var import_utilsBundle = require("../../../utilsBundle");
var import_launchApp = require("../../launchApp");
var import_launchApp2 = require("../../launchApp");
var import_playwright = require("../../playwright");
var import_progress = require("../../progress");
const tracesDirMarker = "traces.dir";
function validateTraceUrlOrPath(traceFileOrUrl) {
  if (!traceFileOrUrl)
    return traceFileOrUrl;
  if (traceFileOrUrl.startsWith("http://") || traceFileOrUrl.startsWith("https://"))
    return traceFileOrUrl;
  let traceFile = traceFileOrUrl;
  if (traceFile.endsWith(".json"))
    return toFilePathUrl(traceFile);
  try {
    const stat = import_fs.default.statSync(traceFile);
    if (stat.isDirectory())
      traceFile = import_path.default.join(traceFile, tracesDirMarker);
    return toFilePathUrl(traceFile);
  } catch {
    throw new Error(`Trace file ${traceFileOrUrl} does not exist!`);
  }
}
async function startTraceViewerServer(options) {
  const server = new import_httpServer.HttpServer();
  server.routePrefix("/trace", (request, response) => {
    const url = new URL("http://localhost" + request.url);
    const relativePath = url.pathname.slice("/trace".length);
    if (relativePath.startsWith("/file")) {
      try {
        const filePath = url.searchParams.get("path");
        if (import_fs.default.existsSync(filePath))
          return server.serveFile(request, response, url.searchParams.get("path"));
        if (filePath.endsWith(".json")) {
          const fullPrefix = filePath.substring(0, filePath.length - ".json".length);
          return sendTraceDescriptor(response, import_path.default.dirname(fullPrefix), import_path.default.basename(fullPrefix));
        }
        if (filePath.endsWith(tracesDirMarker))
          return sendTraceDescriptor(response, import_path.default.dirname(filePath));
      } catch {
      }
      response.statusCode = 404;
      response.end();
      return true;
    }
    const absolutePath = import_path.default.join(__dirname, "..", "..", "..", "vite", "traceViewer", ...relativePath.split("/"));
    return server.serveFile(request, response, absolutePath);
  });
  const transport = options?.transport || (options?.isServer ? new StdinServer() : void 0);
  if (transport)
    server.createWebSocket(() => transport);
  const { host, port } = options || {};
  await server.start({ preferredPort: port, host });
  return server;
}
async function installRootRedirect(server, traceUrl, options) {
  const params = new URLSearchParams();
  if (import_path.default.sep !== import_path.default.posix.sep)
    params.set("pathSeparator", import_path.default.sep);
  if (traceUrl)
    params.append("trace", traceUrl);
  if (server.wsGuid())
    params.append("ws", server.wsGuid());
  if (options?.isServer)
    params.append("isServer", "");
  if ((0, import_utils2.isUnderTest)())
    params.append("isUnderTest", "true");
  for (const arg of options.args || [])
    params.append("arg", arg);
  if (options.grep)
    params.append("grep", options.grep);
  if (options.grepInvert)
    params.append("grepInvert", options.grepInvert);
  for (const project of options.project || [])
    params.append("project", project);
  for (const reporter of options.reporter || [])
    params.append("reporter", reporter);
  const urlPath = `./trace/${options.webApp || "index.html"}?${params.toString()}`;
  server.routePath("/", (_, response) => {
    response.statusCode = 302;
    response.setHeader("Location", urlPath);
    response.end();
    return true;
  });
}
async function runTraceViewerApp(traceUrl, browserName, options) {
  traceUrl = validateTraceUrlOrPath(traceUrl);
  const server = await startTraceViewerServer(options);
  await installRootRedirect(server, traceUrl, options);
  const page = await openTraceViewerApp(server.urlPrefix("precise"), browserName, options);
  page.on("close", () => (0, import_utils.gracefullyProcessExitDoNotHang)(0));
  return page;
}
async function runTraceInBrowser(traceUrl, options) {
  traceUrl = validateTraceUrlOrPath(traceUrl);
  const server = await startTraceViewerServer(options);
  await installRootRedirect(server, traceUrl, options);
  await openTraceInBrowser(server.urlPrefix("human-readable"));
}
async function openTraceViewerApp(url, browserName, options) {
  const traceViewerPlaywright = (0, import_playwright.createPlaywright)({ sdkLanguage: "javascript", isInternalPlaywright: true });
  const traceViewerBrowser = (0, import_utils2.isUnderTest)() ? "chromium" : browserName;
  const { context, page } = await (0, import_launchApp2.launchApp)(traceViewerPlaywright[traceViewerBrowser], {
    sdkLanguage: traceViewerPlaywright.options.sdkLanguage,
    windowSize: { width: 1280, height: 800 },
    persistentContextOptions: {
      ...options?.persistentContextOptions,
      cdpPort: (0, import_utils2.isUnderTest)() ? 0 : void 0,
      headless: !!options?.headless,
      colorScheme: (0, import_utils2.isUnderTest)() ? "light" : void 0
    }
  });
  const controller = new import_progress.ProgressController();
  await controller.run(async (progress) => {
    await context._browser._defaultContext._loadDefaultContextAsIs(progress);
    if (process.env.PWTEST_PRINT_WS_ENDPOINT) {
      process.stderr.write("DevTools listening on: " + context._browser.options.wsEndpoint + "\n");
    }
    if (!(0, import_utils2.isUnderTest)())
      await (0, import_launchApp.syncLocalStorageWithSettings)(page, "traceviewer");
    if ((0, import_utils2.isUnderTest)())
      page.on("close", () => context.close({ reason: "Trace viewer closed" }).catch(() => {
      }));
    await page.mainFrame().goto(progress, url);
  });
  return page;
}
async function openTraceInBrowser(url) {
  console.log("\nListening on " + url);
  if (!(0, import_utils2.isUnderTest)())
    await (0, import_utilsBundle.open)(url.replace("0.0.0.0", "localhost")).catch(() => {
    });
}
class StdinServer {
  constructor() {
    process.stdin.on("data", (data) => {
      const url = validateTraceUrlOrPath(data.toString().trim());
      if (!url || url === this._traceUrl)
        return;
      if (url.endsWith(".json"))
        this._pollLoadTrace(url);
      else
        this._loadTrace(url);
    });
    process.stdin.on("close", () => (0, import_utils.gracefullyProcessExitDoNotHang)(0));
  }
  onconnect() {
  }
  async dispatch(method, params) {
    if (method === "initialize") {
      if (this._traceUrl)
        this._loadTrace(this._traceUrl);
    }
  }
  onclose() {
  }
  _loadTrace(traceUrl) {
    this._traceUrl = traceUrl;
    clearTimeout(this._pollTimer);
    this.sendEvent?.("loadTraceRequested", { traceUrl });
  }
  _pollLoadTrace(url) {
    this._loadTrace(url);
    this._pollTimer = setTimeout(() => {
      this._pollLoadTrace(url);
    }, 500);
  }
}
function sendTraceDescriptor(response, traceDir, tracePrefix) {
  response.statusCode = 200;
  response.setHeader("Content-Type", "application/json");
  response.end(JSON.stringify(traceDescriptor(traceDir, tracePrefix)));
  return true;
}
function traceDescriptor(traceDir, tracePrefix) {
  const result = {
    entries: []
  };
  for (const name of import_fs.default.readdirSync(traceDir)) {
    if (!tracePrefix || name.startsWith(tracePrefix))
      result.entries.push({ name, path: toFilePathUrl(import_path.default.join(traceDir, name)) });
  }
  const resourcesDir = import_path.default.join(traceDir, "resources");
  if (import_fs.default.existsSync(resourcesDir)) {
    for (const name of import_fs.default.readdirSync(resourcesDir))
      result.entries.push({ name: "resources/" + name, path: toFilePathUrl(import_path.default.join(resourcesDir, name)) });
  }
  return result;
}
function toFilePathUrl(filePath) {
  return `file?path=${encodeURIComponent(filePath)}`;
}
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  installRootRedirect,
  openTraceInBrowser,
  openTraceViewerApp,
  runTraceInBrowser,
  runTraceViewerApp,
  startTraceViewerServer
});
@ -0,0 +1 @@
{"version":3,"file":"auditTime.js","sourceRoot":"","sources":["../../../../src/internal/operators/auditTime.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,cAAc,EAAE,MAAM,oBAAoB,CAAC;AACpD,OAAO,EAAE,KAAK,EAAE,MAAM,SAAS,CAAC;AAChC,OAAO,EAAE,KAAK,EAAE,MAAM,qBAAqB,CAAC;AAkD5C,MAAM,UAAU,SAAS,CAAI,QAAgB,EAAE,SAAyC;IAAzC,0BAAA,EAAA,0BAAyC;IACtF,OAAO,KAAK,CAAC,cAAM,OAAA,KAAK,CAAC,QAAQ,EAAE,SAAS,CAAC,EAA1B,CAA0B,CAAC,CAAC;AACjD,CAAC"}
@ -0,0 +1,28 @@
'use strict';

import utils from '../utils.js';
import defaults from '../defaults/index.js';
import AxiosHeaders from '../core/AxiosHeaders.js';

/**
 * Transform the data for a request or a response
 *
 * @param {Array|Function} fns A single function or Array of functions
 * @param {?Object} response The response object
 *
 * @returns {*} The resulting transformed data
 */
export default function transformData(fns, response) {
  const config = this || defaults;
  const context = response || config;
  const headers = AxiosHeaders.from(context.headers);
  let data = context.data;

  utils.forEach(fns, function transform(fn) {
    data = fn.call(config, data, headers.normalize(), response ? response.status : undefined);
  });

  headers.normalize();

  return data;
}
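`transformData` above threads `data` through a list of transform functions, each one receiving the output of the previous plus the headers, so transforms can both reshape the payload and adjust headers in place. A stripped-down sketch of that pipeline, with plain functions and a simplified headers object instead of axios internals:

```typescript
// Each transform sees the previous transform's output plus a shared,
// mutable headers record (simplified stand-in for AxiosHeaders).
type Transform = (data: unknown, headers: Record<string, string>) => unknown;

function runTransforms(
  fns: Transform[],
  data: unknown,
  headers: Record<string, string>,
): unknown {
  for (const fn of fns) {
    data = fn(data, headers); // fold data through the chain
  }
  return data;
}

const headers: Record<string, string> = {};
const out = runTransforms(
  [
    (d) => JSON.stringify(d), // serialize the payload
    (d, h) => {
      h["content-type"] = "application/json"; // transforms may touch headers too
      return d;
    },
  ],
  { ok: true },
  headers,
);
```

The same fold works for request transforms (object to wire format) and response transforms (wire format back to an object); only the function list differs.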
@ -0,0 +1,22 @@
import type { ColumnBuilderBaseConfig } from "../../column-builder.cjs";
import type { ColumnBaseConfig } from "../../column.cjs";
import { entityKind } from "../../entity.cjs";
import { PgColumn, PgColumnBuilder } from "./common.cjs";
export type PgMacaddr8BuilderInitial<TName extends string> = PgMacaddr8Builder<{
    name: TName;
    dataType: 'string';
    columnType: 'PgMacaddr8';
    data: string;
    driverParam: string;
    enumValues: undefined;
}>;
export declare class PgMacaddr8Builder<T extends ColumnBuilderBaseConfig<'string', 'PgMacaddr8'>> extends PgColumnBuilder<T> {
    static readonly [entityKind]: string;
    constructor(name: T['name']);
}
export declare class PgMacaddr8<T extends ColumnBaseConfig<'string', 'PgMacaddr8'>> extends PgColumn<T> {
    static readonly [entityKind]: string;
    getSQLType(): string;
}
export declare function macaddr8(): PgMacaddr8BuilderInitial<''>;
export declare function macaddr8<TName extends string>(name: TName): PgMacaddr8BuilderInitial<TName>;
@ -0,0 +1,155 @@
import type { ColumnBuilderBaseConfig } from "../../column-builder.cjs";
import type { ColumnBaseConfig } from "../../column.cjs";
import { entityKind } from "../../entity.cjs";
import type { AnyMySqlTable } from "../table.cjs";
import type { SQL } from "../../sql/sql.cjs";
import { type Equal } from "../../utils.cjs";
import { MySqlColumn, MySqlColumnBuilder } from "./common.cjs";
export type ConvertCustomConfig<TName extends string, T extends Partial<CustomTypeValues>> = {
    name: TName;
    dataType: 'custom';
    columnType: 'MySqlCustomColumn';
    data: T['data'];
    driverParam: T['driverData'];
    enumValues: undefined;
} & (T['notNull'] extends true ? {
    notNull: true;
} : {}) & (T['default'] extends true ? {
    hasDefault: true;
} : {});
export interface MySqlCustomColumnInnerConfig {
    customTypeValues: CustomTypeValues;
}
export declare class MySqlCustomColumnBuilder<T extends ColumnBuilderBaseConfig<'custom', 'MySqlCustomColumn'>> extends MySqlColumnBuilder<T, {
    fieldConfig: CustomTypeValues['config'];
    customTypeParams: CustomTypeParams<any>;
}, {
    mysqlColumnBuilderBrand: 'MySqlCustomColumnBuilderBrand';
}> {
    static readonly [entityKind]: string;
    constructor(name: T['name'], fieldConfig: CustomTypeValues['config'], customTypeParams: CustomTypeParams<any>);
}
export declare class MySqlCustomColumn<T extends ColumnBaseConfig<'custom', 'MySqlCustomColumn'>> extends MySqlColumn<T> {
    static readonly [entityKind]: string;
    private sqlName;
    private mapTo?;
    private mapFrom?;
    constructor(table: AnyMySqlTable<{
        name: T['tableName'];
    }>, config: MySqlCustomColumnBuilder<T>['config']);
    getSQLType(): string;
    mapFromDriverValue(value: T['driverParam']): T['data'];
    mapToDriverValue(value: T['data']): T['driverParam'];
}
export type CustomTypeValues = {
    /**
     * Required type for custom column, that will infer proper type model
     *
     * Examples:
     *
     * If you want your column to be `string` type after selecting/or on inserting - use `data: string`. Like `text`, `varchar`
     *
     * If you want your column to be `number` type after selecting/or on inserting - use `data: number`. Like `integer`
     */
    data: unknown;
    /**
     * Type helper, that represents what type database driver is accepting for specific database data type
     */
    driverData?: unknown;
    /**
     * What config type should be used for {@link CustomTypeParams} `dataType` generation
     */
    config?: Record<string, any>;
    /**
     * Whether the config argument should be required or not
     * @default false
     */
    configRequired?: boolean;
    /**
     * If your custom data type should be notNull by default you can use `notNull: true`
     *
     * @example
     * const customSerial = customType<{ data: number, notNull: true, default: true }>({
     *   dataType() {
     *     return 'serial';
     *   },
     * });
     */
    notNull?: boolean;
    /**
     * If your custom data type has default you can use `default: true`
     *
     * @example
     * const customSerial = customType<{ data: number, notNull: true, default: true }>({
     *   dataType() {
     *     return 'serial';
     *   },
     * });
     */
    default?: boolean;
};
export interface CustomTypeParams<T extends CustomTypeValues> {
    /**
     * Database data type string representation, that is used for migrations
     * @example
     * ```
     * `jsonb`, `text`
     * ```
     *
     * If database data type needs additional params you can use them from `config` param
     * @example
     * ```
     * `varchar(256)`, `numeric(2,3)`
     * ```
     *
     * To make `config` be of specific type please use config generic in {@link CustomTypeValues}
     *
     * @example
     * Usage example
     * ```
     * dataType() {
     *   return 'boolean';
     * },
     * ```
     * Or
     * ```
     * dataType(config) {
     *   return typeof config.length !== 'undefined' ? `varchar(${config.length})` : `varchar`;
     * }
     * ```
     */
    dataType: (config: T['config'] | (Equal<T['configRequired'], true> extends true ? never : undefined)) => string;
    /**
     * Optional mapping function, between user input and driver
     * @example
     * For example, when using jsonb we need to map JS/TS object to string before writing to database
     * ```
     * toDriver(value: TData): string {
     *   return JSON.stringify(value);
     * }
     * ```
     */
    toDriver?: (value: T['data']) => T['driverData'] | SQL;
    /**
     * Optional mapping function, that is responsible for data mapping from database to JS/TS code
     * @example
     * For example, when using timestamp we need to map string Date representation to JS Date
     * ```
     * fromDriver(value: string): Date {
     *   return new Date(value);
     * },
     * ```
     */
    fromDriver?: (value: T['driverData']) => T['data'];
}
/**
 * Custom mysql database data type generator
 */
export declare function customType<T extends CustomTypeValues = CustomTypeValues>(customTypeParams: CustomTypeParams<T>): Equal<T['configRequired'], true> extends true ? {
    <TConfig extends Record<string, any> & T['config']>(fieldConfig: TConfig): MySqlCustomColumnBuilder<ConvertCustomConfig<'', T>>;
    <TName extends string>(dbName: TName, fieldConfig: T['config']): MySqlCustomColumnBuilder<ConvertCustomConfig<TName, T>>;
} : {
    (): MySqlCustomColumnBuilder<ConvertCustomConfig<'', T>>;
    <TConfig extends Record<string, any> & T['config']>(fieldConfig?: TConfig): MySqlCustomColumnBuilder<ConvertCustomConfig<'', T>>;
    <TName extends string>(dbName: TName, fieldConfig?: T['config']): MySqlCustomColumnBuilder<ConvertCustomConfig<TName, T>>;
};
@ -0,0 +1 @@
{"version":3,"sources":["../../../src/mysql-core/columns/date.common.ts"],"sourcesContent":["import type {\n\tColumnBuilderBaseConfig,\n\tColumnBuilderExtraConfig,\n\tColumnDataType,\n\tHasDefault,\n} from '~/column-builder.ts';\nimport type { ColumnBaseConfig } from '~/column.ts';\nimport { entityKind } from '~/entity.ts';\nimport { sql } from '~/sql/sql.ts';\nimport { MySqlColumn, MySqlColumnBuilder } from './common.ts';\n\nexport interface MySqlDateColumnBaseConfig {\n\thasOnUpdateNow: boolean;\n}\n\nexport abstract class MySqlDateColumnBaseBuilder<\n\tT extends ColumnBuilderBaseConfig<ColumnDataType, string>,\n\tTRuntimeConfig extends object = object,\n\tTExtraConfig extends ColumnBuilderExtraConfig = ColumnBuilderExtraConfig,\n> extends MySqlColumnBuilder<T, TRuntimeConfig & MySqlDateColumnBaseConfig, TExtraConfig> {\n\tstatic override readonly [entityKind]: string = 'MySqlDateColumnBuilder';\n\n\tdefaultNow() {\n\t\treturn this.default(sql`(now())`);\n\t}\n\n\t// \"on update now\" also adds an implicit default value to the column - https://dev.mysql.com/doc/refman/8.0/en/timestamp-initialization.html\n\tonUpdateNow(): HasDefault<this> {\n\t\tthis.config.hasOnUpdateNow = true;\n\t\tthis.config.hasDefault = true;\n\t\treturn this as HasDefault<this>;\n\t}\n}\n\nexport abstract class MySqlDateBaseColumn<\n\tT extends ColumnBaseConfig<ColumnDataType, string>,\n\tTRuntimeConfig extends object = object,\n> extends MySqlColumn<T, MySqlDateColumnBaseConfig & TRuntimeConfig> {\n\tstatic override readonly [entityKind]: string = 'MySqlDateColumn';\n\n\treadonly hasOnUpdateNow: boolean = 
this.config.hasOnUpdateNow;\n}\n"],"mappings":"AAOA,SAAS,kBAAkB;AAC3B,SAAS,WAAW;AACpB,SAAS,aAAa,0BAA0B;AAMzC,MAAe,mCAIZ,mBAAgF;AAAA,EACzF,QAA0B,UAAU,IAAY;AAAA,EAEhD,aAAa;AACZ,WAAO,KAAK,QAAQ,YAAY;AAAA,EACjC;AAAA;AAAA,EAGA,cAAgC;AAC/B,SAAK,OAAO,iBAAiB;AAC7B,SAAK,OAAO,aAAa;AACzB,WAAO;AAAA,EACR;AACD;AAEO,MAAe,4BAGZ,YAA2D;AAAA,EACpE,QAA0B,UAAU,IAAY;AAAA,EAEvC,iBAA0B,KAAK,OAAO;AAChD;","names":[]}
@ -0,0 +1 @@
{"version":3,"sources":["../../../src/pg-core/columns/cidr.ts"],"sourcesContent":["import type { ColumnBuilderBaseConfig, ColumnBuilderRuntimeConfig, MakeColumnConfig } from '~/column-builder.ts';\nimport type { ColumnBaseConfig } from '~/column.ts';\nimport { entityKind } from '~/entity.ts';\nimport type { AnyPgTable } from '../table.ts';\nimport { PgColumn, PgColumnBuilder } from './common.ts';\n\nexport type PgCidrBuilderInitial<TName extends string> = PgCidrBuilder<{\n\tname: TName;\n\tdataType: 'string';\n\tcolumnType: 'PgCidr';\n\tdata: string;\n\tdriverParam: string;\n\tenumValues: undefined;\n}>;\n\nexport class PgCidrBuilder<T extends ColumnBuilderBaseConfig<'string', 'PgCidr'>> extends PgColumnBuilder<T> {\n\tstatic override readonly [entityKind]: string = 'PgCidrBuilder';\n\n\tconstructor(name: T['name']) {\n\t\tsuper(name, 'string', 'PgCidr');\n\t}\n\n\t/** @internal */\n\toverride build<TTableName extends string>(\n\t\ttable: AnyPgTable<{ name: TTableName }>,\n\t): PgCidr<MakeColumnConfig<T, TTableName>> {\n\t\treturn new PgCidr<MakeColumnConfig<T, TTableName>>(table, this.config as ColumnBuilderRuntimeConfig<any, any>);\n\t}\n}\n\nexport class PgCidr<T extends ColumnBaseConfig<'string', 'PgCidr'>> extends PgColumn<T> {\n\tstatic override readonly [entityKind]: string = 'PgCidr';\n\n\tgetSQLType(): string {\n\t\treturn 'cidr';\n\t}\n}\n\nexport function cidr(): PgCidrBuilderInitial<''>;\nexport function cidr<TName extends string>(name: TName): PgCidrBuilderInitial<TName>;\nexport function cidr(name?: string) {\n\treturn new PgCidrBuilder(name ?? 
'');\n}\n"],"mappings":"AAEA,SAAS,kBAAkB;AAE3B,SAAS,UAAU,uBAAuB;AAWnC,MAAM,sBAA6E,gBAAmB;AAAA,EAC5G,QAA0B,UAAU,IAAY;AAAA,EAEhD,YAAY,MAAiB;AAC5B,UAAM,MAAM,UAAU,QAAQ;AAAA,EAC/B;AAAA;AAAA,EAGS,MACR,OAC0C;AAC1C,WAAO,IAAI,OAAwC,OAAO,KAAK,MAA8C;AAAA,EAC9G;AACD;AAEO,MAAM,eAA+D,SAAY;AAAA,EACvF,QAA0B,UAAU,IAAY;AAAA,EAEhD,aAAqB;AACpB,WAAO;AAAA,EACR;AACD;AAIO,SAAS,KAAK,MAAe;AACnC,SAAO,IAAI,cAAc,QAAQ,EAAE;AACpC;","names":[]}
File diff suppressed because one or more lines are too long
@ -0,0 +1,890 @@
|
|||||||
|
"use strict";
|
||||||
|
var __create = Object.create;
|
||||||
|
var __defProp = Object.defineProperty;
|
||||||
|
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
|
||||||
|
var __getOwnPropNames = Object.getOwnPropertyNames;
|
||||||
|
var __getProtoOf = Object.getPrototypeOf;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
  // If the importer is in node compatibility mode or this is not an ESM
  // file that has been converted to a CommonJS file using a Babel-
  // compatible transform (i.e. "__esModule" has not been set), then set
  // "default" to the CommonJS "module.exports" for node compatibility.
  isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
  mod
));
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var page_exports = {};
__export(page_exports, {
  InitScript: () => InitScript,
  Page: () => Page,
  PageBinding: () => PageBinding,
  Worker: () => Worker,
  WorkerEvent: () => WorkerEvent,
  ariaSnapshotForFrame: () => ariaSnapshotForFrame
});
module.exports = __toCommonJS(page_exports);
var import_browserContext = require("./browserContext");
var import_disposable = require("./disposable");
var import_console = require("./console");
var import_errors = require("./errors");
var import_fileChooser = require("./fileChooser");
var frames = __toESM(require("./frames"));
var import_helper = require("./helper");
var input = __toESM(require("./input"));
var import_instrumentation = require("./instrumentation");
var js = __toESM(require("./javascript"));
var import_screenshotter = require("./screenshotter");
var import_utils = require("../utils");
var import_utils2 = require("../utils");
var import_comparators = require("./utils/comparators");
var import_debugLogger = require("./utils/debugLogger");
var import_selectorParser = require("../utils/isomorphic/selectorParser");
var import_manualPromise = require("../utils/isomorphic/manualPromise");
var import_utilityScriptSerializers = require("../utils/isomorphic/utilityScriptSerializers");
var import_callLog = require("./callLog");
var rawBindingsControllerSource = __toESM(require("../generated/bindingsControllerSource"));
var import_overlay = require("./overlay");
var import_dom = require("./dom");
var import_screencast = require("./screencast");
const PageEvent = {
  Close: "close",
  Crash: "crash",
  Download: "download",
  EmulatedSizeChanged: "emulatedsizechanged",
  FileChooser: "filechooser",
  FrameAttached: "frameattached",
  FrameDetached: "framedetached",
  InternalFrameNavigatedToNewDocument: "internalframenavigatedtonewdocument",
  LocatorHandlerTriggered: "locatorhandlertriggered",
  WebSocket: "websocket",
  Worker: "worker"
};
const navigationMarkSymbol = Symbol("navigationMark");
class Page extends import_instrumentation.SdkObject {
  constructor(delegate, browserContext) {
    super(browserContext, "page");
    this._closedState = "open";
    this.closedPromise = new import_manualPromise.ManualPromise();
    this._initializedPromise = new import_manualPromise.ManualPromise();
    this._consoleMessages = [];
    this._pageErrors = [];
    this._crashed = false;
    this.openScope = new import_utils.LongStandingScope();
    this._emulatedMedia = {};
    this._fileChooserInterceptedBy = /* @__PURE__ */ new Set();
    this._pageBindings = /* @__PURE__ */ new Map();
    this.initScripts = [];
    this._workers = /* @__PURE__ */ new Map();
    this.requestInterceptors = [];
    this._locatorHandlers = /* @__PURE__ */ new Map();
    this._lastLocatorHandlerUid = 0;
    this._locatorHandlerRunningCounter = 0;
    this._networkRequests = [];
    this.attribution.page = this;
    this.delegate = delegate;
    this.browserContext = browserContext;
    this.keyboard = new input.Keyboard(delegate.rawKeyboard, this);
    this.mouse = new input.Mouse(delegate.rawMouse, this);
    this.touchscreen = new input.Touchscreen(delegate.rawTouchscreen, this);
    this.screenshotter = new import_screenshotter.Screenshotter(this);
    this.frameManager = new frames.FrameManager(this);
    this.overlay = new import_overlay.Overlay(this);
    this.screencast = new import_screencast.Screencast(this);
    if (delegate.pdf)
      this.pdf = delegate.pdf.bind(delegate);
    this.coverage = delegate.coverage ? delegate.coverage() : null;
    this.isStorageStatePage = browserContext.isCreatingStorageStatePage();
  }
  static {
    this.Events = PageEvent;
  }
  async reportAsNew(opener, error) {
    if (opener) {
      const openerPageOrError = await opener.waitForInitializedOrError();
      if (openerPageOrError instanceof Page && !openerPageOrError.isClosed())
        this._opener = openerPageOrError;
    }
    this._markInitialized(error);
  }
  _markInitialized(error = void 0) {
    if (error) {
      if (this.browserContext.isClosingOrClosed())
        return;
      this.frameManager.createDummyMainFrameIfNeeded();
    }
    this._initialized = error || this;
    this.emitOnContext(import_browserContext.BrowserContext.Events.Page, this);
    for (const pageError of this._pageErrors)
      this.emitOnContext(import_browserContext.BrowserContext.Events.PageError, pageError, this);
    for (const message of this._consoleMessages)
      this.emitOnContext(import_browserContext.BrowserContext.Events.Console, message);
    if (this.isClosed())
      this.emit(Page.Events.Close);
    else
      this.instrumentation.onPageOpen(this);
    this._initializedPromise.resolve(this._initialized);
  }
  initializedOrUndefined() {
    return this._initialized ? this : void 0;
  }
  waitForInitializedOrError() {
    return this._initializedPromise;
  }
  emitOnContext(event, ...args) {
    if (this.isStorageStatePage)
      return;
    this.browserContext.emit(event, ...args);
  }
  async resetForReuse(progress) {
    await this.mainFrame().gotoImpl(progress, "about:blank", {});
    this._emulatedSize = void 0;
    this._emulatedMedia = {};
    this._extraHTTPHeaders = void 0;
    await Promise.all([
      this.delegate.updateEmulatedViewportSize(),
      this.delegate.updateEmulateMedia(),
      this.delegate.updateExtraHTTPHeaders()
    ]);
    await this.delegate.resetForReuse(progress);
}
  _didClose() {
    this.frameManager.dispose();
    this.screencast.dispose();
    this.overlay.dispose();
    (0, import_utils.assert)(this._closedState !== "closed", "Page closed twice");
    this._closedState = "closed";
    this.emit(Page.Events.Close);
    this.browserContext.emit(import_browserContext.BrowserContext.Events.PageClosed, this);
    this.closedPromise.resolve();
    this.instrumentation.onPageClose(this);
    this.openScope.close(new import_errors.TargetClosedError(this.closeReason()));
  }
  _didCrash() {
    this.frameManager.dispose();
    this.screencast.dispose();
    this.overlay.dispose();
    this.emit(Page.Events.Crash);
    this._crashed = true;
    this.instrumentation.onPageClose(this);
    this.openScope.close(new Error("Page crashed"));
  }
  async _onFileChooserOpened(handle) {
    let multiple;
    try {
      multiple = await handle.evaluate((element) => !!element.multiple);
    } catch (e) {
      return;
    }
    if (!this.listenerCount(Page.Events.FileChooser)) {
      handle.dispose();
      return;
    }
    const fileChooser = new import_fileChooser.FileChooser(this, handle, multiple);
    this.emit(Page.Events.FileChooser, fileChooser);
  }
  opener() {
    return this._opener;
  }
  mainFrame() {
    return this.frameManager.mainFrame();
  }
  frames() {
    return this.frameManager.frames();
  }
  async exposeBinding(progress, name, needsHandle, playwrightBinding) {
    if (this._pageBindings.has(name))
      throw new Error(`Function "${name}" has been already registered`);
    if (this.browserContext._pageBindings.has(name))
      throw new Error(`Function "${name}" has been already registered in the browser context`);
    await progress.race(this.browserContext.exposePlaywrightBindingIfNeeded());
    const binding = new PageBinding(this, name, playwrightBinding, needsHandle);
    this._pageBindings.set(name, binding);
    try {
      await progress.race(this.delegate.addInitScript(binding.initScript));
      await progress.race(this.safeNonStallingEvaluateInAllFrames(binding.initScript.source, "main"));
      return binding;
    } catch (error) {
      this._pageBindings.delete(name);
      throw error;
    }
  }
  async removeExposedBinding(binding) {
    if (this._pageBindings.get(binding.name) !== binding)
      return;
    this._pageBindings.delete(binding.name);
    await this.delegate.removeInitScripts([binding.initScript]);
    const cleanup = `{ ${binding.cleanupScript} };`;
    await this.safeNonStallingEvaluateInAllFrames(cleanup, "main");
  }
  async setExtraHTTPHeaders(progress, headers) {
    const oldHeaders = this._extraHTTPHeaders;
    try {
      this._extraHTTPHeaders = headers;
      await progress.race(this.delegate.updateExtraHTTPHeaders());
    } catch (error) {
      this._extraHTTPHeaders = oldHeaders;
      this.delegate.updateExtraHTTPHeaders().catch(() => {
      });
      throw error;
    }
  }
  extraHTTPHeaders() {
    return this._extraHTTPHeaders;
  }
  addNetworkRequest(request) {
    this._networkRequests.push(request);
    ensureArrayLimit(this._networkRequests, 100);
  }
  networkRequests() {
    return this._networkRequests;
  }
async onBindingCalled(payload, context) {
    if (this._closedState === "closed")
      return;
    await PageBinding.dispatch(this, payload, context);
  }
  addConsoleMessage(worker, type, args, location, text, timestamp) {
    const message = new import_console.ConsoleMessage(this, worker, type, text, args, location, timestamp);
    const intercepted = this.frameManager.interceptConsoleMessage(message);
    if (intercepted) {
      args.forEach((arg) => arg.dispose());
      return;
    }
    this._consoleMessages.push(message);
    ensureArrayLimit(this._consoleMessages, 200);
    if (this._initialized)
      this.emitOnContext(import_browserContext.BrowserContext.Events.Console, message);
  }
  clearConsoleMessages() {
    this._consoleMessages.length = 0;
  }
  consoleMessages(filter) {
    if (filter === "all")
      return this._consoleMessages;
    const marked = this._consoleMessages.findLastIndex((m) => m[navigationMarkSymbol]);
    return marked === -1 ? this._consoleMessages : this._consoleMessages.slice(marked + 1);
  }
  addPageError(pageError) {
    this._pageErrors.push(pageError);
    ensureArrayLimit(this._pageErrors, 200);
    if (this._initialized)
      this.emitOnContext(import_browserContext.BrowserContext.Events.PageError, pageError, this);
  }
  clearPageErrors() {
    this._pageErrors.length = 0;
  }
  pageErrors(filter) {
    if (filter === "all")
      return this._pageErrors;
    const marked = this._pageErrors.findLastIndex((e) => e[navigationMarkSymbol]);
    return marked === -1 ? this._pageErrors : this._pageErrors.slice(marked + 1);
  }
  async reload(progress, options) {
    return this.mainFrame().raceNavigationAction(progress, async () => {
      const [response] = await Promise.all([
        // Reload must be a new document, and should not be confused with a stray pushState.
        this.mainFrame()._waitForNavigation(progress, true, options),
        progress.race(this.delegate.reload())
      ]);
      return response;
    });
  }
  async goBack(progress, options) {
    return this.mainFrame().raceNavigationAction(progress, async () => {
      let error;
      const waitPromise = this.mainFrame()._waitForNavigation(progress, false, options).catch((e) => {
        error = e;
        return null;
      });
      const result = await progress.race(this.delegate.goBack());
      if (!result) {
        waitPromise.catch(() => {
        });
        return null;
      }
      const response = await waitPromise;
      if (error)
        throw error;
      return response;
    });
  }
  async goForward(progress, options) {
    return this.mainFrame().raceNavigationAction(progress, async () => {
      let error;
      const waitPromise = this.mainFrame()._waitForNavigation(progress, false, options).catch((e) => {
        error = e;
        return null;
      });
      const result = await progress.race(this.delegate.goForward());
      if (!result) {
        waitPromise.catch(() => {
        });
        return null;
      }
      const response = await waitPromise;
      if (error)
        throw error;
      return response;
    });
  }
  requestGC() {
    return this.delegate.requestGC();
}
  registerLocatorHandler(selector, noWaitAfter) {
    const uid = ++this._lastLocatorHandlerUid;
    this._locatorHandlers.set(uid, { selector, noWaitAfter });
    return uid;
  }
  resolveLocatorHandler(uid, remove) {
    const handler = this._locatorHandlers.get(uid);
    if (remove)
      this._locatorHandlers.delete(uid);
    if (handler) {
      handler.resolved?.resolve();
      handler.resolved = void 0;
    }
  }
  unregisterLocatorHandler(uid) {
    this._locatorHandlers.delete(uid);
  }
  async performActionPreChecks(progress) {
    await this._performWaitForNavigationCheck(progress);
    await this._performLocatorHandlersCheckpoint(progress);
    await this._performWaitForNavigationCheck(progress);
  }
  async _performWaitForNavigationCheck(progress) {
    if (process.env.PLAYWRIGHT_SKIP_NAVIGATION_CHECK)
      return;
    const mainFrame = this.frameManager.mainFrame();
    if (!mainFrame || !mainFrame.pendingDocument())
      return;
    const url = mainFrame.pendingDocument()?.request?.url();
    const toUrl = url ? `" ${(0, import_utils.trimStringWithEllipsis)(url, 200)}"` : "";
    progress.log(` waiting for${toUrl} navigation to finish...`);
    await import_helper.helper.waitForEvent(progress, mainFrame, frames.Frame.Events.InternalNavigation, (e) => {
      if (!e.isPublic)
        return false;
      if (!e.error)
        progress.log(` navigated to "${(0, import_utils.trimStringWithEllipsis)(mainFrame.url(), 200)}"`);
      return true;
    }).promise;
  }
  async _performLocatorHandlersCheckpoint(progress) {
    if (this._locatorHandlerRunningCounter)
      return;
    for (const [uid, handler] of this._locatorHandlers) {
      if (!handler.resolved) {
        if (await this.mainFrame().isVisibleInternal(progress, handler.selector, { strict: true })) {
          handler.resolved = new import_manualPromise.ManualPromise();
          this.emit(Page.Events.LocatorHandlerTriggered, uid);
        }
      }
      if (handler.resolved) {
        ++this._locatorHandlerRunningCounter;
        progress.log(` found ${(0, import_utils2.asLocator)(this.browserContext._browser.sdkLanguage(), handler.selector)}, intercepting action to run the handler`);
        const promise = handler.resolved.then(async () => {
          if (!handler.noWaitAfter) {
            progress.log(` locator handler has finished, waiting for ${(0, import_utils2.asLocator)(this.browserContext._browser.sdkLanguage(), handler.selector)} to be hidden`);
            await this.mainFrame().waitForSelector(progress, handler.selector, false, { state: "hidden" });
          } else {
            progress.log(` locator handler has finished`);
          }
        });
        await progress.race(this.openScope.race(promise)).finally(() => --this._locatorHandlerRunningCounter);
        progress.log(` interception handler has finished, continuing`);
      }
    }
  }
  async emulateMedia(progress, options) {
    const oldEmulatedMedia = { ...this._emulatedMedia };
    if (options.media !== void 0)
      this._emulatedMedia.media = options.media;
    if (options.colorScheme !== void 0)
      this._emulatedMedia.colorScheme = options.colorScheme;
    if (options.reducedMotion !== void 0)
      this._emulatedMedia.reducedMotion = options.reducedMotion;
    if (options.forcedColors !== void 0)
      this._emulatedMedia.forcedColors = options.forcedColors;
    if (options.contrast !== void 0)
      this._emulatedMedia.contrast = options.contrast;
    try {
      await progress.race(this.delegate.updateEmulateMedia());
    } catch (error) {
      this._emulatedMedia = oldEmulatedMedia;
      this.delegate.updateEmulateMedia().catch(() => {
      });
      throw error;
    }
  }
  emulatedMedia() {
    const contextOptions = this.browserContext._options;
    return {
      media: this._emulatedMedia.media || "no-override",
      colorScheme: this._emulatedMedia.colorScheme !== void 0 ? this._emulatedMedia.colorScheme : contextOptions.colorScheme ?? "light",
      reducedMotion: this._emulatedMedia.reducedMotion !== void 0 ? this._emulatedMedia.reducedMotion : contextOptions.reducedMotion ?? "no-preference",
      forcedColors: this._emulatedMedia.forcedColors !== void 0 ? this._emulatedMedia.forcedColors : contextOptions.forcedColors ?? "none",
      contrast: this._emulatedMedia.contrast !== void 0 ? this._emulatedMedia.contrast : contextOptions.contrast ?? "no-preference"
    };
}
  async setViewportSize(progress, viewportSize) {
    const oldEmulatedSize = this._emulatedSize;
    try {
      this._setEmulatedSize({ viewport: { ...viewportSize }, screen: { ...viewportSize } });
      await progress.race(this.delegate.updateEmulatedViewportSize());
    } catch (error) {
      this._emulatedSize = oldEmulatedSize;
      this.delegate.updateEmulatedViewportSize().catch(() => {
      });
      throw error;
    }
  }
  setEmulatedSizeFromWindowOpen(emulatedSize) {
    this._setEmulatedSize(emulatedSize);
  }
  _setEmulatedSize(emulatedSize) {
    this._emulatedSize = emulatedSize;
    this.emit(Page.Events.EmulatedSizeChanged);
  }
  emulatedSize() {
    if (this._emulatedSize)
      return this._emulatedSize;
    const contextOptions = this.browserContext._options;
    return contextOptions.viewport ? { viewport: contextOptions.viewport, screen: contextOptions.screen || contextOptions.viewport } : void 0;
  }
  async bringToFront() {
    await this.delegate.bringToFront();
  }
  async addInitScript(source) {
    const initScript = new InitScript(this, source);
    this.initScripts.push(initScript);
    try {
      await this.delegate.addInitScript(initScript);
    } catch (error) {
      initScript.dispose().catch(() => {
      });
      throw error;
    }
    return initScript;
  }
  async removeInitScript(initScript) {
    this.initScripts = this.initScripts.filter((script) => initScript !== script);
    await this.delegate.removeInitScripts([initScript]);
  }
  needsRequestInterception() {
    return this.requestInterceptors.length > 0 || this.browserContext.requestInterceptors.length > 0;
  }
  async addRequestInterceptor(progress, handler, prepend) {
    if (prepend)
      this.requestInterceptors.unshift(handler);
    else
      this.requestInterceptors.push(handler);
    await this.delegate.updateRequestInterception();
  }
  async removeRequestInterceptor(handler) {
    const index = this.requestInterceptors.indexOf(handler);
    if (index === -1)
      return;
    this.requestInterceptors.splice(index, 1);
    await this.browserContext.notifyRoutesInFlightAboutRemovedHandler(handler);
    await this.delegate.updateRequestInterception();
  }
  async expectScreenshot(progress, options) {
    const locator = options.locator;
    const rafrafScreenshot = locator ? async (timeout) => {
      return await locator.frame.rafrafTimeoutScreenshotElementWithProgress(progress, locator.selector, timeout, options || {});
    } : async (timeout) => {
      await this.performActionPreChecks(progress);
      await this.mainFrame().rafrafTimeout(progress, timeout);
      return await this.screenshotter.screenshotPage(progress, options || {});
    };
    const comparator = (0, import_comparators.getComparator)("image/png");
    if (!options.expected && options.isNot)
      return { errorMessage: '"not" matcher requires expected result' };
    try {
      const format = (0, import_screenshotter.validateScreenshotOptions)(options || {});
      if (format !== "png")
        throw new Error("Only PNG screenshots are supported");
    } catch (error) {
      return { errorMessage: error.message };
    }
let intermediateResult;
    const areEqualScreenshots = (actual, expected, previous) => {
      const comparatorResult = actual && expected ? comparator(actual, expected, options) : void 0;
      if (comparatorResult !== void 0 && !!comparatorResult === !!options.isNot)
        return true;
      if (comparatorResult)
        intermediateResult = { errorMessage: comparatorResult.errorMessage, diff: comparatorResult.diff, actual, previous };
      return false;
    };
    try {
      let actual;
      let previous;
      const pollIntervals = [0, 100, 250, 500];
      progress.log(`${(0, import_utils.renderTitleForCall)(progress.metadata)}${options.timeout ? ` with timeout ${options.timeout}ms` : ""}`);
      if (options.expected)
        progress.log(` verifying given screenshot expectation`);
      else
        progress.log(` generating new stable screenshot expectation`);
      let isFirstIteration = true;
      while (true) {
        if (this.isClosed())
          throw new Error("The page has closed");
        const screenshotTimeout = pollIntervals.shift() ?? 1e3;
        if (screenshotTimeout)
          progress.log(`waiting ${screenshotTimeout}ms before taking screenshot`);
        previous = actual;
        actual = await rafrafScreenshot(screenshotTimeout).catch((e) => {
          if (this.mainFrame().isNonRetriableError(e))
            throw e;
          progress.log(`failed to take screenshot - ` + e.message);
          return void 0;
        });
        if (!actual)
          continue;
        const expectation = options.expected && isFirstIteration ? options.expected : previous;
        if (areEqualScreenshots(actual, expectation, previous))
          break;
        if (intermediateResult)
          progress.log(intermediateResult.errorMessage);
        isFirstIteration = false;
      }
      if (!isFirstIteration)
        progress.log(`captured a stable screenshot`);
      if (!options.expected)
        return { actual };
      if (isFirstIteration) {
        progress.log(`screenshot matched expectation`);
        return {};
      }
      if (areEqualScreenshots(actual, options.expected, void 0)) {
        progress.log(`screenshot matched expectation`);
        return {};
      }
      throw new Error(intermediateResult.errorMessage);
    } catch (e) {
      if (js.isJavaScriptErrorInEvaluate(e) || (0, import_selectorParser.isInvalidSelectorError)(e))
        throw e;
      let errorMessage = e.message;
      if (e instanceof import_errors.TimeoutError && intermediateResult?.previous)
        errorMessage = `Failed to take two consecutive stable screenshots.`;
      return {
        log: (0, import_callLog.compressCallLog)(e.message ? [...progress.metadata.log, e.message] : progress.metadata.log),
        ...intermediateResult,
        errorMessage,
        timedOut: e instanceof import_errors.TimeoutError
      };
    }
  }
  async screenshot(progress, options) {
    return await this.screenshotter.screenshotPage(progress, options);
  }
  async close(options = {}) {
    if (this._closedState === "closed")
      return;
    if (options.reason)
      this._closeReason = options.reason;
    const runBeforeUnload = !!options.runBeforeUnload;
    if (!runBeforeUnload)
      await this.screencast.handlePageOrContextClose();
    if (this._closedState !== "closing") {
      if (!runBeforeUnload)
        this._closedState = "closing";
      await this.delegate.closePage(runBeforeUnload).catch((e) => import_debugLogger.debugLogger.log("error", e));
    }
    if (!runBeforeUnload)
      await this.closedPromise;
  }
  isClosed() {
    return this._closedState === "closed";
}
  hasCrashed() {
    return this._crashed;
  }
  isClosedOrClosingOrCrashed() {
    return this._closedState !== "open" || this._crashed;
  }
  addWorker(workerId, worker) {
    this._workers.set(workerId, worker);
    this.emit(Page.Events.Worker, worker);
  }
  removeWorker(workerId) {
    const worker = this._workers.get(workerId);
    if (!worker)
      return;
    worker.didClose();
    this._workers.delete(workerId);
  }
  clearWorkers() {
    for (const [workerId, worker] of this._workers) {
      worker.didClose();
      this._workers.delete(workerId);
    }
  }
  async setFileChooserInterceptedBy(enabled, by) {
    const wasIntercepted = this.fileChooserIntercepted();
    if (enabled)
      this._fileChooserInterceptedBy.add(by);
    else
      this._fileChooserInterceptedBy.delete(by);
    if (wasIntercepted !== this.fileChooserIntercepted())
      await this.delegate.updateFileChooserInterception();
  }
  fileChooserIntercepted() {
    return this._fileChooserInterceptedBy.size > 0;
  }
  frameNavigatedToNewDocument(frame) {
    this.emit(Page.Events.InternalFrameNavigatedToNewDocument, frame);
    this.browserContext.emit(import_browserContext.BrowserContext.Events.InternalFrameNavigatedToNewDocument, frame, this);
    const origin = frame.origin();
    if (origin)
      this.browserContext.addVisitedOrigin(origin);
    if (frame === this.mainFrame()) {
      if (this._consoleMessages.length > 0)
        this._consoleMessages[this._consoleMessages.length - 1][navigationMarkSymbol] = true;
      if (this._pageErrors.length > 0)
        this._pageErrors[this._pageErrors.length - 1][navigationMarkSymbol] = true;
    }
  }
  allInitScripts() {
    const bindings = [...this.browserContext._pageBindings.values(), ...this._pageBindings.values()].map((binding) => binding.initScript);
    if (this.browserContext.bindingsInitScript)
      bindings.unshift(this.browserContext.bindingsInitScript);
    return [...bindings, ...this.browserContext.initScripts, ...this.initScripts];
  }
  getBinding(name) {
    return this._pageBindings.get(name) || this.browserContext._pageBindings.get(name);
  }
  async safeNonStallingEvaluateInAllFrames(expression, world, options = {}) {
    await Promise.all(this.frames().map(async (frame) => {
      try {
        await frame.nonStallingEvaluateInExistingContext(expression, world);
      } catch (e) {
        if (options.throwOnJSErrors && js.isJavaScriptErrorInEvaluate(e))
          throw e;
      }
    }));
  }
  async hideHighlight() {
    await Promise.all(this.frames().map((frame) => frame.hideHighlight().catch(() => {
    })));
  }
  async setDockTile(image) {
    await this.delegate.setDockTile(image);
  }
}
const WorkerEvent = {
  Close: "close"
};
class Worker extends import_instrumentation.SdkObject {
  constructor(parent, url) {
    super(parent, "worker");
    this._executionContextPromise = new import_manualPromise.ManualPromise();
    this._workerScriptLoaded = false;
    this.existingExecutionContext = null;
    this.openScope = new import_utils.LongStandingScope();
    this.url = url;
  }
  static {
    this.Events = WorkerEvent;
  }
  createExecutionContext(delegate) {
    this.existingExecutionContext = new js.ExecutionContext(this, delegate, "worker");
    if (this._workerScriptLoaded)
      this._executionContextPromise.resolve(this.existingExecutionContext);
    return this.existingExecutionContext;
  }
  workerScriptLoaded() {
    this._workerScriptLoaded = true;
    if (this.existingExecutionContext)
      this._executionContextPromise.resolve(this.existingExecutionContext);
  }
  _prepareContextForRestart() {
    if (this.existingExecutionContext)
      this.existingExecutionContext.contextDestroyed("Service worker restarted");
    this.existingExecutionContext = null;
    this._workerScriptLoaded = false;
    this._executionContextPromise = new import_manualPromise.ManualPromise();
  }
  didClose() {
    if (this.existingExecutionContext)
      this.existingExecutionContext.contextDestroyed("Worker was closed");
    this.emit(Worker.Events.Close, this);
    this.openScope.close(new Error("Worker closed"));
  }
  async evaluateExpression(expression, isFunction, arg) {
    return js.evaluateExpression(await this._executionContextPromise, expression, { returnByValue: true, isFunction }, arg);
  }
  async evaluateExpressionHandle(expression, isFunction, arg) {
    return js.evaluateExpression(await this._executionContextPromise, expression, { returnByValue: false, isFunction }, arg);
  }
}
class PageBinding extends import_disposable.DisposableObject {
  static {
    this.kController = "__playwright__binding__controller__";
  }
  static {
    this.kBindingName = "__playwright__binding__";
  }
  static createInitScript(browserContext) {
    return new InitScript(browserContext, `
      (() => {
        const module = {};
        ${rawBindingsControllerSource.source}
        const property = '${PageBinding.kController}';
        if (!globalThis[property])
          globalThis[property] = new (module.exports.BindingsController())(globalThis, '${PageBinding.kBindingName}');
      })();
    `);
  }
  constructor(parent, name, playwrightFunction, needsHandle) {
    super(parent);
    this.name = name;
    this.playwrightFunction = playwrightFunction;
    this.initScript = new InitScript(parent, `globalThis['${PageBinding.kController}'].addBinding(${JSON.stringify(name)}, ${needsHandle})`);
    this.needsHandle = needsHandle;
    this.cleanupScript = `globalThis['${PageBinding.kController}'].removeBinding(${JSON.stringify(name)})`;
  }
  static async dispatch(page, payload, context) {
    const { name, seq, serializedArgs } = JSON.parse(payload);
    try {
      (0, import_utils.assert)(context.world);
      const binding = page.getBinding(name);
      if (!binding)
        throw new Error(`Function "${name}" is not exposed`);
      let result;
      if (binding.needsHandle) {
        const handle = await context.evaluateExpressionHandle(`arg => globalThis['${PageBinding.kController}'].takeBindingHandle(arg)`, { isFunction: true }, { name, seq }).catch((e) => null);
        result = await binding.playwrightFunction({ frame: context.frame, page, context: page.browserContext }, handle);
      } else {
        if (!Array.isArray(serializedArgs))
          throw new Error(`serializedArgs is not an array. This can happen when Array.prototype.toJSON is defined incorrectly`);
        const args = serializedArgs.map((a) => (0, import_utilityScriptSerializers.parseEvaluationResultValue)(a));
        result = await binding.playwrightFunction({ frame: context.frame, page, context: page.browserContext }, ...args);
|
||||||
|
}
|
||||||
|
context.evaluateExpressionHandle(`arg => globalThis['${PageBinding.kController}'].deliverBindingResult(arg)`, { isFunction: true }, { name, seq, result }).catch((e) => import_debugLogger.debugLogger.log("error", e));
|
||||||
|
} catch (error) {
|
||||||
|
context.evaluateExpressionHandle(`arg => globalThis['${PageBinding.kController}'].deliverBindingResult(arg)`, { isFunction: true }, { name, seq, error }).catch((e) => import_debugLogger.debugLogger.log("error", e));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
async dispose() {
|
||||||
|
await this.parent.removeExposedBinding(this);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
class InitScript extends import_disposable.DisposableObject {
|
||||||
|
constructor(owner, source) {
|
||||||
|
super(owner);
|
||||||
|
this.source = `(() => {
|
||||||
|
${source}
|
||||||
|
})();`;
|
||||||
|
}
|
||||||
|
async dispose() {
|
||||||
|
await this.parent.removeInitScript(this);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
async function ariaSnapshotForFrame(progress, frame, options = {}) {
|
||||||
|
const snapshot = await frame.retryWithProgressAndTimeouts(progress, [1e3, 2e3, 4e3, 8e3], async (continuePolling) => {
|
||||||
|
try {
|
||||||
|
const context = await progress.race(frame._utilityContext());
|
||||||
|
const injectedScript = await progress.race(context.injectedScript());
|
||||||
|
const snapshotOrRetry = await progress.race(injectedScript.evaluate((injected, options2) => {
|
||||||
|
if (options2.info) {
|
||||||
|
const element = injected.querySelector(options2.info.parsed, injected.document, options2.info.strict);
|
||||||
|
if (!element)
|
||||||
|
return false;
|
||||||
|
return injected.incrementalAriaSnapshot(element, options2);
|
||||||
|
}
|
||||||
|
const node = injected.document.body;
|
||||||
|
if (!node)
|
||||||
|
return true;
|
||||||
|
return injected.incrementalAriaSnapshot(node, options2);
|
||||||
|
}, {
|
||||||
|
mode: options.mode ?? "default",
|
||||||
|
refPrefix: frame.seq ? "f" + frame.seq : "",
|
||||||
|
track: options.track,
|
||||||
|
doNotRenderActive: options.doNotRenderActive,
|
||||||
|
info: options.info,
|
||||||
|
depth: options.depth
|
||||||
|
}));
|
||||||
|
if (snapshotOrRetry === true)
|
||||||
|
return continuePolling;
|
||||||
|
if (snapshotOrRetry === false)
|
||||||
|
throw new import_dom.NonRecoverableDOMError(`Selector "${(0, import_selectorParser.stringifySelector)(options.info.parsed)}" does not match any element`);
|
||||||
|
return snapshotOrRetry;
|
||||||
|
} catch (e) {
|
||||||
|
if (frame.isNonRetriableError(e))
|
||||||
|
throw e;
|
||||||
|
return continuePolling;
|
||||||
|
}
|
||||||
|
});
|
||||||
|
const renderedIframeRefs = snapshot.iframeRefs.filter((ref) => ref in snapshot.iframeDepths);
|
||||||
|
const childSnapshotPromises = renderedIframeRefs.map((ref) => {
|
||||||
|
const iframeDepth = snapshot.iframeDepths[ref];
|
||||||
|
const childDepth = options.depth ? options.depth - iframeDepth - 1 : void 0;
|
||||||
|
return ariaSnapshotFrameRef(progress, frame, ref, { ...options, depth: childDepth });
|
||||||
|
});
|
||||||
|
const childSnapshots = await Promise.all(childSnapshotPromises);
|
||||||
|
const full = [];
|
||||||
|
let incremental;
|
||||||
|
if (snapshot.incremental !== void 0) {
|
||||||
|
incremental = snapshot.incremental.split("\n");
|
||||||
|
for (let i = 0; i < renderedIframeRefs.length; i++) {
|
||||||
|
const childSnapshot = childSnapshots[i];
|
||||||
|
if (childSnapshot.incremental)
|
||||||
|
incremental.push(...childSnapshot.incremental);
|
||||||
|
else if (childSnapshot.full.length)
|
||||||
|
incremental.push("- <changed> iframe [ref=" + renderedIframeRefs[i] + "]:", ...childSnapshot.full.map((l) => " " + l));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for (const line of snapshot.full.split("\n")) {
|
||||||
|
const match = line.match(/^(\s*)- iframe (?:\[active\] )?\[ref=([^\]]*)\]/);
|
||||||
|
if (!match) {
|
||||||
|
full.push(line);
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
const leadingSpace = match[1];
|
||||||
|
const ref = match[2];
|
||||||
|
const childSnapshot = childSnapshots[renderedIframeRefs.indexOf(ref)] ?? { full: [] };
|
||||||
|
full.push(childSnapshot.full.length ? line + ":" : line);
|
||||||
|
full.push(...childSnapshot.full.map((l) => leadingSpace + " " + l));
|
||||||
|
}
|
||||||
|
return { full, incremental };
|
||||||
|
}
|
||||||
|
async function ariaSnapshotFrameRef(progress, parentFrame, frameRef, options) {
|
||||||
|
const frameSelector = `aria-ref=${frameRef} >> internal:control=enter-frame`;
|
||||||
|
const frameBodySelector = `${frameSelector} >> body`;
|
||||||
|
const child = await progress.race(parentFrame.selectors.resolveFrameForSelector(frameBodySelector, { strict: true }));
|
||||||
|
if (!child)
|
||||||
|
return { full: [] };
|
||||||
|
try {
|
||||||
|
return await ariaSnapshotForFrame(progress, child.frame, { ...options, info: void 0 });
|
||||||
|
} catch {
|
||||||
|
return { full: [] };
|
||||||
|
}
|
||||||
|
}
|
||||||
|
function ensureArrayLimit(array, limit) {
|
||||||
|
if (array.length > limit)
|
||||||
|
return array.splice(0, limit / 10);
|
||||||
|
return [];
|
||||||
|
}
|
||||||
|
// Annotate the CommonJS export names for ESM import in node:
|
||||||
|
0 && (module.exports = {
|
||||||
|
InitScript,
|
||||||
|
Page,
|
||||||
|
PageBinding,
|
||||||
|
Worker,
|
||||||
|
WorkerEvent,
|
||||||
|
ariaSnapshotForFrame
|
||||||
|
});
|
||||||
@ -0,0 +1,22 @@
import type { ColumnBuilderBaseConfig } from "../../column-builder.cjs";
import type { ColumnBaseConfig } from "../../column.cjs";
import { entityKind } from "../../entity.cjs";
import { GelColumn, GelColumnBuilder } from "./common.cjs";
export type GelUUIDBuilderInitial<TName extends string> = GelUUIDBuilder<{
  name: TName;
  dataType: 'string';
  columnType: 'GelUUID';
  data: string;
  driverParam: string;
  enumValues: undefined;
}>;
export declare class GelUUIDBuilder<T extends ColumnBuilderBaseConfig<'string', 'GelUUID'>> extends GelColumnBuilder<T> {
  static readonly [entityKind]: string;
  constructor(name: T['name']);
}
export declare class GelUUID<T extends ColumnBaseConfig<'string', 'GelUUID'>> extends GelColumn<T> {
  static readonly [entityKind]: string;
  getSQLType(): string;
}
export declare function uuid(): GelUUIDBuilderInitial<''>;
export declare function uuid<TName extends string>(name: TName): GelUUIDBuilderInitial<TName>;
@ -0,0 +1,113 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var table_exports = {};
__export(table_exports, {
  BaseName: () => BaseName,
  Columns: () => Columns,
  ExtraConfigBuilder: () => ExtraConfigBuilder,
  ExtraConfigColumns: () => ExtraConfigColumns,
  IsAlias: () => IsAlias,
  OriginalName: () => OriginalName,
  Schema: () => Schema,
  Table: () => Table,
  getTableName: () => getTableName,
  getTableUniqueName: () => getTableUniqueName,
  isTable: () => isTable
});
module.exports = __toCommonJS(table_exports);
var import_entity = require("./entity.cjs");
var import_table_utils = require("./table.utils.cjs");
const Schema = Symbol.for("drizzle:Schema");
const Columns = Symbol.for("drizzle:Columns");
const ExtraConfigColumns = Symbol.for("drizzle:ExtraConfigColumns");
const OriginalName = Symbol.for("drizzle:OriginalName");
const BaseName = Symbol.for("drizzle:BaseName");
const IsAlias = Symbol.for("drizzle:IsAlias");
const ExtraConfigBuilder = Symbol.for("drizzle:ExtraConfigBuilder");
const IsDrizzleTable = Symbol.for("drizzle:IsDrizzleTable");
class Table {
  static [import_entity.entityKind] = "Table";
  /** @internal */
  static Symbol = {
    Name: import_table_utils.TableName,
    Schema,
    OriginalName,
    Columns,
    ExtraConfigColumns,
    BaseName,
    IsAlias,
    ExtraConfigBuilder
  };
  /**
   * @internal
   * Can be changed if the table is aliased.
   */
  [import_table_utils.TableName];
  /**
   * @internal
   * Used to store the original name of the table, before any aliasing.
   */
  [OriginalName];
  /** @internal */
  [Schema];
  /** @internal */
  [Columns];
  /** @internal */
  [ExtraConfigColumns];
  /**
   * @internal
   * Used to store the table name before the transformation via the `tableCreator` functions.
   */
  [BaseName];
  /** @internal */
  [IsAlias] = false;
  /** @internal */
  [IsDrizzleTable] = true;
  /** @internal */
  [ExtraConfigBuilder] = void 0;
  constructor(name, schema, baseName) {
    this[import_table_utils.TableName] = this[OriginalName] = name;
    this[Schema] = schema;
    this[BaseName] = baseName;
  }
}
function isTable(table) {
  return typeof table === "object" && table !== null && IsDrizzleTable in table;
}
function getTableName(table) {
  return table[import_table_utils.TableName];
}
function getTableUniqueName(table) {
  return `${table[Schema] ?? "public"}.${table[import_table_utils.TableName]}`;
}
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  BaseName,
  Columns,
  ExtraConfigBuilder,
  ExtraConfigColumns,
  IsAlias,
  OriginalName,
  Schema,
  Table,
  getTableName,
  getTableUniqueName,
  isTable
});
//# sourceMappingURL=table.cjs.map
@ -0,0 +1 @@
{"version":3,"file":"ColdObservable.js","sourceRoot":"","sources":["../../../../src/internal/testing/ColdObservable.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;AAAA,4CAA2C;AAC3C,gDAA+C;AAI/C,+DAA8D;AAC9D,mDAAkD;AAElD,gDAAsD;AAEtD;IAAuC,kCAAa;IAQlD,wBAAmB,QAAuB,EAAE,SAAoB;QAAhE,YACE,kBAAM,UAA+B,UAA2B;YAC9D,IAAM,UAAU,GAAsB,IAAW,CAAC;YAClD,IAAM,KAAK,GAAG,UAAU,CAAC,kBAAkB,EAAE,CAAC;YAC9C,IAAM,YAAY,GAAG,IAAI,2BAAY,EAAE,CAAC;YACxC,YAAY,CAAC,GAAG,CACd,IAAI,2BAAY,CAAC;gBACf,UAAU,CAAC,oBAAoB,CAAC,KAAK,CAAC,CAAC;YACzC,CAAC,CAAC,CACH,CAAC;YACF,UAAU,CAAC,gBAAgB,CAAC,UAAU,CAAC,CAAC;YACxC,OAAO,YAAY,CAAC;QACtB,CAAC,CAAC,SAEH;QAdkB,cAAQ,GAAR,QAAQ,CAAe;QAPnC,mBAAa,GAAsB,EAAE,CAAC;QAoB3C,KAAI,CAAC,SAAS,GAAG,SAAS,CAAC;;IAC7B,CAAC;IAED,yCAAgB,GAAhB,UAAiB,UAA2B;QAC1C,IAAM,cAAc,GAAG,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC;QAC5C,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,cAAc,EAAE,CAAC,EAAE,EAAE;YACvC,IAAM,OAAO,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;YACjC,UAAU,CAAC,GAAG,CACZ,IAAI,CAAC,SAAS,CAAC,QAAQ,CACrB,UAAC,KAAK;gBACE,IAAA,KAAyD,KAAM,EAAlD,YAAY,0BAAA,EAAgB,WAAW,gBAAW,CAAC;gBACtE,kCAAmB,CAAC,YAAY,EAAE,WAAW,CAAC,CAAC;YACjD,CAAC,EACD,OAAO,CAAC,KAAK,EACb,EAAE,OAAO,SAAA,EAAE,UAAU,YAAA,EAAE,CACxB,CACF,CAAC;SACH;IACH,CAAC;IACH,qBAAC;AAAD,CAAC,AAxCD,CAAuC,uBAAU,GAwChD;AAxCY,wCAAc;AAyC3B,yBAAW,CAAC,cAAc,EAAE,CAAC,2CAAoB,CAAC,CAAC,CAAC"}
@ -0,0 +1,6 @@
import { scanInternals } from './scanInternals';
import { operate } from '../util/lift';
export function reduce(accumulator, seed) {
  return operate(scanInternals(accumulator, seed, arguments.length >= 2, false, true));
}
//# sourceMappingURL=reduce.js.map
@ -0,0 +1,22 @@
import { Scheduler } from '../Scheduler';
import { SubscriptionLog } from './SubscriptionLog';

export class SubscriptionLoggable {
  public subscriptions: SubscriptionLog[] = [];
  // @ts-ignore: Property has no initializer and is not definitely assigned
  scheduler: Scheduler;

  logSubscribedFrame(): number {
    this.subscriptions.push(new SubscriptionLog(this.scheduler.now()));
    return this.subscriptions.length - 1;
  }

  logUnsubscribedFrame(index: number) {
    const subscriptionLogs = this.subscriptions;
    const oldSubscriptionLog = subscriptionLogs[index];
    subscriptionLogs[index] = new SubscriptionLog(
      oldSubscriptionLog.subscribedFrame,
      this.scheduler.now()
    );
  }
}
@ -0,0 +1,150 @@
import type { BuildColumns } from "../column-builder.cjs";
import { entityKind } from "../entity.cjs";
import type { TypedQueryBuilder } from "../query-builders/query-builder.cjs";
import type { AddAliasToSelection } from "../query-builders/select.types.cjs";
import type { ColumnsSelection, SQL } from "../sql/sql.cjs";
import type { RequireAtLeastOne } from "../utils.cjs";
import type { GelColumnBuilderBase } from "./columns/common.cjs";
import { QueryBuilder } from "./query-builders/query-builder.cjs";
import { GelViewBase } from "./view-base.cjs";
import { GelViewConfig } from "./view-common.cjs";
export type ViewWithConfig = RequireAtLeastOne<{
  checkOption: 'local' | 'cascaded';
  securityBarrier: boolean;
  securityInvoker: boolean;
}>;
export declare class DefaultViewBuilderCore<TConfig extends {
  name: string;
  columns?: unknown;
}> {
  protected name: TConfig['name'];
  protected schema: string | undefined;
  static readonly [entityKind]: string;
  readonly _: {
    readonly name: TConfig['name'];
    readonly columns: TConfig['columns'];
  };
  constructor(name: TConfig['name'], schema: string | undefined);
  protected config: {
    with?: ViewWithConfig;
  };
  with(config: ViewWithConfig): this;
}
export declare class ViewBuilder<TName extends string = string> extends DefaultViewBuilderCore<{
  name: TName;
}> {
  static readonly [entityKind]: string;
  as<TSelectedFields extends ColumnsSelection>(qb: TypedQueryBuilder<TSelectedFields> | ((qb: QueryBuilder) => TypedQueryBuilder<TSelectedFields>)): GelViewWithSelection<TName, false, AddAliasToSelection<TSelectedFields, TName, 'gel'>>;
}
export declare class ManualViewBuilder<TName extends string = string, TColumns extends Record<string, GelColumnBuilderBase> = Record<string, GelColumnBuilderBase>> extends DefaultViewBuilderCore<{
  name: TName;
  columns: TColumns;
}> {
  static readonly [entityKind]: string;
  private columns;
  constructor(name: TName, columns: TColumns, schema: string | undefined);
  existing(): GelViewWithSelection<TName, true, BuildColumns<TName, TColumns, 'gel'>>;
  as(query: SQL): GelViewWithSelection<TName, false, BuildColumns<TName, TColumns, 'gel'>>;
}
export type GelMaterializedViewWithConfig = RequireAtLeastOne<{
  fillfactor: number;
  toastTupleTarget: number;
  parallelWorkers: number;
  autovacuumEnabled: boolean;
  vacuumIndexCleanup: 'auto' | 'off' | 'on';
  vacuumTruncate: boolean;
  autovacuumVacuumThreshold: number;
  autovacuumVacuumScaleFactor: number;
  autovacuumVacuumCostDelay: number;
  autovacuumVacuumCostLimit: number;
  autovacuumFreezeMinAge: number;
  autovacuumFreezeMaxAge: number;
  autovacuumFreezeTableAge: number;
  autovacuumMultixactFreezeMinAge: number;
  autovacuumMultixactFreezeMaxAge: number;
  autovacuumMultixactFreezeTableAge: number;
  logAutovacuumMinDuration: number;
  userCatalogTable: boolean;
}>;
export declare class MaterializedViewBuilderCore<TConfig extends {
  name: string;
  columns?: unknown;
}> {
  protected name: TConfig['name'];
  protected schema: string | undefined;
  static readonly [entityKind]: string;
  _: {
    readonly name: TConfig['name'];
    readonly columns: TConfig['columns'];
  };
  constructor(name: TConfig['name'], schema: string | undefined);
  protected config: {
    with?: GelMaterializedViewWithConfig;
    using?: string;
    tablespace?: string;
    withNoData?: boolean;
  };
  using(using: string): this;
  with(config: GelMaterializedViewWithConfig): this;
  tablespace(tablespace: string): this;
  withNoData(): this;
}
export declare class MaterializedViewBuilder<TName extends string = string> extends MaterializedViewBuilderCore<{
  name: TName;
}> {
  static readonly [entityKind]: string;
  as<TSelectedFields extends ColumnsSelection>(qb: TypedQueryBuilder<TSelectedFields> | ((qb: QueryBuilder) => TypedQueryBuilder<TSelectedFields>)): GelMaterializedViewWithSelection<TName, false, AddAliasToSelection<TSelectedFields, TName, 'gel'>>;
}
export declare class ManualMaterializedViewBuilder<TName extends string = string, TColumns extends Record<string, GelColumnBuilderBase> = Record<string, GelColumnBuilderBase>> extends MaterializedViewBuilderCore<{
  name: TName;
  columns: TColumns;
}> {
  static readonly [entityKind]: string;
  private columns;
  constructor(name: TName, columns: TColumns, schema: string | undefined);
  existing(): GelMaterializedViewWithSelection<TName, true, BuildColumns<TName, TColumns, 'gel'>>;
  as(query: SQL): GelMaterializedViewWithSelection<TName, false, BuildColumns<TName, TColumns, 'gel'>>;
}
export declare class GelView<TName extends string = string, TExisting extends boolean = boolean, TSelectedFields extends ColumnsSelection = ColumnsSelection> extends GelViewBase<TName, TExisting, TSelectedFields> {
  static readonly [entityKind]: string;
  [GelViewConfig]: {
    with?: ViewWithConfig;
  } | undefined;
  constructor({ GelConfig, config }: {
    GelConfig: {
      with?: ViewWithConfig;
    } | undefined;
    config: {
      name: TName;
      schema: string | undefined;
      selectedFields: ColumnsSelection;
      query: SQL | undefined;
    };
  });
}
export type GelViewWithSelection<TName extends string = string, TExisting extends boolean = boolean, TSelectedFields extends ColumnsSelection = ColumnsSelection> = GelView<TName, TExisting, TSelectedFields> & TSelectedFields;
export declare const GelMaterializedViewConfig: unique symbol;
export declare class GelMaterializedView<TName extends string = string, TExisting extends boolean = boolean, TSelectedFields extends ColumnsSelection = ColumnsSelection> extends GelViewBase<TName, TExisting, TSelectedFields> {
  static readonly [entityKind]: string;
  readonly [GelMaterializedViewConfig]: {
    readonly with?: GelMaterializedViewWithConfig;
    readonly using?: string;
    readonly tablespace?: string;
    readonly withNoData?: boolean;
  } | undefined;
  constructor({ GelConfig, config }: {
    GelConfig: {
      with: GelMaterializedViewWithConfig | undefined;
      using: string | undefined;
      tablespace: string | undefined;
      withNoData: boolean | undefined;
    } | undefined;
    config: {
      name: TName;
      schema: string | undefined;
      selectedFields: ColumnsSelection;
      query: SQL | undefined;
    };
  });
}
export type GelMaterializedViewWithSelection<TName extends string = string, TExisting extends boolean = boolean, TSelectedFields extends ColumnsSelection = ColumnsSelection> = GelMaterializedView<TName, TExisting, TSelectedFields> & TSelectedFields;
@ -0,0 +1,69 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var playwright_exports = {};
__export(playwright_exports, {
  Playwright: () => Playwright,
  createPlaywright: () => createPlaywright
});
module.exports = __toCommonJS(playwright_exports);
var import_android = require("./android/android");
var import_backendAdb = require("./android/backendAdb");
var import_bidiChromium = require("./bidi/bidiChromium");
var import_bidiFirefox = require("./bidi/bidiFirefox");
var import_chromium = require("./chromium/chromium");
var import_debugController = require("./debugController");
var import_electron = require("./electron/electron");
var import_firefox = require("./firefox/firefox");
var import_instrumentation = require("./instrumentation");
var import_webkit = require("./webkit/webkit");
class Playwright extends import_instrumentation.SdkObject {
  constructor(options) {
    super((0, import_instrumentation.createRootSdkObject)(), void 0, "Playwright");
    this._allPages = /* @__PURE__ */ new Set();
    this._allBrowsers = /* @__PURE__ */ new Set();
    this.options = options;
    this.attribution.playwright = this;
    this.instrumentation.addListener({
      onBrowserOpen: (browser) => this._allBrowsers.add(browser),
      onBrowserClose: (browser) => this._allBrowsers.delete(browser),
      onPageOpen: (page) => this._allPages.add(page),
      onPageClose: (page) => this._allPages.delete(page)
    }, null);
    this.chromium = new import_chromium.Chromium(this, new import_bidiChromium.BidiChromium(this));
    this.firefox = new import_firefox.Firefox(this, new import_bidiFirefox.BidiFirefox(this));
    this.webkit = new import_webkit.WebKit(this);
    this.electron = new import_electron.Electron(this);
    this.android = new import_android.Android(this, new import_backendAdb.AdbBackend());
    this.debugController = new import_debugController.DebugController(this);
  }
  allBrowsers() {
    return [...this._allBrowsers];
  }
  allPages() {
    return [...this._allPages];
  }
}
function createPlaywright(options) {
  return new Playwright(options);
}
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  Playwright,
  createPlaywright
});
@ -0,0 +1,21 @@
MIT License

Copyright (c) Hiroki Osame <hiroki.osame@gmail.com>

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
@ -0,0 +1,7 @@
export * from "./blob.js";
export * from "./common.js";
export * from "./custom.js";
export * from "./integer.js";
export * from "./numeric.js";
export * from "./real.js";
export * from "./text.js";
File diff suppressed because it is too large
@ -0,0 +1,4 @@
import type { BuildAliasTable } from "./query-builders/select.types.cjs";
import type { GelTable } from "./table.cjs";
import type { GelViewBase } from "./view-base.cjs";
export declare function alias<TTable extends GelTable | GelViewBase, TAlias extends string>(table: TTable, alias: TAlias): BuildAliasTable<TTable, TAlias>;
@ -0,0 +1 @@
export let nanoid=(t=21)=>crypto.getRandomValues(new Uint8Array(t)).reduce(((t,e)=>t+=(e&=63)<36?e.toString(36):e<62?(e-26).toString(36).toUpperCase():e<63?"_":"-"),"");
@ -0,0 +1,86 @@
import Container, { ContainerProps } from './container.js'
import Document from './document.js'
import { ProcessOptions } from './postcss.js'
import Result from './result.js'

declare namespace Root {
  export interface RootRaws extends Record<string, any> {
    /**
     * The space symbols after the last child to the end of file.
     */
    after?: string

    /**
     * Non-CSS code after `Root`, when `Root` is inside `Document`.
     *
     * **Experimental:** some aspects of this node could change within minor
     * or patch version releases.
     */
    codeAfter?: string

    /**
     * Non-CSS code before `Root`, when `Root` is inside `Document`.
     *
     * **Experimental:** some aspects of this node could change within minor
     * or patch version releases.
     */
    codeBefore?: string

    /**
     * Is the last child has an (optional) semicolon.
     */
    semicolon?: boolean
  }

  export interface RootProps extends ContainerProps {
    /**
     * Information used to generate byte-to-byte equal node string
     * as it was in the origin input.
     * */
    raws?: RootRaws
  }

  export { Root_ as default }
}

/**
 * Represents a CSS file and contains all its parsed nodes.
 *
 * ```js
 * const root = postcss.parse('a{color:black} b{z-index:2}')
 * root.type //=> 'root'
 * root.nodes.length //=> 2
 * ```
 */
declare class Root_ extends Container {
  nodes: NonNullable<Container['nodes']>
  parent: Document | undefined
  raws: Root.RootRaws
  type: 'root'

  constructor(defaults?: Root.RootProps)

  assign(overrides: object | Root.RootProps): this
  clone(overrides?: Partial<Root.RootProps>): this
  cloneAfter(overrides?: Partial<Root.RootProps>): this
  cloneBefore(overrides?: Partial<Root.RootProps>): this

  /**
   * Returns a `Result` instance representing the root’s CSS.
   *
   * ```js
   * const root1 = postcss.parse(css1, { from: 'a.css' })
   * const root2 = postcss.parse(css2, { from: 'b.css' })
   * root1.append(root2)
   * const result = root1.toResult({ to: 'all.css', map: true })
   * ```
   *
   * @param options Options.
   * @return Result with current root’s CSS.
   */
  toResult(options?: ProcessOptions): Result
}

declare class Root extends Root_ {}

export = Root
@ -0,0 +1,56 @@
import { Observable } from '../Observable';
import { ObservedValueOf, ObservableInput } from '../types';
import { innerFrom } from './innerFrom';

/**
 * Creates an Observable that, on subscribe, calls an Observable factory to
 * make an Observable for each new Observer.
 *
 * <span class="informal">Creates the Observable lazily, that is, only when it
 * is subscribed.
 * </span>
 *
 * ![](defer.png)
 *
 * `defer` allows you to create an Observable only when the Observer
 * subscribes. It waits until an Observer subscribes to it, calls the given
 * factory function to get an Observable -- where a factory function typically
 * generates a new Observable -- and subscribes the Observer to this Observable.
 * In case the factory function returns a falsy value, then EMPTY is used as
 * Observable instead. Last but not least, an exception during the factory
 * function call is transferred to the Observer by calling `error`.
 *
 * ## Example
 *
 * Subscribe to either an Observable of clicks or an Observable of interval, at random
 *
 * ```ts
 * import { defer, fromEvent, interval } from 'rxjs';
 *
 * const clicksOrInterval = defer(() => {
 *   return Math.random() > 0.5
 *     ? fromEvent(document, 'click')
 *     : interval(1000);
 * });
 * clicksOrInterval.subscribe(x => console.log(x));
 *
 * // Results in the following behavior:
 * // If the result of Math.random() is greater than 0.5 it will listen
 * // for clicks anywhere on the "document"; when document is clicked it
 * // will log a MouseEvent object to the console. If the result is less
 * // than 0.5 it will emit ascending numbers, one every second (1000ms).
 * ```
 *
 * @see {@link Observable}
 *
 * @param observableFactory The Observable factory function to invoke for each
 * Observer that subscribes to the output Observable. May also return any
 * `ObservableInput`, which will be converted on the fly to an Observable.
 * @return An Observable whose Observers' subscriptions trigger an invocation of the
 * given Observable factory function.
 */
export function defer<R extends ObservableInput<any>>(observableFactory: () => R): Observable<ObservedValueOf<R>> {
  return new Observable<ObservedValueOf<R>>((subscriber) => {
    innerFrom(observableFactory()).subscribe(subscriber);
  });
}
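The `defer` jsdoc above describes lazy construction: the factory runs only when an Observer subscribes. A minimal plain-JavaScript sketch of that pattern, assuming a tiny `{ subscribe(observer) }` shape instead of the real rxjs `Observable` class (the `of` and `defer` helpers here are illustrative, not the library's implementations):

```javascript
// Minimal observable-like source that emits the given values synchronously.
function of(...values) {
  return {
    subscribe(observer) {
      for (const v of values) observer.next(v);
      observer.complete();
    },
  };
}

// defer pattern: the factory is invoked once per subscribe call, never earlier.
function defer(factory) {
  return {
    subscribe(observer) {
      let source;
      try {
        source = factory();
      } catch (err) {
        // An exception during the factory call is forwarded via error().
        observer.error(err);
        return;
      }
      source.subscribe(observer);
    },
  };
}

let calls = 0;
const lazy = defer(() => {
  calls += 1;
  return of(1, 2, 3);
});

// Nothing has run yet: calls is still 0 until the first subscribe.
const seen = [];
lazy.subscribe({ next: (v) => seen.push(v), error: () => {}, complete: () => {} });
```

The key property is that side effects inside the factory are deferred to subscription time, which is exactly what the doc comment's click-or-interval example relies on.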
@ -0,0 +1,10 @@
declare function setToStringTag(
  object: object & { [Symbol.toStringTag]?: unknown },
  value: string | unknown,
  options?: {
    force?: boolean;
    nonConfigurable?: boolean;
  },
): void;

export = setToStringTag;
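The declaration above gives only the signature. A minimal sketch of what such a helper can do, assuming it simply defines the well-known `Symbol.toStringTag` property; the `force`/`nonConfigurable` handling here is an assumption based on the option names, not the package's actual implementation:

```javascript
// Hypothetical sketch: tag an object so Object.prototype.toString reports it.
function setToStringTag(object, value, options = {}) {
  // Skip when a tag already exists, unless force is requested.
  if (options.force || !(Symbol.toStringTag in object)) {
    Object.defineProperty(object, Symbol.toStringTag, {
      configurable: !options.nonConfigurable,
      enumerable: false,
      value,
      writable: false,
    });
  }
}

const obj = {};
setToStringTag(obj, 'Example');
const tag = Object.prototype.toString.call(obj); // '[object Example]'
```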
@ -0,0 +1 @@
{"version":3,"sources":["../../../src/mysql-core/columns/real.ts"],"sourcesContent":["import type { ColumnBuilderBaseConfig, ColumnBuilderRuntimeConfig, MakeColumnConfig } from '~/column-builder.ts';\nimport type { ColumnBaseConfig } from '~/column.ts';\nimport { entityKind } from '~/entity.ts';\nimport type { AnyMySqlTable } from '~/mysql-core/table.ts';\nimport { getColumnNameAndConfig } from '~/utils.ts';\nimport { MySqlColumnBuilderWithAutoIncrement, MySqlColumnWithAutoIncrement } from './common.ts';\n\nexport type MySqlRealBuilderInitial<TName extends string> = MySqlRealBuilder<{\n\tname: TName;\n\tdataType: 'number';\n\tcolumnType: 'MySqlReal';\n\tdata: number;\n\tdriverParam: number | string;\n\tenumValues: undefined;\n}>;\n\nexport class MySqlRealBuilder<T extends ColumnBuilderBaseConfig<'number', 'MySqlReal'>>\n\textends MySqlColumnBuilderWithAutoIncrement<\n\t\tT,\n\t\tMySqlRealConfig\n\t>\n{\n\tstatic override readonly [entityKind]: string = 'MySqlRealBuilder';\n\n\tconstructor(name: T['name'], config: MySqlRealConfig | undefined) {\n\t\tsuper(name, 'number', 'MySqlReal');\n\t\tthis.config.precision = config?.precision;\n\t\tthis.config.scale = config?.scale;\n\t}\n\n\t/** @internal */\n\toverride build<TTableName extends string>(\n\t\ttable: AnyMySqlTable<{ name: TTableName }>,\n\t): MySqlReal<MakeColumnConfig<T, TTableName>> {\n\t\treturn new MySqlReal<MakeColumnConfig<T, TTableName>>(table, this.config as ColumnBuilderRuntimeConfig<any, any>);\n\t}\n}\n\nexport class MySqlReal<T extends ColumnBaseConfig<'number', 'MySqlReal'>> extends MySqlColumnWithAutoIncrement<\n\tT,\n\tMySqlRealConfig\n> {\n\tstatic override readonly [entityKind]: string = 'MySqlReal';\n\n\tprecision: number | undefined = this.config.precision;\n\tscale: number | undefined = this.config.scale;\n\n\tgetSQLType(): string {\n\t\tif (this.precision !== undefined && this.scale !== undefined) {\n\t\t\treturn `real(${this.precision}, ${this.scale})`;\n\t\t} else if (this.precision === 
undefined) {\n\t\t\treturn 'real';\n\t\t} else {\n\t\t\treturn `real(${this.precision})`;\n\t\t}\n\t}\n}\n\nexport interface MySqlRealConfig {\n\tprecision?: number;\n\tscale?: number;\n}\n\nexport function real(): MySqlRealBuilderInitial<''>;\nexport function real(\n\tconfig?: MySqlRealConfig,\n): MySqlRealBuilderInitial<''>;\nexport function real<TName extends string>(\n\tname: TName,\n\tconfig?: MySqlRealConfig,\n): MySqlRealBuilderInitial<TName>;\nexport function real(a?: string | MySqlRealConfig, b: MySqlRealConfig = {}) {\n\tconst { name, config } = getColumnNameAndConfig<MySqlRealConfig>(a, b);\n\treturn new MySqlRealBuilder(name, config);\n}\n"],"mappings":";;;;;;;;;;;;;;;;;;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAEA,oBAA2B;AAE3B,mBAAuC;AACvC,oBAAkF;AAW3E,MAAM,yBACJ,kDAIT;AAAA,EACC,QAA0B,wBAAU,IAAY;AAAA,EAEhD,YAAY,MAAiB,QAAqC;AACjE,UAAM,MAAM,UAAU,WAAW;AACjC,SAAK,OAAO,YAAY,QAAQ;AAChC,SAAK,OAAO,QAAQ,QAAQ;AAAA,EAC7B;AAAA;AAAA,EAGS,MACR,OAC6C;AAC7C,WAAO,IAAI,UAA2C,OAAO,KAAK,MAA8C;AAAA,EACjH;AACD;AAEO,MAAM,kBAAqE,2CAGhF;AAAA,EACD,QAA0B,wBAAU,IAAY;AAAA,EAEhD,YAAgC,KAAK,OAAO;AAAA,EAC5C,QAA4B,KAAK,OAAO;AAAA,EAExC,aAAqB;AACpB,QAAI,KAAK,cAAc,UAAa,KAAK,UAAU,QAAW;AAC7D,aAAO,QAAQ,KAAK,SAAS,KAAK,KAAK,KAAK;AAAA,IAC7C,WAAW,KAAK,cAAc,QAAW;AACxC,aAAO;AAAA,IACR,OAAO;AACN,aAAO,QAAQ,KAAK,SAAS;AAAA,IAC9B;AAAA,EACD;AACD;AAeO,SAAS,KAAK,GAA8B,IAAqB,CAAC,GAAG;AAC3E,QAAM,EAAE,MAAM,OAAO,QAAI,qCAAwC,GAAG,CAAC;AACrE,SAAO,IAAI,iBAAiB,MAAM,MAAM;AACzC;","names":[]}
File diff suppressed because one or more lines are too long
@ -0,0 +1 @@
{"version":3,"sources":["../../src/sql-js/index.ts"],"sourcesContent":["export * from './driver.ts';\nexport * from './session.ts';\n"],"mappings":"AAAA,cAAc;AACd,cAAc;","names":[]}
@ -0,0 +1,82 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var common_exports = {};
__export(common_exports, {
  SingleStoreColumn: () => SingleStoreColumn,
  SingleStoreColumnBuilder: () => SingleStoreColumnBuilder,
  SingleStoreColumnBuilderWithAutoIncrement: () => SingleStoreColumnBuilderWithAutoIncrement,
  SingleStoreColumnWithAutoIncrement: () => SingleStoreColumnWithAutoIncrement
});
module.exports = __toCommonJS(common_exports);
var import_column_builder = require("../../column-builder.cjs");
var import_column = require("../../column.cjs");
var import_entity = require("../../entity.cjs");
var import_unique_constraint = require("../unique-constraint.cjs");
class SingleStoreColumnBuilder extends import_column_builder.ColumnBuilder {
  static [import_entity.entityKind] = "SingleStoreColumnBuilder";
  unique(name) {
    this.config.isUnique = true;
    this.config.uniqueName = name;
    return this;
  }
  // TODO: Implement generated columns for SingleStore (https://docs.singlestore.com/cloud/create-a-database/using-persistent-computed-columns/)
  /** @internal */
  generatedAlwaysAs(as, config) {
    this.config.generated = {
      as,
      type: "always",
      mode: config?.mode ?? "virtual"
    };
    return this;
  }
}
class SingleStoreColumn extends import_column.Column {
  constructor(table, config) {
    if (!config.uniqueName) {
      config.uniqueName = (0, import_unique_constraint.uniqueKeyName)(table, [config.name]);
    }
    super(table, config);
    this.table = table;
  }
  static [import_entity.entityKind] = "SingleStoreColumn";
}
class SingleStoreColumnBuilderWithAutoIncrement extends SingleStoreColumnBuilder {
  static [import_entity.entityKind] = "SingleStoreColumnBuilderWithAutoIncrement";
  constructor(name, dataType, columnType) {
    super(name, dataType, columnType);
    this.config.autoIncrement = false;
  }
  autoincrement() {
    this.config.autoIncrement = true;
    this.config.hasDefault = true;
    return this;
  }
}
class SingleStoreColumnWithAutoIncrement extends SingleStoreColumn {
  static [import_entity.entityKind] = "SingleStoreColumnWithAutoIncrement";
  autoIncrement = this.config.autoIncrement;
}
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  SingleStoreColumn,
  SingleStoreColumnBuilder,
  SingleStoreColumnBuilderWithAutoIncrement,
  SingleStoreColumnWithAutoIncrement
});
//# sourceMappingURL=common.cjs.map
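The `__export` helper in the bundler preamble above installs enumerable getters rather than copying values, which keeps the exported bindings "live". A small standalone demonstration of that mechanism:

```javascript
// Same shape as the esbuild helper above: define a getter per export name.
var __defProp = Object.defineProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};

let counter = 0;
const exportsObj = {};
// The export is bound to the *function* () => counter, not to counter's value.
__export(exportsObj, { counter: () => counter });

counter = 41;
counter += 1;
const read = exportsObj.counter; // reads the current value through the getter
```

Because the property is a getter, later reassignments of the module-local variable are observed by importers, matching ESM live-binding semantics on top of CommonJS.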
@ -0,0 +1,17 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.first = void 0;
var EmptyError_1 = require("../util/EmptyError");
var filter_1 = require("./filter");
var take_1 = require("./take");
var defaultIfEmpty_1 = require("./defaultIfEmpty");
var throwIfEmpty_1 = require("./throwIfEmpty");
var identity_1 = require("../util/identity");
function first(predicate, defaultValue) {
    var hasDefaultValue = arguments.length >= 2;
    return function (source) {
        return source.pipe(predicate ? filter_1.filter(function (v, i) { return predicate(v, i, source); }) : identity_1.identity, take_1.take(1), hasDefaultValue ? defaultIfEmpty_1.defaultIfEmpty(defaultValue) : throwIfEmpty_1.throwIfEmpty(function () { return new EmptyError_1.EmptyError(); }));
    };
}
exports.first = first;
//# sourceMappingURL=first.js.map
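The compiled `first` above composes `filter`, `take(1)`, and either `defaultIfEmpty` or `throwIfEmpty` depending on whether a default was passed. The same semantics can be sketched over a plain array (the `firstValue` name and rest-argument trick here are illustrative, not the rxjs API):

```javascript
// Thrown when no value matches and no default was supplied, mirroring rxjs.
class EmptyError extends Error {}

function firstValue(values, predicate, ...rest) {
  // Like arguments.length >= 2 in the compiled code: a default counts even
  // if it is undefined, as long as the caller actually passed one.
  const hasDefaultValue = rest.length >= 1;
  for (let i = 0; i < values.length; i++) {
    if (!predicate || predicate(values[i], i)) return values[i];
  }
  if (hasDefaultValue) return rest[0];
  throw new EmptyError('no elements in sequence');
}

const a = firstValue([1, 2, 3], (v) => v > 1);     // 2
const b = firstValue([1, 2, 3], (v) => v > 9, -1); // -1 (default used)
```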
@ -0,0 +1 @@
{"version":3,"file":"bufferWhen.js","sourceRoot":"","sources":["../../../../src/internal/operators/bufferWhen.ts"],"names":[],"mappings":";;;AAEA,qCAAuC;AACvC,qCAAoC;AACpC,2DAAgE;AAChE,qDAAoD;AAwCpD,SAAgB,UAAU,CAAI,eAA2C;IACvE,OAAO,cAAO,CAAC,UAAC,MAAM,EAAE,UAAU;QAEhC,IAAI,MAAM,GAAe,IAAI,CAAC;QAI9B,IAAI,iBAAiB,GAAyB,IAAI,CAAC;QAMnD,IAAM,UAAU,GAAG;YAGjB,iBAAiB,aAAjB,iBAAiB,uBAAjB,iBAAiB,CAAE,WAAW,EAAE,CAAC;YAEjC,IAAM,CAAC,GAAG,MAAM,CAAC;YACjB,MAAM,GAAG,EAAE,CAAC;YACZ,CAAC,IAAI,UAAU,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;YAGxB,qBAAS,CAAC,eAAe,EAAE,CAAC,CAAC,SAAS,CAAC,CAAC,iBAAiB,GAAG,6CAAwB,CAAC,UAAU,EAAE,UAAU,EAAE,WAAI,CAAC,CAAC,CAAC,CAAC;QACvH,CAAC,CAAC;QAGF,UAAU,EAAE,CAAC;QAGb,MAAM,CAAC,SAAS,CACd,6CAAwB,CACtB,UAAU,EAEV,UAAC,KAAK,IAAK,OAAA,MAAM,aAAN,MAAM,uBAAN,MAAM,CAAE,IAAI,CAAC,KAAK,CAAC,EAAnB,CAAmB,EAG9B;YACE,MAAM,IAAI,UAAU,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;YAClC,UAAU,CAAC,QAAQ,EAAE,CAAC;QACxB,CAAC,EAED,SAAS,EAET,cAAM,OAAA,CAAC,MAAM,GAAG,iBAAiB,GAAG,IAAK,CAAC,EAApC,CAAoC,CAC3C,CACF,CAAC;IACJ,CAAC,CAAC,CAAC;AACL,CAAC;AAhDD,gCAgDC"}
@ -0,0 +1,3 @@
import { SchedulerLike } from '../types';
export declare function schedulePromise<T>(input: PromiseLike<T>, scheduler: SchedulerLike): import("../Observable").Observable<T>;
//# sourceMappingURL=schedulePromise.d.ts.map
@ -0,0 +1 @@
{"version":3,"file":"repeat.js","sourceRoot":"","sources":["../../../../src/internal/operators/repeat.ts"],"names":[],"mappings":"AACA,OAAO,EAAE,KAAK,EAAE,MAAM,qBAAqB,CAAC;AAC5C,OAAO,EAAE,OAAO,EAAE,MAAM,cAAc,CAAC;AAEvC,OAAO,EAAE,wBAAwB,EAAE,MAAM,sBAAsB,CAAC;AAChE,OAAO,EAAE,SAAS,EAAE,MAAM,yBAAyB,CAAC;AACpD,OAAO,EAAE,KAAK,EAAE,MAAM,qBAAqB,CAAC;AA6G5C,MAAM,UAAU,MAAM,CAAI,aAAqC;IAC7D,IAAI,KAAK,GAAG,QAAQ,CAAC;IACrB,IAAI,KAA4B,CAAC;IAEjC,IAAI,aAAa,IAAI,IAAI,EAAE;QACzB,IAAI,OAAO,aAAa,KAAK,QAAQ,EAAE;YACrC,CAAC,EAAE,KAAK,GAAG,QAAQ,EAAE,KAAK,EAAE,GAAG,aAAa,CAAC,CAAC;SAC/C;aAAM;YACL,KAAK,GAAG,aAAa,CAAC;SACvB;KACF;IAED,OAAO,KAAK,IAAI,CAAC;QACf,CAAC,CAAC,GAAG,EAAE,CAAC,KAAK;QACb,CAAC,CAAC,OAAO,CAAC,CAAC,MAAM,EAAE,UAAU,EAAE,EAAE;YAC7B,IAAI,KAAK,GAAG,CAAC,CAAC;YACd,IAAI,SAA8B,CAAC;YAEnC,MAAM,WAAW,GAAG,GAAG,EAAE;gBACvB,SAAS,aAAT,SAAS,uBAAT,SAAS,CAAE,WAAW,EAAE,CAAC;gBACzB,SAAS,GAAG,IAAI,CAAC;gBACjB,IAAI,KAAK,IAAI,IAAI,EAAE;oBACjB,MAAM,QAAQ,GAAG,OAAO,KAAK,KAAK,QAAQ,CAAC,CAAC,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,SAAS,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,CAAC;oBACpF,MAAM,kBAAkB,GAAG,wBAAwB,CAAC,UAAU,EAAE,GAAG,EAAE;wBACnE,kBAAkB,CAAC,WAAW,EAAE,CAAC;wBACjC,iBAAiB,EAAE,CAAC;oBACtB,CAAC,CAAC,CAAC;oBACH,QAAQ,CAAC,SAAS,CAAC,kBAAkB,CAAC,CAAC;iBACxC;qBAAM;oBACL,iBAAiB,EAAE,CAAC;iBACrB;YACH,CAAC,CAAC;YAEF,MAAM,iBAAiB,GAAG,GAAG,EAAE;gBAC7B,IAAI,SAAS,GAAG,KAAK,CAAC;gBACtB,SAAS,GAAG,MAAM,CAAC,SAAS,CAC1B,wBAAwB,CAAC,UAAU,EAAE,SAAS,EAAE,GAAG,EAAE;oBACnD,IAAI,EAAE,KAAK,GAAG,KAAK,EAAE;wBACnB,IAAI,SAAS,EAAE;4BACb,WAAW,EAAE,CAAC;yBACf;6BAAM;4BACL,SAAS,GAAG,IAAI,CAAC;yBAClB;qBACF;yBAAM;wBACL,UAAU,CAAC,QAAQ,EAAE,CAAC;qBACvB;gBACH,CAAC,CAAC,CACH,CAAC;gBAEF,IAAI,SAAS,EAAE;oBACb,WAAW,EAAE,CAAC;iBACf;YACH,CAAC,CAAC;YAEF,iBAAiB,EAAE,CAAC;QACtB,CAAC,CAAC,CAAC;AACT,CAAC"}
@ -0,0 +1 @@
"use strict";var i=Object.defineProperty;var a=(r,t)=>i(r,"name",{value:t,configurable:!0});var n=require("node:repl"),u=require("esbuild");const f=a(r=>{const{eval:t}=r,c=a(async function(e,l,s,o){try{e=(await u.transform(e,{sourcefile:s,loader:"ts",tsconfigRaw:{compilerOptions:{preserveValueImports:!0}},define:{require:"global.require"}})).code}catch{}return t.call(this,e,l,s,o)},"preEval");r.eval=c},"patchEval"),{start:p}=n;n.start=function(){const r=Reflect.apply(p,this,arguments);return f(r),r};
@ -0,0 +1,3 @@
import { mergeMap } from './mergeMap';
export var flatMap = mergeMap;
//# sourceMappingURL=flatMap.js.map
@ -0,0 +1,13 @@
#!/usr/bin/env bash
set -e
set -x

rm -rf "/Applications/Google Chrome Beta.app"
cd /tmp
curl --retry 3 -o ./googlechromebeta.dmg https://dl.google.com/chrome/mac/universal/beta/googlechromebeta.dmg
hdiutil attach -nobrowse -quiet -noautofsck -noautoopen -mountpoint /Volumes/googlechromebeta.dmg ./googlechromebeta.dmg
cp -pR "/Volumes/googlechromebeta.dmg/Google Chrome Beta.app" /Applications
hdiutil detach /Volumes/googlechromebeta.dmg
rm -rf /tmp/googlechromebeta.dmg

/Applications/Google\ Chrome\ Beta.app/Contents/MacOS/Google\ Chrome\ Beta --version
@ -0,0 +1,68 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var driver_exports = {};
__export(driver_exports, {
  OPSQLiteDatabase: () => OPSQLiteDatabase,
  drizzle: () => drizzle
});
module.exports = __toCommonJS(driver_exports);
var import_entity = require("../entity.cjs");
var import_logger = require("../logger.cjs");
var import_relations = require("../relations.cjs");
var import_db = require("../sqlite-core/db.cjs");
var import_dialect = require("../sqlite-core/dialect.cjs");
var import_session = require("./session.cjs");
class OPSQLiteDatabase extends import_db.BaseSQLiteDatabase {
  static [import_entity.entityKind] = "OPSQLiteDatabase";
}
function drizzle(client, config = {}) {
  const dialect = new import_dialect.SQLiteAsyncDialect({ casing: config.casing });
  let logger;
  if (config.logger === true) {
    logger = new import_logger.DefaultLogger();
  } else if (config.logger !== false) {
    logger = config.logger;
  }
  let schema;
  if (config.schema) {
    const tablesConfig = (0, import_relations.extractTablesRelationalConfig)(
      config.schema,
      import_relations.createTableRelationsHelpers
    );
    schema = {
      fullSchema: config.schema,
      schema: tablesConfig.tables,
      tableNamesMap: tablesConfig.tableNamesMap
    };
  }
  const session = new import_session.OPSQLiteSession(client, dialect, schema, { logger, cache: config.cache });
  const db = new OPSQLiteDatabase("async", dialect, session, schema);
  db.$client = client;
  db.$cache = config.cache;
  if (db.$cache) {
    db.$cache["invalidate"] = config.cache?.onMutate;
  }
  return db;
}
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  OPSQLiteDatabase,
  drizzle
});
//# sourceMappingURL=driver.cjs.map
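The `drizzle` factory above selects a logger with a three-way check on `config.logger`: `true` means the default logger, `false` means none, and anything else is taken as a custom logger. Isolated as a sketch with a stub `DefaultLogger` (illustrative, not the drizzle-orm API):

```javascript
// Stub standing in for the library's DefaultLogger class.
class DefaultLogger {}

function resolveLogger(configLogger) {
  let logger;
  if (configLogger === true) {
    logger = new DefaultLogger();
  } else if (configLogger !== false) {
    // Custom logger object, or undefined when the option was omitted.
    logger = configLogger;
  }
  return logger;
}

const d = resolveLogger(true);               // default logger instance
const none = resolveLogger(false);           // undefined: logging disabled
const custom = resolveLogger({ logQuery() {} }); // passed through as-is
```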
@ -0,0 +1,78 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var debugControllerDispatcher_exports = {};
__export(debugControllerDispatcher_exports, {
  DebugControllerDispatcher: () => DebugControllerDispatcher
});
module.exports = __toCommonJS(debugControllerDispatcher_exports);
var import_utils = require("../../utils");
var import_debugController = require("../debugController");
var import_dispatcher = require("./dispatcher");
class DebugControllerDispatcher extends import_dispatcher.Dispatcher {
  constructor(connection, debugController) {
    super(connection, debugController, "DebugController", {});
    this._type_DebugController = true;
    this._listeners = [
      import_utils.eventsHelper.addEventListener(this._object, import_debugController.DebugController.Events.StateChanged, (params) => {
        this._dispatchEvent("stateChanged", params);
      }),
      import_utils.eventsHelper.addEventListener(this._object, import_debugController.DebugController.Events.InspectRequested, ({ selector, locator, ariaSnapshot }) => {
        this._dispatchEvent("inspectRequested", { selector, locator, ariaSnapshot });
      }),
      import_utils.eventsHelper.addEventListener(this._object, import_debugController.DebugController.Events.SourceChanged, ({ text, header, footer, actions }) => {
        this._dispatchEvent("sourceChanged", { text, header, footer, actions });
      }),
      import_utils.eventsHelper.addEventListener(this._object, import_debugController.DebugController.Events.Paused, ({ paused }) => {
        this._dispatchEvent("paused", { paused });
      }),
      import_utils.eventsHelper.addEventListener(this._object, import_debugController.DebugController.Events.SetModeRequested, ({ mode }) => {
        this._dispatchEvent("setModeRequested", { mode });
      })
    ];
  }
  async initialize(params, progress) {
    this._object.initialize(params.codegenId, params.sdkLanguage);
  }
  async setReportStateChanged(params, progress) {
    this._object.setReportStateChanged(params.enabled);
  }
  async setRecorderMode(params, progress) {
    await this._object.setRecorderMode(progress, params);
  }
  async highlight(params, progress) {
    await this._object.highlight(progress, params);
  }
  async hideHighlight(params, progress) {
    await this._object.hideHighlight(progress);
  }
  async resume(params, progress) {
    await this._object.resume(progress);
  }
  async kill(params, progress) {
    this._object.kill();
  }
  _onDispose() {
    import_utils.eventsHelper.removeEventListeners(this._listeners);
    this._object.dispose();
  }
}
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  DebugControllerDispatcher
});
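The dispatcher above keeps every event registration in `this._listeners` so `_onDispose` can detach them all in one pass. A minimal sketch of that bookkeeping pattern, using a hypothetical emitter shape rather than Playwright's `eventsHelper`:

```javascript
// Register a handler and return a token that can later be used to remove it.
function addEventListener(emitter, event, handler) {
  (emitter.handlers[event] ??= []).push(handler);
  return { emitter, event, handler };
}

// Detach every registration collected during setup, as _onDispose does.
function removeEventListeners(registrations) {
  for (const { emitter, event, handler } of registrations) {
    const list = emitter.handlers[event] || [];
    const i = list.indexOf(handler);
    if (i !== -1) list.splice(i, 1);
  }
}

const emitter = { handlers: {} };
const events = [];
const listeners = [
  addEventListener(emitter, 'stateChanged', (p) => events.push(p)),
  addEventListener(emitter, 'paused', (p) => events.push(p)),
];

(emitter.handlers['stateChanged'] || []).forEach((h) => h('s1'));
removeEventListeners(listeners);
const remaining = (emitter.handlers['stateChanged'] || []).length; // 0
```

Collecting the tokens at registration time avoids having to remember each event name and handler pair at teardown.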
File diff suppressed because one or more lines are too long
@ -0,0 +1,54 @@
import { Observable } from '../Observable';
import { UnaryFunction } from '../types';
/**
 * Splits the source Observable into two, one with values that satisfy a
 * predicate, and another with values that don't satisfy the predicate.
 *
 * <span class="informal">It's like {@link filter}, but returns two Observables:
 * one like the output of {@link filter}, and the other with values that did not
 * pass the condition.</span>
 *
 * ![](partition.png)
 *
 * `partition` outputs an array with two Observables that partition the values
 * from the source Observable through the given `predicate` function. The first
 * Observable in that array emits source values for which the predicate argument
 * returns true. The second Observable emits source values for which the
 * predicate returns false. The first behaves like {@link filter} and the second
 * behaves like {@link filter} with the predicate negated.
 *
 * ## Example
 *
 * Partition click events into those on DIV elements and those elsewhere
 *
 * ```ts
 * import { fromEvent } from 'rxjs';
 * import { partition } from 'rxjs/operators';
 *
 * const div = document.createElement('div');
 * div.style.cssText = 'width: 200px; height: 200px; background: #09c;';
 * document.body.appendChild(div);
 *
 * const clicks = fromEvent(document, 'click');
 * const [clicksOnDivs, clicksElsewhere] = clicks.pipe(partition(ev => (<HTMLElement>ev.target).tagName === 'DIV'));
 *
 * clicksOnDivs.subscribe(x => console.log('DIV clicked: ', x));
 * clicksElsewhere.subscribe(x => console.log('Other clicked: ', x));
 * ```
 *
 * @see {@link filter}
 *
 * @param predicate A function that evaluates each value emitted by the source
 * Observable. If it returns `true`, the value is emitted on the first Observable
 * in the returned array, if `false` the value is emitted on the second Observable
 * in the array. The `index` parameter is the number `i` for the i-th source
 * emission that has happened since the subscription, starting from the number `0`.
 * @param thisArg An optional argument to determine the value of `this` in the
 * `predicate` function.
 * @return A function that returns an array with two Observables: one with
 * values that passed the predicate, and another with values that did not pass
 * the predicate.
 * @deprecated Replaced with the {@link partition} static creation function. Will be removed in v8.
 */
export declare function partition<T>(predicate: (value: T, index: number) => boolean, thisArg?: any): UnaryFunction<Observable<T>, [Observable<T>, Observable<T>]>;
//# sourceMappingURL=partition.d.ts.map
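The doc comment above describes `partition` splitting one stream into a pass output and a fail output. A synchronous array sketch of the same semantics (not the rxjs operator itself, which works on Observables):

```javascript
// Split values into [pass, fail] according to predicate(value, index).
function partition(values, predicate) {
  const pass = [];
  const fail = [];
  values.forEach((value, index) => {
    (predicate(value, index) ? pass : fail).push(value);
  });
  return [pass, fail];
}

const [evens, odds] = partition([1, 2, 3, 4, 5], (n) => n % 2 === 0);
```

As in the Observable version, every source value lands in exactly one of the two outputs, and order within each output is preserved.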
@ -0,0 +1,35 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __reExport = (target, mod, secondTarget) => (__copyProps(target, mod, "default"), secondTarget && __copyProps(secondTarget, mod, "default"));
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var query_builders_exports = {};
module.exports = __toCommonJS(query_builders_exports);
__reExport(query_builders_exports, require("./delete.cjs"), module.exports);
__reExport(query_builders_exports, require("./insert.cjs"), module.exports);
__reExport(query_builders_exports, require("./query-builder.cjs"), module.exports);
__reExport(query_builders_exports, require("./refresh-materialized-view.cjs"), module.exports);
__reExport(query_builders_exports, require("./select.cjs"), module.exports);
__reExport(query_builders_exports, require("./select.types.cjs"), module.exports);
__reExport(query_builders_exports, require("./update.cjs"), module.exports);
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  ...require("./delete.cjs"),
  ...require("./insert.cjs"),
  ...require("./query-builder.cjs"),
  ...require("./refresh-materialized-view.cjs"),
  ...require("./select.cjs"),
  ...require("./select.types.cjs"),
  ...require("./update.cjs")
});
//# sourceMappingURL=index.cjs.map
@ -0,0 +1,27 @@
import { EmptyError } from './util/EmptyError';
export function lastValueFrom(source, config) {
    var hasConfig = typeof config === 'object';
    return new Promise(function (resolve, reject) {
        var _hasValue = false;
        var _value;
        source.subscribe({
            next: function (value) {
                _value = value;
                _hasValue = true;
            },
            error: reject,
            complete: function () {
                if (_hasValue) {
                    resolve(_value);
                }
                else if (hasConfig) {
                    resolve(config.defaultValue);
                }
                else {
                    reject(new EmptyError());
                }
            },
        });
    });
}
//# sourceMappingURL=lastValueFrom.js.map
@ -0,0 +1,10 @@
import type { SQLiteDatabase, SQLiteRunResult } from 'expo-sqlite';
import { entityKind } from "../entity.cjs";
import { BaseSQLiteDatabase } from "../sqlite-core/db.cjs";
import type { DrizzleConfig } from "../utils.cjs";
export declare class ExpoSQLiteDatabase<TSchema extends Record<string, unknown> = Record<string, never>> extends BaseSQLiteDatabase<'sync', SQLiteRunResult, TSchema> {
    static readonly [entityKind]: string;
}
export declare function drizzle<TSchema extends Record<string, unknown> = Record<string, never>>(client: SQLiteDatabase, config?: DrizzleConfig<TSchema>): ExpoSQLiteDatabase<TSchema> & {
    $client: SQLiteDatabase;
};
@@ -0,0 +1,106 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var array_exports = {};
__export(array_exports, {
  makePgArray: () => makePgArray,
  parsePgArray: () => parsePgArray,
  parsePgNestedArray: () => parsePgNestedArray
});
module.exports = __toCommonJS(array_exports);
function parsePgArrayValue(arrayString, startFrom, inQuotes) {
  for (let i = startFrom; i < arrayString.length; i++) {
    const char = arrayString[i];
    if (char === "\\") {
      i++;
      continue;
    }
    if (char === '"') {
      return [arrayString.slice(startFrom, i).replace(/\\/g, ""), i + 1];
    }
    if (inQuotes) {
      continue;
    }
    if (char === "," || char === "}") {
      return [arrayString.slice(startFrom, i).replace(/\\/g, ""), i];
    }
  }
  return [arrayString.slice(startFrom).replace(/\\/g, ""), arrayString.length];
}
function parsePgNestedArray(arrayString, startFrom = 0) {
  const result = [];
  let i = startFrom;
  let lastCharIsComma = false;
  while (i < arrayString.length) {
    const char = arrayString[i];
    if (char === ",") {
      if (lastCharIsComma || i === startFrom) {
        result.push("");
      }
      lastCharIsComma = true;
      i++;
      continue;
    }
    lastCharIsComma = false;
    if (char === "\\") {
      i += 2;
      continue;
    }
    if (char === '"') {
      const [value2, startFrom2] = parsePgArrayValue(arrayString, i + 1, true);
      result.push(value2);
      i = startFrom2;
      continue;
    }
    if (char === "}") {
      return [result, i + 1];
    }
    if (char === "{") {
      const [value2, startFrom2] = parsePgNestedArray(arrayString, i + 1);
      result.push(value2);
      i = startFrom2;
      continue;
    }
    const [value, newStartFrom] = parsePgArrayValue(arrayString, i, false);
    result.push(value);
    i = newStartFrom;
  }
  return [result, i];
}
function parsePgArray(arrayString) {
  const [result] = parsePgNestedArray(arrayString, 1);
  return result;
}
function makePgArray(array) {
  return `{${array.map((item) => {
    if (Array.isArray(item)) {
      return makePgArray(item);
    }
    if (typeof item === "string") {
      return `"${item.replace(/\\/g, "\\\\").replace(/"/g, '\\"')}"`;
    }
    return `${item}`;
  }).join(",")}}`;
}
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  makePgArray,
  parsePgArray,
  parsePgNestedArray
});
//# sourceMappingURL=array.cjs.map
@@ -0,0 +1,195 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
var cache_exports = {};
__export(cache_exports, {
  UpstashCache: () => UpstashCache,
  upstashCache: () => upstashCache
});
module.exports = __toCommonJS(cache_exports);
var import_redis = require("@upstash/redis");
var import_core = require("../core/index.cjs");
var import_entity = require("../../entity.cjs");
var import__ = require("../../index.cjs");
const getByTagScript = `
local tagsMapKey = KEYS[1] -- tags map key
local tag = ARGV[1] -- tag

local compositeTableName = redis.call('HGET', tagsMapKey, tag)
if not compositeTableName then
  return nil
end

local value = redis.call('HGET', compositeTableName, tag)
return value
`;
const onMutateScript = `
local tagsMapKey = KEYS[1] -- tags map key
local tables = {} -- initialize tables array
local tags = ARGV -- tags array

for i = 2, #KEYS do
  tables[#tables + 1] = KEYS[i] -- add all keys except the first one to tables
end

if #tags > 0 then
  for _, tag in ipairs(tags) do
    if tag ~= nil and tag ~= '' then
      local compositeTableName = redis.call('HGET', tagsMapKey, tag)
      if compositeTableName then
        redis.call('HDEL', compositeTableName, tag)
      end
    end
  end
  redis.call('HDEL', tagsMapKey, unpack(tags))
end

local keysToDelete = {}

if #tables > 0 then
  local compositeTableNames = redis.call('SUNION', unpack(tables))
  for _, compositeTableName in ipairs(compositeTableNames) do
    keysToDelete[#keysToDelete + 1] = compositeTableName
  end
  for _, table in ipairs(tables) do
    keysToDelete[#keysToDelete + 1] = table
  end
  redis.call('DEL', unpack(keysToDelete))
end
`;
class UpstashCache extends import_core.Cache {
  constructor(redis, config, useGlobally) {
    super();
    this.redis = redis;
    this.useGlobally = useGlobally;
    this.internalConfig = this.toInternalConfig(config);
    this.luaScripts = {
      getByTagScript: this.redis.createScript(getByTagScript, { readonly: true }),
      onMutateScript: this.redis.createScript(onMutateScript)
    };
  }
  static [import_entity.entityKind] = "UpstashCache";
  /**
   * Prefix for sets which denote the composite table names for each unique table
   *
   * Example: In the composite table set of "table1", you may find
   * `${compositeTablePrefix}table1,table2` and `${compositeTablePrefix}table1,table3`
   */
  static compositeTableSetPrefix = "__CTS__";
  /**
   * Prefix for hashes which map hash or tags to cache values
   */
  static compositeTablePrefix = "__CT__";
  /**
   * Key which holds the mapping of tags to composite table names
   *
   * Using this tagsMapKey, you can find the composite table name for a given tag
   * and get the cache value for that tag:
   *
   * ```ts
   * const compositeTable = redis.hget(tagsMapKey, 'tag1')
   * console.log(compositeTable) // `${compositeTablePrefix}table1,table2`
   *
   * const cachevalue = redis.hget(compositeTable, 'tag1')
   * ```
   */
  static tagsMapKey = "__tagsMap__";
  /**
   * Queries whose auto invalidation is false aren't stored in their respective
   * composite table hashes because those hashes are deleted when a mutation
   * occurs on related tables.
   *
   * Instead, they are stored in a separate hash with the prefix
   * `__nonAutoInvalidate__` to prevent them from being deleted when a mutation occurs.
   */
  static nonAutoInvalidateTablePrefix = "__nonAutoInvalidate__";
  luaScripts;
  internalConfig;
  strategy() {
    return this.useGlobally ? "all" : "explicit";
  }
  toInternalConfig(config) {
    return config ? {
      seconds: config.ex,
      hexOptions: config.hexOptions
    } : {
      seconds: 1
    };
  }
  async get(key, tables, isTag = false, isAutoInvalidate) {
    if (!isAutoInvalidate) {
      const result2 = await this.redis.hget(UpstashCache.nonAutoInvalidateTablePrefix, key);
      return result2 === null ? void 0 : result2;
    }
    if (isTag) {
      const result2 = await this.luaScripts.getByTagScript.exec([UpstashCache.tagsMapKey], [key]);
      return result2 === null ? void 0 : result2;
    }
    const compositeKey = this.getCompositeKey(tables);
    const result = (await this.redis.hget(compositeKey, key)) ?? void 0;
    return result === null ? void 0 : result;
  }
  async put(key, response, tables, isTag = false, config) {
    const isAutoInvalidate = tables.length !== 0;
    const pipeline = this.redis.pipeline();
    const ttlSeconds = config && config.ex ? config.ex : this.internalConfig.seconds;
    const hexOptions = config && config.hexOptions ? config.hexOptions : this.internalConfig?.hexOptions;
    if (!isAutoInvalidate) {
      if (isTag) {
        pipeline.hset(UpstashCache.tagsMapKey, { [key]: UpstashCache.nonAutoInvalidateTablePrefix });
        pipeline.hexpire(UpstashCache.tagsMapKey, key, ttlSeconds, hexOptions);
      }
      pipeline.hset(UpstashCache.nonAutoInvalidateTablePrefix, { [key]: response });
      pipeline.hexpire(UpstashCache.nonAutoInvalidateTablePrefix, key, ttlSeconds, hexOptions);
      await pipeline.exec();
      return;
    }
    const compositeKey = this.getCompositeKey(tables);
    pipeline.hset(compositeKey, { [key]: response });
    pipeline.hexpire(compositeKey, key, ttlSeconds, hexOptions);
    if (isTag) {
      pipeline.hset(UpstashCache.tagsMapKey, { [key]: compositeKey });
      pipeline.hexpire(UpstashCache.tagsMapKey, key, ttlSeconds, hexOptions);
    }
    for (const table of tables) {
      pipeline.sadd(this.addTablePrefix(table), compositeKey);
    }
    await pipeline.exec();
  }
  async onMutate(params) {
    const tags = Array.isArray(params.tags) ? params.tags : params.tags ? [params.tags] : [];
    const tables = Array.isArray(params.tables) ? params.tables : params.tables ? [params.tables] : [];
    const tableNames = tables.map((table) => (0, import_entity.is)(table, import__.Table) ? table[import__.OriginalName] : table);
    const compositeTableSets = tableNames.map((table) => this.addTablePrefix(table));
    await this.luaScripts.onMutateScript.exec([UpstashCache.tagsMapKey, ...compositeTableSets], tags);
  }
  addTablePrefix = (table) => `${UpstashCache.compositeTableSetPrefix}${table}`;
  getCompositeKey = (tables) => `${UpstashCache.compositeTablePrefix}${tables.sort().join(",")}`;
}
function upstashCache({ url, token, config, global = false }) {
  const redis = new import_redis.Redis({
    url,
    token
  });
  return new UpstashCache(redis, config, global);
}
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  UpstashCache,
  upstashCache
});
//# sourceMappingURL=cache.cjs.map
@@ -0,0 +1 @@
{"version":3,"file":"onErrorResumeNext.d.ts","sourceRoot":"","sources":["../../../../src/internal/observable/onErrorResumeNext.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,UAAU,EAAE,MAAM,eAAe,CAAC;AAC3C,OAAO,EAAE,oBAAoB,EAAE,MAAM,UAAU,CAAC;AAMhD,wBAAgB,iBAAiB,CAAC,CAAC,SAAS,SAAS,OAAO,EAAE,EAAE,OAAO,EAAE,CAAC,GAAG,oBAAoB,CAAC,CAAC,CAAC,CAAC,GAAG,UAAU,CAAC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC;AAC9H,wBAAgB,iBAAiB,CAAC,CAAC,SAAS,SAAS,OAAO,EAAE,EAAE,GAAG,OAAO,EAAE,CAAC,GAAG,oBAAoB,CAAC,CAAC,CAAC,CAAC,GAAG,UAAU,CAAC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC"}
@@ -0,0 +1 @@
{"version":3,"file":"exhaustMap.d.ts","sourceRoot":"","sources":["../../../../src/internal/operators/exhaustMap.ts"],"names":[],"mappings":"AAEA,OAAO,EAAE,eAAe,EAAE,gBAAgB,EAAE,eAAe,EAAE,MAAM,UAAU,CAAC;AAO9E,wBAAgB,UAAU,CAAC,CAAC,EAAE,CAAC,SAAS,eAAe,CAAC,GAAG,CAAC,EAC1D,OAAO,EAAE,CAAC,KAAK,EAAE,CAAC,EAAE,KAAK,EAAE,MAAM,KAAK,CAAC,GACtC,gBAAgB,CAAC,CAAC,EAAE,eAAe,CAAC,CAAC,CAAC,CAAC,CAAC;AAC3C,0JAA0J;AAC1J,wBAAgB,UAAU,CAAC,CAAC,EAAE,CAAC,SAAS,eAAe,CAAC,GAAG,CAAC,EAC1D,OAAO,EAAE,CAAC,KAAK,EAAE,CAAC,EAAE,KAAK,EAAE,MAAM,KAAK,CAAC,EACvC,cAAc,EAAE,SAAS,GACxB,gBAAgB,CAAC,CAAC,EAAE,eAAe,CAAC,CAAC,CAAC,CAAC,CAAC;AAC3C,0JAA0J;AAC1J,wBAAgB,UAAU,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,EAChC,OAAO,EAAE,CAAC,KAAK,EAAE,CAAC,EAAE,KAAK,EAAE,MAAM,KAAK,eAAe,CAAC,CAAC,CAAC,EACxD,cAAc,EAAE,CAAC,UAAU,EAAE,CAAC,EAAE,UAAU,EAAE,CAAC,EAAE,UAAU,EAAE,MAAM,EAAE,UAAU,EAAE,MAAM,KAAK,CAAC,GAC1F,gBAAgB,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC"}
@@ -0,0 +1,121 @@
import { Observable } from '../Observable';
import { EmptyError } from '../util/EmptyError';

import { MonoTypeOperatorFunction, OperatorFunction, TruthyTypesOf } from '../types';
import { SequenceError } from '../util/SequenceError';
import { NotFoundError } from '../util/NotFoundError';
import { operate } from '../util/lift';
import { createOperatorSubscriber } from './OperatorSubscriber';

export function single<T>(predicate: BooleanConstructor): OperatorFunction<T, TruthyTypesOf<T>>;
export function single<T>(predicate?: (value: T, index: number, source: Observable<T>) => boolean): MonoTypeOperatorFunction<T>;

/**
 * Returns an observable that asserts that only one value is
 * emitted from the observable that matches the predicate. If no
 * predicate is provided, then it will assert that the observable
 * only emits one value.
 *
 * If the source Observable did not emit `next` before completion, it
 * will emit an {@link EmptyError} to the Observer's `error` callback.
 *
 * In the event that two values are found that match the predicate,
 * or when there are two values emitted and no predicate, it will
 * emit a {@link SequenceError} to the Observer's `error` callback.
 *
 * In the event that no values match the predicate, if one is provided,
 * it will emit a {@link NotFoundError} to the Observer's `error` callback.
 *
 * ## Example
 *
 * Expect only `name` beginning with `'B'`
 *
 * ```ts
 * import { of, single } from 'rxjs';
 *
 * const source1 = of(
 *   { name: 'Ben' },
 *   { name: 'Tracy' },
 *   { name: 'Laney' },
 *   { name: 'Lily' }
 * );
 *
 * source1
 *   .pipe(single(x => x.name.startsWith('B')))
 *   .subscribe(x => console.log(x));
 * // Emits 'Ben'
 *
 *
 * const source2 = of(
 *   { name: 'Ben' },
 *   { name: 'Tracy' },
 *   { name: 'Bradley' },
 *   { name: 'Lincoln' }
 * );
 *
 * source2
 *   .pipe(single(x => x.name.startsWith('B')))
 *   .subscribe({ error: err => console.error(err) });
 * // Error emitted: SequenceError('Too many values match')
 *
 *
 * const source3 = of(
 *   { name: 'Laney' },
 *   { name: 'Tracy' },
 *   { name: 'Lily' },
 *   { name: 'Lincoln' }
 * );
 *
 * source3
 *   .pipe(single(x => x.name.startsWith('B')))
 *   .subscribe({ error: err => console.error(err) });
 * // Error emitted: NotFoundError('No values match')
 * ```
 *
 * @see {@link first}
 * @see {@link find}
 * @see {@link findIndex}
 * @see {@link elementAt}
 *
 * @throws {NotFoundError} Delivers a `NotFoundError` to the Observer's `error`
 * callback if the Observable completes before any `next` notification was sent.
 * @throws {SequenceError} Delivers a `SequenceError` if more than one value is
 * emitted that matches the provided predicate. If no predicate is provided, it
 * will deliver a `SequenceError` if more than one value comes from the source.
 * @throws {EmptyError} Delivers an `EmptyError` if no values were `next`ed prior
 * to completion.
 *
 * @param predicate A predicate function to evaluate items emitted by the source
 * Observable.
 * @return A function that returns an Observable that emits the single item
 * emitted by the source Observable that matches the predicate.
 */
export function single<T>(predicate?: (value: T, index: number, source: Observable<T>) => boolean): MonoTypeOperatorFunction<T> {
  return operate((source, subscriber) => {
    let hasValue = false;
    let singleValue: T;
    let seenValue = false;
    let index = 0;
    source.subscribe(
      createOperatorSubscriber(
        subscriber,
        (value) => {
          seenValue = true;
          if (!predicate || predicate(value, index++, source)) {
            hasValue && subscriber.error(new SequenceError('Too many matching values'));
            hasValue = true;
            singleValue = value;
          }
        },
        () => {
          if (hasValue) {
            subscriber.next(singleValue);
            subscriber.complete();
          } else {
            subscriber.error(seenValue ? new NotFoundError('No matching values') : new EmptyError());
          }
        }
      )
    );
  });
}
@@ -0,0 +1 @@
{"version":3,"file":"zip.js","sourceRoot":"","sources":["../../../../src/internal/operators/zip.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;AAAA,yCAAqD;AAErD,qCAAuC;AAmBvC,SAAgB,GAAG;IAAO,iBAAwE;SAAxE,UAAwE,EAAxE,qBAAwE,EAAxE,IAAwE;QAAxE,4BAAwE;;IAChG,OAAO,cAAO,CAAC,UAAC,MAAM,EAAE,UAAU;QAChC,SAAS,8BAAC,MAA8B,UAAM,OAAuC,IAAE,SAAS,CAAC,UAAU,CAAC,CAAC;IAC/G,CAAC,CAAC,CAAC;AACL,CAAC;AAJD,kBAIC"}
@@ -0,0 +1,4 @@
'use strict';

/** @type {import('./reflectApply')} */
module.exports = typeof Reflect !== 'undefined' && Reflect && Reflect.apply;
@@ -0,0 +1,27 @@
import { entityKind } from "../../entity.js";
import { PgColumn, PgColumnBuilder } from "./common.js";
class PgBooleanBuilder extends PgColumnBuilder {
  static [entityKind] = "PgBooleanBuilder";
  constructor(name) {
    super(name, "boolean", "PgBoolean");
  }
  /** @internal */
  build(table) {
    return new PgBoolean(table, this.config);
  }
}
class PgBoolean extends PgColumn {
  static [entityKind] = "PgBoolean";
  getSQLType() {
    return "boolean";
  }
}
function boolean(name) {
  return new PgBooleanBuilder(name ?? "");
}
export {
  PgBoolean,
  PgBooleanBuilder,
  boolean
};
//# sourceMappingURL=boolean.js.map
@@ -0,0 +1 @@
{"version":3,"sources":["../../src/pg-proxy/driver.ts"],"sourcesContent":["import { entityKind } from '~/entity.ts';\nimport { DefaultLogger } from '~/logger.ts';\nimport { PgDatabase } from '~/pg-core/db.ts';\nimport { PgDialect } from '~/pg-core/dialect.ts';\nimport {\n\tcreateTableRelationsHelpers,\n\textractTablesRelationalConfig,\n\ttype RelationalSchemaConfig,\n\ttype TablesRelationalConfig,\n} from '~/relations.ts';\nimport type { DrizzleConfig } from '~/utils.ts';\nimport { type PgRemoteQueryResultHKT, PgRemoteSession } from './session.ts';\n\nexport class PgRemoteDatabase<\n\tTSchema extends Record<string, unknown> = Record<string, never>,\n> extends PgDatabase<PgRemoteQueryResultHKT, TSchema> {\n\tstatic override readonly [entityKind]: string = 'PgRemoteDatabase';\n}\n\nexport type RemoteCallback = (\n\tsql: string,\n\tparams: any[],\n\tmethod: 'all' | 'execute',\n\ttypings?: any[],\n) => Promise<{ rows: any[] }>;\n\nexport function drizzle<TSchema extends Record<string, unknown> = Record<string, never>>(\n\tcallback: RemoteCallback,\n\tconfig: DrizzleConfig<TSchema> = {},\n\t_dialect: () => PgDialect = () => new PgDialect({ casing: config.casing }),\n): PgRemoteDatabase<TSchema> {\n\tconst dialect = _dialect();\n\tlet logger;\n\tif (config.logger === true) {\n\t\tlogger = new DefaultLogger();\n\t} else if (config.logger !== false) {\n\t\tlogger = config.logger;\n\t}\n\n\tlet schema: RelationalSchemaConfig<TablesRelationalConfig> | undefined;\n\tif (config.schema) {\n\t\tconst tablesConfig = extractTablesRelationalConfig(\n\t\t\tconfig.schema,\n\t\t\tcreateTableRelationsHelpers,\n\t\t);\n\t\tschema = {\n\t\t\tfullSchema: config.schema,\n\t\t\tschema: tablesConfig.tables,\n\t\t\ttableNamesMap: tablesConfig.tableNamesMap,\n\t\t};\n\t}\n\n\tconst session = new PgRemoteSession(callback, dialect, schema, { logger, cache: config.cache });\n\tconst db = new PgRemoteDatabase(dialect, session, schema as any) as PgRemoteDatabase<TSchema>;\n\t(<any> db).$cache = config.cache;\n\tif ((<any> db).$cache) {\n\t\t(<any> db).$cache['invalidate'] = config.cache?.onMutate;\n\t}\n\n\treturn db;\n}\n"],"mappings":"AAAA,SAAS,kBAAkB;AAC3B,SAAS,qBAAqB;AAC9B,SAAS,kBAAkB;AAC3B,SAAS,iBAAiB;AAC1B;AAAA,EACC;AAAA,EACA;AAAA,OAGM;AAEP,SAAsC,uBAAuB;AAEtD,MAAM,yBAEH,WAA4C;AAAA,EACrD,QAA0B,UAAU,IAAY;AACjD;AASO,SAAS,QACf,UACA,SAAiC,CAAC,GAClC,WAA4B,MAAM,IAAI,UAAU,EAAE,QAAQ,OAAO,OAAO,CAAC,GAC7C;AAC5B,QAAM,UAAU,SAAS;AACzB,MAAI;AACJ,MAAI,OAAO,WAAW,MAAM;AAC3B,aAAS,IAAI,cAAc;AAAA,EAC5B,WAAW,OAAO,WAAW,OAAO;AACnC,aAAS,OAAO;AAAA,EACjB;AAEA,MAAI;AACJ,MAAI,OAAO,QAAQ;AAClB,UAAM,eAAe;AAAA,MACpB,OAAO;AAAA,MACP;AAAA,IACD;AACA,aAAS;AAAA,MACR,YAAY,OAAO;AAAA,MACnB,QAAQ,aAAa;AAAA,MACrB,eAAe,aAAa;AAAA,IAC7B;AAAA,EACD;AAEA,QAAM,UAAU,IAAI,gBAAgB,UAAU,SAAS,QAAQ,EAAE,QAAQ,OAAO,OAAO,MAAM,CAAC;AAC9F,QAAM,KAAK,IAAI,iBAAiB,SAAS,SAAS,MAAa;AAC/D,EAAO,GAAI,SAAS,OAAO;AAC3B,MAAW,GAAI,QAAQ;AACtB,IAAO,GAAI,OAAO,YAAY,IAAI,OAAO,OAAO;AAAA,EACjD;AAEA,SAAO;AACR;","names":[]}
Some files were not shown because too many files have changed in this diff.