
Commit 7a48803

Update README and bump to 1.6.7
1 parent 1297080 commit 7a48803

File tree: 2 files changed (+94 / -66 lines)


README.md

Lines changed: 91 additions & 64 deletions
@@ -9,15 +9,15 @@ Wrap an [async](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Referenc
1. Two usages
   1. D4C instance: synchronization mode & concurrency mode.
   2. Class, instance, and static method decorators on classes: synchronization mode & concurrency mode.
-2. This library implements a FIFO task queue for O(1) speed. Using built-in JavaScript array will have O(n) issue.
-3. Optional parameter, `inheritPreErr`. If current task is waiting for previous tasks, set it as `true` to inherit the error of the previous task and the task will not be executed and throw a custom error `new PreviousError(task.preError.message ?? task.preError)`. If this parameter is omitted or set as `false`, the task will continue whether previous tasks happen errors or not.
-4. Optional parameter, `noBlockCurr`. Set it as `true` to forcibly execute the current task in the another (microtask) execution of the event loop. This is useful if you pass a sync function as the first task but do not want it to block the current event loop.
+2. Wrap a function into a new queue-ready async function, which is convenient to reuse. Arguments can be passed to, and a return value obtained from, each task function.
+3. Support an `async function`, a `promise-returning` function, and a `sync` function.
+4. Sub-queue system (via tags).
5. Support Browser and Node.js.
6. Fully written in TypeScript; its `.d.ts` typings work out of the box. JavaScript is supported, too.
-7. Wrap a function to a new queue-ready async function. It is convenient to re-use this function. Also, it is able to pass arguments and get return value for each task function.
-8. Support `async function`, a `promise-returning` function, and a `sync` function.
-9. Sub queues system (via tags).
-10. Well tested.
+7. This library implements a FIFO task queue for O(1) speed; using a built-in JavaScript array would have an O(n) issue.
+8. Well tested.
+9. Optional parameter `inheritPreErr`. If the current task is waiting for previous tasks, set it to `true` to inherit the previous task's error: the task will not be executed and will instead throw a custom error `new PreviousError(task.preError.message ?? task.preError)`. If this parameter is omitted or set to `false`, the task runs regardless of whether previous tasks threw errors.
+10. Optional parameter `noBlockCurr`. Set it to `true` to force the current task to execute in another (microtask) turn of the event loop. This is useful if you pass a sync function as the first task but do not want it to block the current event loop. A short sketch of these options follows this list.
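As an aside (not part of this commit's diff), here is a minimal sketch of how the sub-queue tags and the `inheritPreErr` / `noBlockCurr` options above might be combined. It assumes `d4c.apply` accepts the same option names as the decorators documented later in this README; `fetchData` and `logResult` are hypothetical task functions used only for illustration:

```typescript
import { D4C } from 'd4c-queue'

const d4c = new D4C()

/** hypothetical task functions, for illustration only */
const fetchData = async (id: string) => `data for ${id}`
const logResult = (result: string) => console.log(result)

async function run() {
  try {
    /** tasks queued with the same tag share one sub queue and run in FIFO order */
    const data = await d4c.apply(fetchData, { args: ['id-1'], tag: 'db' })

    /** inheritPreErr: if a previous task in this queue failed, skip this task
     *  and rethrow that error as a PreviousError;
     *  noBlockCurr: defer the task to a later microtask so a sync first task
     *  does not block the current event loop */
    await d4c.apply(logResult, {
      args: [data],
      tag: 'db',
      inheritPreErr: true,
      noBlockCurr: true,
    })
  } catch (err) {
    console.log('task skipped or failed:', err)
  }
}

run()
```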

## Installation

@@ -31,13 +31,13 @@ Either `npm install d4c-queue` or `yarn add d4c-queue`. Then import this package
**ES6 import**

```typescript
-import { D4C, synchronized, QConcurrency, concurrent } from 'd4c-queue';
+import { D4C, synchronized, QConcurrency, concurrent } from 'd4c-queue'
```

**CommonJS**

```typescript
-const { D4C, synchronized, QConcurrency, concurrent } = require('d4c-queue');
+const { D4C, synchronized, QConcurrency, concurrent } = require('d4c-queue')
```

It is possible to use the `module` build with CommonJS require syntax in TypeScript or other build tools.
@@ -99,7 +99,7 @@ D4C instance queues (per D4C object):
#### Synchronization mode

```typescript
-const d4c = new D4C();
+const d4c = new D4C()

/**
 * in some execution of event loop
@@ -108,20 +108,20 @@ const d4c = new D4C();
const asyncFunResult = await d4c.wrap(asyncFun)(
  'asyncFun_arg1',
  'asyncFun_arg2'
-);
+)
/**
 * in another execution of event loop. Either an async or
 * a sync function is fine. E.g., pass a sync function:
 * it will wait for asyncFun to finish, then use await to get
 * the new wrapped async function's result.
 */
-const syncFunFunResult = await d4c.wrap(syncFun)('syncFun_arg1');
+const syncFunFunResult = await d4c.wrap(syncFun)('syncFun_arg1')
```

Alternatively, you can use the following:

```typescript
-d4c.apply(syncFun, { args: ['syncFun_arg1'] });
+d4c.apply(syncFun, { args: ['syncFun_arg1'] })
```

#### Concurrency mode
@@ -134,21 +134,21 @@ Usage:

```ts
/** change concurrency limit applied on default queues */
-const d4c = new D4C([{ limit: 100 }]);
+const d4c = new D4C([{ limit: 100 }])

/** setup concurrency for specific queue: "2" */
-const d4c = new D4C([{ limit: 100, tag: '2' }]);
+const d4c = new D4C([{ limit: 100, tag: '2' }])
```

You can adjust concurrency via `setConcurrency`.

```ts
-const d4c = new D4C();
+const d4c = new D4C()
/** change concurrency limit on default queue */
-d4c.setConcurrency([{ limit: 10 }]);
+d4c.setConcurrency([{ limit: 10 }])

/** change concurrency limit for queue2 */
-d4c.setConcurrency([{ limit: 10, tag: 'queue2' }]);
+d4c.setConcurrency([{ limit: 10, tag: 'queue2' }])
```

### Decorators usage
@@ -168,7 +168,7 @@ class ServiceAdapter {
  /** parameters are optional */
  @synchronized({ tag: 'world', inheritPreErr: true, noBlockCurr: true })
  static async staticMethod(text: string) {
-    return text;
+    return text
  }
}
```
@@ -221,10 +221,10 @@ class TestController {

  bindMethodByArrowPropertyOrAutobind = async () => {
    /** access some property in this. accessible after wrapping */
-  };
+  }
}
-const d4c = new D4C();
-const res = await d4c.apply(testController.bindMethodByArrowPropertyOrAutobind);
+const d4c = new D4C()
+const res = await d4c.apply(testController.bindMethodByArrowPropertyOrAutobind)
```

## Motivation and more detailed user scenario about Synchronization mode
@@ -252,10 +252,10 @@ class ServiceAdapter {
  async send_message(msg: string) {
    if (this.connectingStatus === 'Connected') {
      /** send message */
-      await client_send_message_without_wait_connect(msg);
+      await client_send_message_without_wait_connect(msg)
    } else if (this.connectingStatus === 'Connecting') {
      /** send message */
-      await client_send_message_wait_connect(msg);
+      await client_send_message_wait_connect(msg)
    } else {
      //..
    }
@@ -282,16 +282,16 @@ class ServiceAdapter {
The code snippet is from [embedded-pydicom-react-viewer](https://github.com/grimmer0125/embedded-pydicom-react-viewer). Some functions can only be executed after the init function has finished.

```typescript
-const d4c = new D4C();
+const d4c = new D4C()
export const initPyodide = d4c.wrap(async () => {
  /** init Pyodide*/
-});
+})

/** without d4c-queue, it will throw an exception if called
 * before 'initPyodide' is finished */
export const parseByPython = d4c.wrap(async (buffer: ArrayBuffer) => {
  /** execute python code in browser */
-});
+})
```

### Race condition
@@ -302,20 +302,20 @@ It is similar to causality. Sometimes two function which access same data within

```typescript
const func1 = async () => {
-  console.log("func1 start, execution1 in event loop")
-  await func3();
-  console.log('func1 end, should not be same event loop execution1');
-};
+  console.log('func1 start, execution1 in event loop')
+  await func3()
+  console.log('func1 end, should not be same event loop execution1')
+}

const func2 = async () => {
-  console.log('func2');
-};
+  console.log('func2')
+}

async function testRaceCondition() {
-  func1(); // if add await will result in no race condition
-  func2();
+  func1() // adding await here would result in no race condition
+  func2()
}
-testRaceCondition();
+testRaceCondition()
```

`func2` will be executed while `func1` is not yet finished.
@@ -330,7 +330,7 @@ No race condition on two API call in `Express`, any API will be executed one by
/** Express case */
app.post('/testing', async (req, res) => {
  // Do something here
-});
+})
```

However, a race condition may happen between two API calls in `Apollo`/`NestJS`.
@@ -344,7 +344,7 @@ const resolvers = {
  Query: {
    books: async () => books,
  },
-};
+}
```

Two Apollo GraphQL queries/mutations may be executed concurrently, unlike Express. This has advantages and disadvantages. If you are worried about a possible race condition, you can consider this `d4c-queue` library, a `Database transaction`, or [async-mutex](https://www.npmjs.com/package/async-mutex). You do not always need to apply the `d4c-queue` library at the top API endpoint; just apply it to the places you are worried about.
@@ -354,15 +354,15 @@ Two Apollo GraphQL queries/mutations may be executed concurrently, not like Expr
The example below shows how to make the `hello` query `synchronized`. Keep in mind that `@synchronized` should be below `@Query`.

```typescript
-import { Query } from '@nestjs/graphql';
-import { synchronized } from 'd4c-queue';
+import { Query } from '@nestjs/graphql'
+import { synchronized } from 'd4c-queue'

function delay() {
  return new Promise<string>(function (resolve, reject) {
    setTimeout(function () {
-      resolve('world');
-    }, 10 * 1000);
-  });
+      resolve('world')
+    }, 10 * 1000)
+  })
}

export class TestsResolver {
@@ -372,10 +372,10 @@ export class TestsResolver {
  */
  @synchronized
  async hello() {
-    console.log('hello graphql resolver part: 1/2');
-    const resp = await delay();
-    console.log('hello graphql resolver part: 2/2');
-    return resp;
+    console.log('hello graphql resolver part: 1/2')
+    const resp = await delay()
+    console.log('hello graphql resolver part: 2/2')
+    return resp
  }
}
```
@@ -425,9 +425,9 @@ setup a array of queue settings
// use with @concurrent
function QConcurrency(
  queuesParam: Array<{
-    limit: number;
-    tag?: string | symbol;
-    isStatic?: boolean;
+    limit: number
+    tag?: string | symbol
+    isStatic?: boolean
  }>
) {}
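As a brief aside (not part of the diff), the sketch below illustrates how the `isStatic` flag in this signature might be used, assuming `@QConcurrency` configures the queues that `@concurrent` methods of the same class are pushed onto, and that `isStatic: true` targets the class's static-method queue; `fetchPage` is a hypothetical helper:

```typescript
import { QConcurrency, concurrent } from 'd4c-queue'

/** hypothetical helper, for illustration only */
const fetchPage = async (url: string) => `content of ${url}`

@QConcurrency([
  { limit: 2 },                 // instance-method queue: at most 2 tasks in flight
  { limit: 1, isStatic: true }, // static-method queue: strictly one task at a time
])
class Crawler {
  @concurrent
  async crawl(url: string) {
    return fetchPage(url)
  }

  @concurrent
  static async report(msg: string) {
    console.log(msg)
  }
}
```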

@@ -443,16 +443,16 @@ class TestController {}

```typescript
function synchronized(option?: {
-  inheritPreErr?: boolean;
-  noBlockCurr?: boolean;
-  tag?: string | symbol;
+  inheritPreErr?: boolean
+  noBlockCurr?: boolean
+  tag?: string | symbol
}) {}

/** default concurrency limit is Infinity; use with @QConcurrency */
function concurrent(option?: {
-  tag?: string | symbol;
-  inheritPreErr?: boolean;
-  noBlockCurr?: boolean;
+  tag?: string | symbol
+  inheritPreErr?: boolean
+  noBlockCurr?: boolean
}) {}
```
@@ -487,21 +487,21 @@ usage:

```typescript
/** default concurrency is 1 */
-const d4c = new D4C();
+const d4c = new D4C()

/** concurrency limit 500 applied on default queues */
-const d4c = new D4C([{ limit: 500 }]);
+const d4c = new D4C([{ limit: 500 }])

/** setup concurrency for specific queue: "2" */
-const d4c = new D4C([{ limit: 100, tag: '2' }]);
+const d4c = new D4C([{ limit: 100, tag: '2' }])
```

- setConcurrency

```ts
-d4c.setConcurrency([{ limit: 10 }]);
+d4c.setConcurrency([{ limit: 10 }])

-d4c.setConcurrency([{ limit: 10, tag: 'queue2' }]);
+d4c.setConcurrency([{ limit: 10, tag: 'queue2' }])
```

- wrap
@@ -545,7 +545,7 @@ newFunc("asyncFun_arg1", "asyncFun_arg2");)
becomes

```typescript
-d4c.apply(asyncFun, { args: ['asyncFun_arg1'], tag: 'queue1' });
+d4c.apply(asyncFun, { args: ['asyncFun_arg1'], tag: 'queue1' })
```

## Changelog
@@ -591,10 +591,37 @@ module.exports = {
    plugins: [['@babel/plugin-proposal-decorators', { legacy: true }]],
    loaderOptions: {},
    loaderOptions: (babelLoaderOptions, { env, paths }) => {
-      return babelLoaderOptions;
+      return babelLoaderOptions
    },
  },
-};
+}
+```
+
+### Angular Service example
+
+```typescript
+import { Injectable } from '@angular/core';
+import { QConcurrency, concurrent } from 'd4c-queue';
+
+// can be placed below @Injectable, too
+@QConcurrency([
+  { limit: 1 }
+])
+@Injectable({
+  providedIn: 'root'
+})
+export class HeroService {
+
+  @concurrent
+  async task1() {
+    await wait(5 * 1000);
+  }
+
+  @concurrent
+  async task2() {
+    await wait(1 * 1000);
+  }
+}
```
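For completeness (not part of the diff), the Angular Service example above assumes a promise-based `wait` delay helper along these lines:

```typescript
/** hypothetical delay helper assumed by the Angular example above */
function wait(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms))
}
```

With `@QConcurrency([{ limit: 1 }])`, calling `task1()` and then `task2()` back to back should take roughly six seconds in total (sequential) rather than five (concurrent).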

### Use latest GitHub code of this library

package.json

Lines changed: 3 additions & 2 deletions
@@ -1,6 +1,6 @@
{
  "name": "d4c-queue",
-  "version": "1.6.5",
+  "version": "1.6.7",
  "description": "A task queue executes tasks sequentially or concurrently. Wrap an async/promise-returning/sync function as a queue-ready async function for easy reusing. Support passing arguments/getting return value, @synchronized/@concurrent decorator, Node.js/Browser.",
  "main": "build/main/index.js",
  "typings": "build/main/index.d.ts",
@@ -47,7 +47,8 @@
    "task-queue",
    "tasks",
    "task-runner",
-    "microtask"
+    "microtask",
+    "angular"
  ],
  "scripts": {
    "build": "run-p build:*",
