Commit 35faf0a

Merge branch 'release/v1.0.0'

2 parents f647532 + 65ffe68

101 files changed (+45923 additions, −21436 deletions)


.gitignore

Lines changed: 1 addition & 0 deletions
```diff
@@ -1,4 +1,5 @@
 node_modules/
+coverage/
 *~
 *.swp
 .idea
```

README.md

Lines changed: 12 additions & 12 deletions
```diff
@@ -2,24 +2,24 @@
 
 ##Description
 
-HyperFlow provides a model of computation and an execution engine for complex, distributed workflow applications which consist of a set of **processes** performing well-defined **functions** and exchanging **signals**.
+HyperFlow provides a model of computation, workflow description language and enactment engine for complex, distributed workflows.
 
 Browse the [wiki pages](https://github.com/balis/hyperflow/wiki) to learn more about the HyperFlow workflow model.
 
 ##Getting started
 
-Latest release of HyperFlow is 1.0.0-beta-6
+The latest release of HyperFlow is 1.0.0
 
-Installation & running:
-* Download the package: https://github.com/dice-cyfronet/hyperflow/archive/v1.0.0-beta-6.zip
-* Install dependencies (in `hyperflow` directory): `npm install -d`
+###Installation
+* Download the package: https://github.com/dice-cyfronet/hyperflow/archive/v1.0.0.zip
 * Install the latest node.js (http://nodejs.org)
+* Install dependencies (in `hyperflow` directory): `npm install -d`
 * Install the Redis server 2.6.x or higher (http://redis.io) (tested with version 2.6.x)
-* Start the redis server
-* Run example workflows: `node scripts/runwf.js -f workflows/<workflow_file>`
-* Try `Wf_grepfile_simple.json`, `Wf_MapReduce.json`, `Wf_PingPong.json`
-* Also try simulated `Montage` workflows which require the `-s` flag:
-* `node scripts/runwf.js -f workflows/Montage_143.json -s`
-* Look at sample workflows in the `workflows` directory
-* Look at example functions invoked from workflow tasks in the `functions` directory
+* Start the redis server.
+* Set an environment variable `HFLOW_PATH` to point to your hyperflow root directory.
+* Add `$HFLOW_PATH/bin` to your `PATH`.
+
+###Running
+Run example workflows from the `examples` directory as follows:
 
+```hflow run <wf_directory>```
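```

The two environment-setup steps in the new instructions amount to something like the following shell snippet (the install location is illustrative, not prescribed by the README):

```shell
# assuming HyperFlow was unpacked to ~/hyperflow
export HFLOW_PATH=$HOME/hyperflow
export PATH=$PATH:$HFLOW_PATH/bin
```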

SPEC.md

Lines changed: 95 additions & 0 deletions
# HyperFlow: a distributed workflow execution engine

HyperFlow provides a model of computation and an execution engine for complex, distributed [workflow](http://en.wikipedia.org/wiki/Workflow) applications which consist of a set of **processes** performing well-defined **functions** and exchanging **signals**. Browse the [wiki pages](https://github.com/dice-cyfronet/hyperflow/wiki) to learn more about the HyperFlow workflow model.
## Getting started

### Installing HyperFlow

HyperFlow requires the [node.js](http://nodejs.org) runtime. The stable version can be installed using the npm package manager:

```shell
$ npm install -g hyperflow
```

You can install the bleeding-edge version from GitHub:

```shell
$ npm install -g https://github.com/dice-cyfronet/hyperflow/archive/develop.tar.gz
```

HyperFlow also requires a [Redis](http://redis.io) server:

```shell
# on Debian/Ubuntu
$ sudo apt-get install redis-server

# on RedHat or CentOS
$ sudo yum install redis
```
### Installing additional modules

The `hyperflow` package provides only the core functionality; additional packages can bundle *functions* and *graphs*. These functions may later be referenced from a workflow graph as `$npm_package_name:$function_name`, as in the sketch below.
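As an illustration only — the process and signal names here are made up, and the exact graph schema should be checked against the wiki — a process entry in a graph might reference a bundled function like this:

```json
{
  "name": "mapStage",
  "function": "hyperflow-map-reduce:map",
  "ins": [ "chunk" ],
  "outs": [ "mapped" ]
}
```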
We provide:

* `hyperflow-amqp` – allows remote execution of tasks using AMQP queues,
* `hyperflow-map-reduce` – functions for constructing Map-Reduce workflows,
* `hyperflow-pegasus` – support for executing [Pegasus](http://...) DAX workflows,
* `hyperflow-montage` – support for executing the [Montage](http://...) workflow.

See the [wiki page](http://...) to learn how to create HyperFlow function packages.
### Running a *hello world* workflow

```shell
$ git clone http://github.com/dice-cyfronet/hyperflow-hello-world.git
$ cd hyperflow-hello-world
$ hflow start
Hyperflow starting!
Listening on *:1234, webui: http://1.2.3.4:1234/
hello-world workflow loaded, sending initial signals.
Workflow id is 9876.
```
### Advanced options

```
hflow start [--background] [--functions functions.js] [--graph graph.json|graph.js] [--config config.json] [--set-KEY=VALUE]
hflow resume [workflow_id] [--functions functions.js] [--graph graph.json|graph.js] [--config config.json] [--set-KEY=VALUE]
hflow terminate [workflow_id]
hflow status [workflow_id]
hflow watch_events [workflow_id]
```
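For example (an illustrative invocation that simply follows the synopsis above; `9876` is the workflow id from the *hello world* transcript), one might start a workflow in the background on a custom port and then query its status:

```shell
$ hflow start --background --set-port=8080
$ hflow status 9876
```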
### Workflow directory structure

A workflow is a directory that bundles all required files and contains:

* the workflow graph:
  * `graph.json` – a static workflow graph in JSON, or
  * `graph.js` – graph-generation code as a node.js module,
* `config.json` – HyperFlow configuration and workflow parameters,
* `functions.js` – functions specific to the given workflow (a minimal sketch follows this list).
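The sketch below is illustrative only: the `greet` function and its data shapes are made up, and it assumes the common HyperFlow convention in which a function receives input signals, output signal templates, a config object, and a node-style callback.

```javascript
// functions.js – a hypothetical minimal workflow function module.
// Assumed signature: function(ins, outs, config, cb), with cb(err, outs).
module.exports.greet = function (ins, outs, config, cb) {
    // Read the data carried by the first input signal (assumed shape).
    var who = (ins[0].data && ins[0].data[0]) || "world";

    // Attach the result to the first output signal.
    outs[0].data = ["Hello, " + who + "!"];

    // Pass the populated output signals back to the engine.
    cb(null, outs);
};
```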
## Configuration

Configuration is provided in JSON format, while some options may also be specified as environment variables. HyperFlow reads and merges the configuration from the following sources, in order (later sources override earlier ones, as sketched below):

* defaults (see [default_config.json](default_config.json)),
* `/etc/hyperflow.json`,
* `~/.hyperflow.json`,
* `hyperflow.json` placed in the same directory as the workflow JSON file,
* `$HYPERFLOW_CONFIG`,
* options from environment variables, e.g. `$REDIS_URL`,
* options from command-line arguments.
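Conceptually, the merge behaves like the following sketch (this is not HyperFlow's actual implementation; `readJson` and `merge` are hypothetical helpers written for illustration):

```javascript
var fs = require('fs');

// Hypothetical helper: returns {} for missing or unreadable files.
function readJson(path) {
    try { return JSON.parse(fs.readFileSync(path, 'utf8')); }
    catch (e) { return {}; }
}

// Later sources override keys set by earlier ones.
function merge(sources) {
    var out = {};
    sources.forEach(function (src) {
        Object.keys(src).forEach(function (k) { out[k] = src[k]; });
    });
    return out;
}

var config = merge([
    readJson('default_config.json'),
    readJson('/etc/hyperflow.json'),
    readJson(process.env.HOME + '/.hyperflow.json'),
    readJson('./hyperflow.json'),
    readJson(process.env.HYPERFLOW_CONFIG || ''),
    process.env.REDIS_URL ? { redis_url: process.env.REDIS_URL } : {},
    // ...command-line arguments would be merged last.
]);
```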
The options are:

* `packages` – list of function packages required by the workflow (e.g. `montage/functions`),
* `graph` – filename of the graph file (defaults to `./graph.[js|json]`); bundled graphs may also be used (e.g. `montage/graph`),
* `port` or `$PORT` (defaults to 1234),
* `redis_url` or `$REDIS_URL` (defaults to `redis://127.0.0.1:6379/0`),
* `amqp_url` or `$AMQP_URL` (defaults to `amqp://localhost`),
* `amqp_executor_config` (defaults to `{"storage": "local", "workdir": "/tmp/"}`).
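For instance, a user-level `~/.hyperflow.json` overriding two of these defaults might look like this (values illustrative):

```json
{
  "port": 8080,
  "redis_url": "redis://127.0.0.1:6379/1"
}
```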

bin/hflow

Lines changed: 91 additions & 0 deletions
```javascript
#!/usr/bin/env node
var docopt = require('docopt').docopt,
    spawn = require('child_process').spawn,
    fs = require('fs'),
    pathtool = require('path'),
    redis = require('redis'),
    rcl = redis.createClient(),
    wflib = require('../wflib').init(rcl),
    Engine = require('../engine2'),
    async = require('async'),
    dbId = 0;

var doc = "\
Usage:\n\
  hflow run <workflow_dir_or_file> [-s]\n\
  hflow send <wf_id> ( <signal_file> | -d <signal_data> )\n\
  hflow -h | --help | --version";

var opts = docopt(doc);

var hfroot = process.env.HFLOW_PATH;

if (opts.run) {
    hflow_run();
} else if (opts.send) {
    hflow_send();
}

function hflow_run() {
    var wfpath = opts['<workflow_dir_or_file>'],
        wfstats = fs.lstatSync(wfpath),
        wffile;

    // Accept either a workflow directory (containing workflow.json) or a graph file.
    if (wfstats.isDirectory()) {
        wffile = pathtool.join(wfpath, "workflow.json");
    } else if (wfstats.isFile()) {
        wffile = wfpath;
        wfpath = pathtool.dirname(wfpath);
    }

    var runWf = function(wfId) {
        var config = {"emulate": "false", "workdir": pathtool.resolve(wfpath)};
        var engine = new Engine(config, wflib, wfId, function(err) {
            // A custom plugin could listen for events from the engine's event server:
            // engine.eventServer.on('trace.*', function(exec, args) {
            //     console.log('Event captured: ' + exec + ' ' + args + ' job done');
            // });
            engine.runInstance(function(err) {
                console.log("Wf id=" + wfId);
                if (opts['-s']) {
                    // Flag -s is present: send all input signals to the workflow,
                    // which starts its execution.
                    wflib.getWfIns(wfId, false, function(err, wfIns) {
                        engine.wflib.getSignalInfo(wfId, wfIns, function(err, sigs) {
                            engine.emitSignals(sigs);
                        });
                    });
                }
            });
        });
    };

    var createWf = function(cb) {
        // Create the workflow instance in a freshly flushed Redis db.
        rcl.select(dbId, function(err, rep) {
            rcl.flushdb(function(err, rep) {
                wflib.createInstanceFromFile(wffile, '', function(err, id) {
                    cb(err, id);
                });
            });
        });
    };

    createWf(function(err, wfId) {
        runWf(wfId);
    });
}

function hflow_send() {
    console.log("send signal to a workflow: not implemented");
}

function spawn_proc(exec, args) {
    var proc = spawn(exec, args);

    proc.stdout.on('data', function(data) {
        console.log(data.toString().trimRight());
    });

    proc.stderr.on('data', function(data) {
        console.log(data.toString().trimRight());
    });
}
```
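Given the docopt usage string above, a typical invocation would look like this (the workflow directory name is illustrative; note that `hflow send` is still a stub):

```shell
$ hflow run examples/PingPong -s
```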

bin/hflow-run.js

Lines changed: 74 additions & 0 deletions
```javascript
#!/usr/bin/env node

/* HyperFlow workflow engine.
 * Bartosz Balis, 2013-2014
 * hflow-run.js:
 * - creates a HyperFlow engine instance for a workflow identified by its Redis id
 * - runs this workflow; at this point the engine awaits signals (unless the -s flag is given)
 */

var redis = require('redis'),
    rcl = redis.createClient(),
    wflib = require('../wflib').init(rcl),
    Engine = require('../engine2'),
    async = require('async'),
    argv = require('optimist').argv,
    dbId = 0,
    engine;

function createWf(cb) {
    // Create the workflow instance in a freshly flushed Redis db.
    rcl.select(dbId, function(err, rep) {
        rcl.flushdb(function(err, rep) {
            wflib.createInstanceFromFile(argv.f, '', function(err, id) {
                cb(err, id);
            });
        });
    });
}

if (!argv.i && !argv.f) {
    console.log("hflow-run.js: runs a workflow instance\n");
    console.log("Usage: node hflow-run.js [-f </path/to/wf.json>] [-i WFID] [-d DBID]");
    console.log("  -f <file> : create a new wf instance from a file and run it");
    console.log("  -i WFID   : use an already created wf instance with WFID as its Redis id");
    console.log("  -s        : send input signals to the workflow (starts execution)");
    console.log("  -d DBID   : Redis db number to be used (default=0)");
    process.exit();
}

if (argv.d) {
    dbId = argv.d;
    console.log("DBID", dbId);
}

var runWf = function(wfId) {
    engine = new Engine({"emulate": "false"}, wflib, wfId, function(err) {
        // A custom plugin could listen for events from the engine's event server:
        // engine.eventServer.on('trace.*', function(exec, args) {
        //     console.log('Event captured: ' + exec + ' ' + args + ' job done');
        // });
        engine.runInstance(function(err) {
            console.log("Wf id=" + wfId);
            if (argv.s) {
                // Flag -s is present: send all input signals to the workflow,
                // which starts its execution.
                wflib.getWfIns(wfId, false, function(err, wfIns) {
                    engine.wflib.getSignalInfo(wfId, wfIns, function(err, sigs) {
                        engine.emitSignals(sigs);
                    });
                });
            }
        });
    });
};

if (argv.f) {
    createWf(function(err, wfId) {
        runWf(wfId);
    });
} else if (argv.i) {
    runWf(argv.i);
}
```
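A run per the built-in help text might look like this (the workflow file path is illustrative, borrowed from the old README's sample workflows):

```shell
$ node bin/hflow-run.js -f workflows/Wf_PingPong.json -s
```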

scripts/runwf.js renamed to bin/hyperflow

Lines changed: 2 additions & 0 deletions
```diff
@@ -1,3 +1,5 @@
+#!/usr/bin/env node
+
 /* Hypermedia workflow
  * Bartosz Balis, 2013
  * runwf:
```

bin/hyperflow-convert-dax

Lines changed: 21 additions & 0 deletions
```javascript
#!/usr/bin/env node

var PegasusConverter = require('../converters/pegasus_dax.js'),
    argv = require('optimist').argv;
var pc;

if (!argv._[0]) {
    console.log("Usage: node dax_convert.js <DAX file path> [command_name]");
    console.log("  command_name can be: amqpCommand, command_print or command... etc");
    process.exit();
}

// Use the optional command_name argument if given.
if (argv._[1]) {
    pc = new PegasusConverter(argv._[1]);
} else {
    pc = new PegasusConverter();
}

pc.convertFromFile(argv._[0], function (err, wfOut) {
    console.log(JSON.stringify(wfOut, null, 2));
});
```
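Since the converter prints the resulting workflow JSON to stdout, its output can be captured into a workflow file (the DAX filename is illustrative):

```shell
$ hyperflow-convert-dax montage.dax > workflow.json
```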
Lines changed: 1 addition & 0 deletions
```diff
@@ -1,3 +1,4 @@
+#!/usr/bin/env node
 /* Hypermedia workflow
  * Bartosz Balis, 2013
  * createwf: creates workflow state in Redis db from a json file
```
