Compare commits

...

64 Commits

Author SHA1 Message Date
deac844d02 Add testing protocol. 2020-09-05 19:16:00 -05:00
80a3ba4621 Add more descriptive message for 404s. 2019-02-21 23:46:45 -06:00
Jonathan Bernard
774d0b446f Fix typo in active runs API, add functional test for same. 2018-12-26 23:43:44 -06:00
Jonathan Bernard
ee1147a1a5 Add configurability of the server port. 2018-12-23 18:20:22 -06:00
Jonathan Bernard
186b7d5b29 Fix /ping unit test (now /version) 2018-12-23 17:42:43 -06:00
Jonathan Bernard
52eaa63f25 Rework build configuration to take advantage of new built-in docker build capabilities. 2018-12-23 17:39:04 -06:00
Jonathan Bernard
e61fe3b01e Add functional tests for docker-based build configurations. 2018-12-23 16:08:36 -06:00
Jonathan Bernard
e83e64273b Minor fixes to unused test file. 2018-12-21 21:12:15 -06:00
Jonathan Bernard
b2d4df0aac WIP Upgrading to Nim 0.19. Getting docker pieces compiling.
* Addressing breaking changes in migration from Nim 0.18 to 0.19.
* Finishing the initial pass at the refactor required to include
  docker-based builds.
* Regaining confidence in the existing functionality by getting all
  tests passing again after docker introduction (still need new tests to
  cover new docker functionality).
2018-12-09 07:09:23 -06:00
Jonathan Bernard
c827beab5e WIP Adding native support for docker. 2017-12-02 20:47:26 -06:00
Jonathan Bernard
0574f0ec6a Include the version being built in BuildStatus objects. 2017-12-02 19:08:29 -06:00
Jonathan Bernard
ce7d4b60de Fix unit tests for latest changes to API. 2017-12-01 09:45:10 -06:00
Jonathan Bernard
2622877db5 Bump version number to 0.4.0 2017-12-01 07:11:17 -06:00
Jonathan Bernard
c6be698572 Artifact and log lookups. Bugfixes around failure scenarios.
* Added endpoint to return the logs from a run.
* Refactored the `pathVar & "/" & pathVar` pattern into `pathVar / pathVar`
  using the `ospaths` module. Cleaner code, more resistant to extra `/` bugs.
* Added endpoints and core methods to list artifacts for a build, as well as to
  retrieve specific artifacts.
* Fixed a problem with the `complete` status being overloaded. The problem was
  that in the case of a multi-step build all of the prerequisite steps will
  return a state of `complete` which will get recorded in the status file for
  the run. The run will continue, but anyone watching the run state file (via
  the API for example) had no definitive way to tell the difference between a
  sub-step completing and the requested (last) step completing. This was caught
  in the functional tests (race condition based on when it polls for the run
  status). The fix was to introduce a new build state: `stepComplete`. The
  inner `doRun` procedure uses this instead of `complete`. Now the only place
  that `complete` is set is at the end of the original call to `run`, right
  before the worker terminates. It checks the last result (from the originally
  requested step) and if this result is `stepComplete` it "finishes" the build
  by setting the state to `complete`. Because this is the only place where
  `complete` is set, an observer is now guaranteed not to see `complete` until
  all steps have run successfully (a sketch of this check follows this entry).
* Fixed a long-standing bug with the request handling logic in error cases
  (like requested resources not being available). The issue has something to do
  with the way that `except` blocks become special when in an async context.
  The main jester routes block invokes the handlers in an async context. The
  effect is that one `except` block is fine, but adding more than one (to catch
  different exception types, for example) causes the return type of the route
  handler to change and not match what the outer block is expecting (a Future).
  The fix here is to wrap any exception discrimination within a single outer
  except block, re-raise the exception, and catch it inside this new
  synchronous context. Ex:

  ```nim
    try: someCall(mayFail)
    except:
      try: raise getCurrentException()
      except ExceptionType1:
        # do whatever...
      except ExceptionType2:
        # do whatever
      except:
        # general catch-all
    return true
  ```

  The return at the end is also part of the story. Jester's match handler
  allows a route to defer making a decision about whether it matches. If you
  return true from a route block Jester accepts the result as a matched route.
  If you return false, Jester discards the result and looks for another
  matching route. Normally this is taken care of by the `resp` templates
  provided by Jester, but invoking those templates within the except blocks
  also causes problems, so we manually set up the response and `return true` to
  tell Jester that, yes, this route matched; use this response.
* Moved the `/service/debug/ping` endpoint back to `/ping` and removed the
  debug-only fence. I envision this as being useful as a simple healthcheck URL.
2017-12-01 07:09:35 -06:00
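A minimal Nim sketch of the `stepComplete` finalization described above (illustrative names, not the actual module layout):

```nim
type BuildState = enum
  stepComplete, complete, failed

proc doRun(stepName: string): BuildState =
  ## Runs a single step (and, recursively, its prerequisites).
  ## Inner steps report `stepComplete`, never `complete`.
  # ... execute the step command here ...
  result = stepComplete

proc run(requestedStep: string): BuildState =
  let lastResult = doRun(requestedStep)
  # The only place `complete` is ever set: right before the worker
  # terminates, after the originally requested step has finished.
  if lastResult == stepComplete:
    result = complete
  else:
    result = lastResult
```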
Jonathan Bernard
07037616ac Fix test targets in build definition. 2017-12-01 02:38:36 -06:00
Jonathan Bernard
b85cf8b367 Hacky dependency pinning to get passing builds.
There is some bug when building in the docker image we use to build the project
with the latest version of https://github.com/yglukhov/nim-jwt, so I'm pinning it to
commit hash 549aa1eb13b8ddc0c6861d15cc2cc5b52bcbef01 for now. Later versions
add an ifdef branch to support libssl 1.1 but for some reason that ifdef is set
wrong and it tries to build against the 1.1 API even though the image only has
the 1.0 API. I'm crossing my fingers and hoping that our base image supports
libssl 1.1 before I need to update this library.
2017-12-01 02:12:22 -06:00
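For reference, a nimble requirement can be pinned to a commit with a URL fragment. A sketch of the kind of pin described above (the project's actual `.nimble` file may differ):

```nim
# strawboss.nimble (sketch): pin nim-jwt to a known-good commit
# instead of a floating version.
requires "https://github.com/yglukhov/nim-jwt#549aa1eb13b8ddc0c6861d15cc2cc5b52bcbef01"
```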
Jonathan Bernard
741124b734 Experimenting with building strawboss in a docker container. 2017-11-30 17:06:05 -06:00
Jonathan Bernard
a4e6a4cb81 Add simple CLI client based on cURL. 2017-11-30 12:26:21 -06:00
Jonathan Bernard
dcf82d8999 Add build step to the build configuration to make a zipped distributable version.
* Rename previous build step to `compile`
2017-11-30 12:18:55 -06:00
6556a86209 Planning for next features. 2017-11-27 08:09:24 -06:00
Jonathan Bernard
ff7f570ab1 Added systemd unit file. 2017-11-25 20:44:53 -06:00
Jonathan Bernard
d1f04951e5 Updating strawboss project definition so we can self-build. 2017-11-25 19:49:41 -06:00
Jonathan Bernard
f87dcc344b Added support for long-lived API keys. 2017-11-25 19:38:18 -06:00
Jonathan Bernard
4edae250ba Added more functional tests, fix bugs discovered.
* Fixed the formatting of command line logging of strawboss workers.
* Fixed a bug in the (de)serialization of log levels in the strawboss service
  config file.
* Pulled `parseBuildStatus` logic out of `loadBuildStatus` so that we could
  parse a JSON that didn't come from a file.
* Added `parseRun` for Run objects.
* Moved `/ping` to `/service/debug/ping` for symmetry with
  `/service/debug/stop`
* Added functional tests of full builds.
2017-11-25 18:49:43 -06:00
Jonathan Bernard
58fbbc048c Fixed behavior of multi-step builds.
* Output from the main strawboss executable is properly directed to stdout
  and stderr.
* Added threshold logging to strawboss core functions.
* Fixed a bug in the way dependent steps were detected and executed.
  The logic for checking if prior steps had already been executed was only
  executed once when the initial step was prepared, not for any of the
  dependent steps. This logic has been moved into the main work block for
  executing steps.
* Renamed `initiateRun` to `run` and `runStep` to `doRun` to be more accurate.
* Dependent steps get their own, independent copy of the workspace.
* Updated the test project to provide a test target.
2017-11-24 20:29:41 -06:00
Jonathan Bernard
573903bda0 WIP Working on test coverage following refactor. 2017-11-23 07:43:27 -06:00
Jonathan Bernard
82a7b301ea Finished refactor to base the build process around explicit run instances.
* Implemented periodic maintenance window.
* Moved worker creation into the core module.
* Worker processes no longer create run requests, but read queued requests from
  the file system.
* Build status and logs have been moved into the StrawBoss data directory.
* An initial build status is recorded when the job is queued.
* Build status is recorded for build references as well as actual versions.
  So there will be a build status for "master", for example, that is
  overwritten whenever "master" is built for that step.
* RunRequests now include a timestamp.
* Added a Run object to contain both a RunRequest and the corresponding
  BuildStatus for that run.
* API endpoints that talk about runs now return Run objects instead of
  RunRequests.
* Moved all data layer operations into the core module so that the
  "database API" only lives in one place.
2017-11-23 07:30:48 -06:00
Jonathan Bernard
e000b37c35 WIP Moving back towards using named runs.
* Rename artifactsRepo -> buildDataDir to be more explicit about the fact that
  it holds more than just the artifacts.
* Revert removal of run ids.
* Move Worker definition into core as part of making the core responsible for
  accepting run requests.
* Make the core module more responsible for internal details of data structure
  and storage. External callers should not need to construct paths to
  artifacts, versions, etc., but should be able to call methods in the core
  module to do this work for them.
* The working directory no longer contains anything but the checked-out code.
  All StrawBoss-specific data is stored by StrawBoss elsewhere.
* Add a regular maintenance cycle to the server module.
2017-11-22 10:47:04 -06:00
Jonathan Bernard
7aa0a69215 GET /api/project/<project-name> endpoint. 2017-11-20 20:18:17 -06:00
Jonathan Bernard
f222d859e6 WIP Adding GET /project/<projectName> endpoint. 2017-11-20 10:05:55 -06:00
Jonathan Bernard
6340b2fa49 Remove the concept of named, identifiable runs.
StrawBoss is meant for building things checked into the repo. It is also designed
around repeatable builds. So it makes the assumption that running a build step
for a specific version of a project will always result in the same output. So
runs are identified by the project, build step, and version.
2017-11-20 09:15:03 -06:00
Jonathan Bernard
6569564aa8 Update to work with latest Nim devel and cliutil updates. 2017-11-15 23:00:40 -06:00
e39c1186c8 Refactor utils out into cliutils package. 2017-08-15 14:30:03 -05:00
Jonathan Bernard
0a6023c656 Small documentation, TODOs. 2017-08-01 08:49:42 -05:00
Jonathan Bernard
1299311a4c Added test of build step running. 2017-06-14 01:06:43 -05:00
Jonathan Bernard
3d8454d486 Reworking runs to include an id, save the run request. 2017-05-11 10:51:06 -05:00
Jonathan Bernard
e2c3aeca09 Documentation for server module, stubbed out API methods. 2017-05-11 10:48:54 -05:00
Jonathan Bernard
f6b347a4ed Preliminary configuration for StrawBoss to build itself. 2017-05-11 10:46:56 -05:00
Jonathan Bernard
a1100f17d8 Fix bug around spawning worker processes.
We were expecting to find the path to the `strawboss` binary implicitly from
the environment, which meant that configuration was also implicit, and required
more setup. Now the path to the binary is explicit in the StrawBoss runtime
configuration, and the path to the configuration file can also be explicitly given.
2017-05-11 10:43:55 -05:00
Jonathan Bernard
42f37a21e6 Debug stack traces in core. Bugfix around directory creating ordering. 2017-05-11 10:39:38 -05:00
Jonathan Bernard
a7619a3048 Change default value logic for stepCmd and cmdInput (see README). 2017-05-11 10:38:28 -05:00
Jonathan Bernard
45f490c677 Clarification in the README around service vs. project configuration. 2017-05-11 10:36:45 -05:00
Jonathan Bernard
37682441ea Split testing into unit and functional tests.
* Split the `test` nimble task into `unittest` and `functest`, with
  corresponding test directories and test runners.
* Added documentation in README regarding building and testing StrawBoss.
* Created a small, simple test project for use in the functional tests.
* Added a `keepEnv` template in the server unit test code to make it easy to
  preserve the working environment for a single unit test to investigate
  failures manually.
2017-05-10 11:44:46 -05:00
Jonathan Bernard
fd804a9aa8 Implemented list project versions endpoint. 2017-05-08 12:41:46 -05:00
Jonathan Bernard
2d4f1bfdd2 Fix logic bug in findProject(StrawBossConfig, string). 2017-05-08 12:40:24 -05:00
Jonathan Bernard
781eeb6a13 Change auth-token endpoint from GET to POST. 2017-05-08 12:39:38 -05:00
Jonathan Bernard
6aaca4a078 Change the auth handler code in the server to play better with the resp macro (again). 2017-05-08 12:38:32 -05:00
Jonathan Bernard
a6c6bcf37d Explicitly kill server processes after tests if they don't die gracefully. 2017-05-08 12:36:34 -05:00
Jonathan Bernard
411379cb8d StrawBossConfig object (de)serialization and tests. 2017-05-08 12:33:47 -05:00
Jonathan Bernard
13165879c5 Pulled sameContents function out into nim-langutils library. 2017-05-08 12:32:55 -05:00
Jonathan Bernard
1e2af48892 Implemented GET on /projects/<proj-id> and started unit tests. 2017-04-25 12:57:13 -05:00
Jonathan Bernard
e547ecd607 Code cleanup in server.nim 2017-04-25 12:55:48 -05:00
Jonathan Bernard
9d00d638db Add findProject for looking up projects from the StrawBossConfig object properly. 2017-04-25 12:54:08 -05:00
Jonathan Bernard
81674dfa3f Clarified language in the README around cached project configurations. 2017-04-25 12:52:33 -05:00
Jonathan Bernard
ec967ec2bf Added ProjectDef parsing code. Unit tests for parsing, authentication logic. 2017-04-24 16:31:58 -05:00
Jonathan Bernard
053ac8dc14 .gitignore: add runtests binary 2017-04-24 16:31:09 -05:00
Jonathan Bernard
d701460e91 Start adding actual HTTP tests. 2017-04-23 00:19:47 -05:00
Jonathan Bernard
b402a8eb6d Fix jester options (port, appName). 2017-04-23 00:19:32 -05:00
Jonathan Bernard
3e8bbb1676 Add debug switch and API endpoint to stop server when in debug mode. 2017-04-23 00:18:57 -05:00
Jonathan Bernard
06b8914e7b Change CLI to allow the config file to be specified as an option. 2017-04-23 00:16:40 -05:00
Jonathan Bernard
52b7d2f48b Implemented password hashing. Added and improved tests. 2017-03-24 01:04:39 -05:00
Jonathan Bernard
b5a70f6de0 WIP: tests, REST API support (auth). 2017-03-19 06:34:42 -05:00
Jonathan Bernard
2551affd4b Re-order README section to flow better. 2017-03-19 06:33:22 -05:00
Jonathan Bernard
2cfb91aaeb WIP Adding session auth and routes. 2017-03-17 23:34:33 -05:00
34 changed files with 2371 additions and 625 deletions

10
.editorconfig Normal file

@@ -0,0 +1,10 @@
[*]
charset=utf-8
end_of_line=lf
indent_style=space
indent_size=2
max_line_length=79
[{.babelrc,.stylelintrc,jest.config,.eslintrc,*.bowerrc,*.jsb3,*.jsb2,*.json,*.js}]
indent_style=space
indent_size=2

2
.gitignore vendored

@@ -1,3 +1,5 @@
*.sw?
nimcache/
/strawboss
src/test/nim/runtests
src/test/nim/run_*_tests

150
README.md

@@ -8,7 +8,6 @@
* Configuration is two-part. Pipeline, step, and artifact definition are part
of the project configuration (.strawboss.json? yaml?). Environment
configuration lives on the strawboss server (supplies DB info, etc.).
* REST API?
* Step execution happens within the root directory of a fresh copy of the repo.
Commit identifiers (hash/ref/etc.) are supplied when a build is triggered and
the fresh copy is checked out at that reference.
@@ -33,6 +32,20 @@ sub-directories. Each
## Configuration
There are two points of configuration when working with StrawBoss, the
[StrawBoss configuration file](#strawboss-configuration-file), and the
individual [project configurations](#project-configuration).
The [StrawBoss configuration file](#strawboss-configuration-file) is used to
configure the StrawBoss instance itself and stores server-side information such
as the list of projects known to StrawBoss. If you are setting up StrawBoss on
a server you will need to work with this configuration file.
The [project configurations](#project-configuration) are used to configure the
build process and options for each project and are stored with the projects
themselves. If you are working on a project that you wish to build with
StrawBoss you will be working with this configuration file.
### StrawBoss configuration file
StrawBoss expects to find `strawboss.config.json` in the working directory of
@@ -40,16 +53,26 @@ the `strawboss` executable. This is the configuration file for StrawBoss
itself. The contents are expected to be a valid JSON object. The top level keys
are:
* `artifactsRepo`: A string denoting the path to the artifacts repository
directory.
* `buildDataDir`: *(optional)* A string denoting the path to the directory
where StrawBoss keeps metadata about builds it has performed and the
artifacts resulting from the builds. *(defaults to `build-data`)*
* `users`: the array of user definition objects. Each user object is required
* `authSecret`: *(required)* Secret key used to sign JWT session tokens.
* `users`: *(required)* the array of user definition objects. Each user object is required
to have `username` and `hashedPwd` keys, both string.
* `tokens`: an array of string, each representing a valid auth token that has
been issued to a client.
* `projects`: *(required)* an array of project definitions (detailed below).
* `projects`: an array of project definitions (detailed below).
* `pwdCost`: *(required)* parameter to the user password hashing algorithm determining the
computational cost of the hash.
* `maintenancePeriod`: *(optional)* how often, in milliseconds, should the
StrawBoss server perform maintenance (clear finished workers, etc).
*(defaults to `10000`, every 10 seconds)*.
* `debug`: boolean, should debug behavior be enabled. This is primarily
intended for testing during StrawBoss development. *(defaults to `false`)*
All are required.
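For illustration, a minimal configuration consistent with the keys above might look like the following (all values are placeholders):

```json
{
  "buildDataDir": "build-data",
  "authSecret": "some-long-random-secret",
  "pwdCost": 11,
  "users": [
    { "username": "admin", "hashedPwd": "$2a$11$..." }
  ],
  "projects": [
    { "name": "strawboss", "repo": "https://git.example.com/strawboss.git" }
  ]
}
```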
@@ -103,30 +126,6 @@ object. The top level keys are:
that is expected to print the current version of the project on `stdout`.
*(defaults to `git describe --tags --always`)*.
## Build Process
When performing a build, StrawBoss:
1. creates a temporary workspace for this build
2. clones the repo into the workspace
3. checkout the revision or branch requested for this run
4. load the project's StrawBoss configuration file.
5. merge environment variables defined in the project configuration
6. run `versionCmd` to get the current project version. The result is stored
in the `VERSION` environment variable.
7. check the environment variables against `expectedEnv`
8. check that all the steps named in `depends` have already been run and run
them if they have not. For each step named in `depends` an environment
variable is added named `<step-name>_DIR` that contains the absolute path to
the artifacts repo for that step at this version. This is intended to be
used to reference artifacts from other steps, e.g.
`${build_DIR}/site-contents.zip`.
9. `stepCmd` is executed in `workingDir`. Environment variables in `cmdInput`
are resolved and the resulting string are fed line-by-line into the process
as `stdin`.
10. the files named in `artifacts` are copied into the artifacts repo for this
step and version.
#### Step Definition
Step definitions are JSON objects with the following keys:
@@ -136,7 +135,7 @@ Step definitions are JSON objects with the following keys:
`'.'`, the project root directory)*.
* `stepCmd` *(optional)*: the command to execute for this step. *(defaults to
`sh`)*
`true` unless `cmdInput` is given, in which case it defaults to `sh`)*
* `cmdInput` *(optional)*: an array of string that will be concatenated with
newlines separating each string and piped as input to the command for this
@@ -167,6 +166,30 @@ Step definitions are JSON objects with the following keys:
the step. If `dontSkip` is set to `true`, the output of this step will always
be run when it is referenced, regardless of previous cached results.
## Build Process
When performing a build, StrawBoss:
1. creates a temporary workspace for this build
2. clones the repo into the workspace
3. checkout the revision or branch requested for this run
4. load the project's StrawBoss configuration file.
5. merge environment variables defined in the project configuration
6. run `versionCmd` to get the current project version. The result is stored
in the `VERSION` environment variable.
7. check the environment variables against `expectedEnv`
8. check that all the steps named in `depends` have already been run and run
them if they have not. For each step named in `depends` an environment
variable is added named `<step-name>_DIR` that contains the absolute path to
the artifacts repo for that step at this version. This is intended to be
used to reference artifacts from other steps, e.g.
`${build_DIR}/site-contents.zip`.
9. `stepCmd` is executed in `workingDir`. Environment variables in `cmdInput`
are resolved and the resulting string are fed line-by-line into the process
as `stdin`.
10. the files named in `artifacts` are copied into the artifacts repo for this
step and version.
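To make this flow concrete, here is a hypothetical project configuration exercising `depends`, `artifacts`, and the `<step-name>_DIR` variables (all names are illustrative):

```json
{
  "name": "example-project",
  "versionCmd": "git describe --tags --always",
  "steps": {
    "build": {
      "cmdInput": ["make dist"],
      "artifacts": ["dist/site-contents.zip"]
    },
    "package": {
      "depends": ["build"],
      "cmdInput": ["cp ${build_DIR}/site-contents.zip ."],
      "artifacts": ["site-contents.zip"]
    }
  }
}
```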
## Architecture
The following describes the internal architecture of StrawBoss. This section is
@@ -185,16 +208,19 @@ files.
##### Cached configuration files.
The cached project configuration files follow this naming convention:
`configuration.<version>.json`. StrawBoss uses the file modification time to
determine which configuration file is the most recent. These cached versions of
the project configuration are only intended to be used in cases where StrawBoss
is not building anything and doesn't check out a copy of the repo. For example,
when a client queries the REST API for the list of steps in a project,
StrawBoss will consult the most recently modified cached copy of the project
configuration rather than cloning the entire repo just to answer this question.
Whenever StrawBoss has a copy of the repo, it should look for the actual
configuration file in that version of the repo instead of consulting the cached
configuration files.
`configuration.<version>.json`. These cached versions of the project
configuration are only intended to be used in cases where StrawBoss is not
building anything and doesn't check out a copy of the repo. For example, when a
client queries the REST API for the list of steps in a project, StrawBoss will
consult the most recently modified cached copy of the project configuration
rather than cloning the entire repo just to answer this question. Whenever
StrawBoss has a copy of the repo, it should look for the actual configuration
file in that version of the repo instead of consulting the cached configuration
files. When determining the "most recent" cached copy, StrawBoss uses the
modification time of the files, again to avoid cloning the repo. API access to
project configurations in this manner is intended as a convenience. The actual
project configuration in the project repository should be considered the source
of truth.
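A sketch of how that most-recent lookup could work; `mostRecentCachedConfig` is a hypothetical helper, not necessarily the actual implementation:

```nim
import os, times

proc mostRecentCachedConfig(configDir: string): string =
  ## Picks the most recently modified `configuration.<version>.json`,
  ## per the convention above, without cloning the repo.
  var newest: Time
  for path in walkFiles(configDir / "configuration.*.json"):
    let mtime = getLastModificationTime(path)
    if result.len == 0 or mtime > newest:
      result = path
      newest = mtime
```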
##### Step and Version Directories
@@ -217,3 +243,43 @@ using the handler to update the supervisor's knowledge of the build results and
When launched in single-build mode there is no supervisory process. The main
process directly executes the requested build steps.
### Building StrawBoss
To build StrawBoss locally, checkout the repository and in the repo root run:
nimble build
### Testing
StrawBoss has two test suites, a set of unit tests and a set of functional
tests. All the test code and assets live under the `src/test` subdirectory.
Each test suite has a runner file that serves as an entry point for the test process,
named `run_unit_tests.nim` and `run_functional_tests.nim`.
#### Unit Tests
The unit test source files live in the `nim/unit` subdirectory and have a
one-to-one correspondence with the StrawBoss source files following this
naming convention: `t<module>.nim`. The unit tests are intended to be run any
time the code is recompiled.
To run the unit tests, use the `unittest` nimble task:
nimble unittest
#### Functional Tests
The functional test source files live in the `nim/functional` subdirectory.
There is a test project that is used to exercise StrawBoss functionality. To
avoid external coupling it is stored within the StrawBoss repository as a test
asset. To avoid `git` complications it is stored as a Gzipped TAR file and
unpacked to a temporary directory as part of the functional test process.
As the functional tests are more time-consuming and intensive, they are
expected to be run when performing a build.
To run the functional tests, use the `functest` nimble task:
nimble functest

17
TODO.md

@@ -1,6 +1,11 @@
* Write a tool to convert JSON Schema into a human-readable format suitable for
documentation. Should use the description, title, and other fields from the
JSON spec. Use this for writing the JSON schema docs instead of duplicating
the description of configuration files between JSON schema and the
documentation. In other words, use the schemas as the single source of truth
and generate everything else from that.
TODO
* Orchestration of docker containers for running builds.
* Write API docs.
NICE TO HAVE
* Use/create some json-schema -> nim code generator to auto-generate json
handling code from schemas.
* Use some json-schema -> docs generator to document the API.
* Support unique UUID prefixes in URLs.

27
api.rst

@@ -1,9 +1,20 @@
GET /api/ping
POST /api/auth-token
GET /api/projects -- return project summaries
POST /api/projects -- create a new project
GET /api/project/<proj-id> -- return detailed project record (include steps)
GET /api/project/<proj-id>/<step-id> -- return detailed step information (include runs)
POST /api/project/<proj-id>/<step-id>/run/<ref> -- kick off a run
GET /api/project/<proj-id>/<step-id>/run/<ref> -- return detailed run information
✓ GET /api/ping -- returns "pong"
✓ POST /api/auth-token -- create and return an auth token given {"username": "...", "password": "..."}
✓ GET /api/verify-auth -- returns 200 or 401 depending on the validity of the provided auth (auth ping)
✓ GET /api/projects -- return project summaries
- POST /api/projects -- create a new project
- GET /api/project/<proj-id> -- TODO
* GET /api/project/<proj-id>/runs -- list summary information for all runs
* GET /api/project/<proj-id>/runs/active -- list summary information about all currently active runs
- GET /api/project/<proj-id>/runs/<run-id> -- list detailed information about a specific run
✓ GET /api/project/<proj-id>/versions -- list the versions of this project that have been built
* GET /api/project/<proj-id>/version/<ref> -- return detailed project definition (include steps) at a specific version
- GET /api/project/<proj-id>/step/<step-id> -- return detailed step information (include runs)
* POST /api/project/<proj-id>/step/<step-id>/run/<ref> -- kick off a run
Legend:
✓ implemented with passing tests
* implemented, needs testing
- not implemented
M missing (not even stubbed out)
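For illustration, a client session against these endpoints might look like the following sketch (host, port, credentials, and the bearer-token header are assumptions):

```nim
import httpclient, json

let client = newHttpClient()

# POST /api/auth-token with credentials; assume the response body is
# the session token itself.
let token = client.postContent("http://localhost:8180/api/auth-token",
  body = $(%*{"username": "admin", "password": "secret"}))

# Use the token on authenticated endpoints.
client.headers = newHttpHeaders({"Authorization": "Bearer " & token})
echo client.getContent("http://localhost:8180/api/projects")
```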

18
file-structure.txt Normal file

@@ -0,0 +1,18 @@
build-data/
<project-name>/
configurations/
<version>.json
runs/
<id>.request.json
<id>.stdout.log
<id>.stderr.log
<id>.status.json
status/
<step-name>/
<version>.json
artifacts/
<step-name>/
<version>/
<artifact-file>
workspace/
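Given this layout, a sketch of the path helpers the core module could expose so that callers never build these paths themselves (hypothetical names and signatures):

```nim
import os

proc runRequestPath(buildDataDir, projectName, runId: string): string =
  # runs/<id>.request.json
  result = buildDataDir / projectName / "runs" / (runId & ".request.json")

proc artifactPath(buildDataDir, projectName, stepName, version,
                  artifact: string): string =
  # artifacts/<step-name>/<version>/<artifact-file>
  result = buildDataDir / projectName / "artifacts" / stepName / version / artifact
```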


@@ -3,6 +3,9 @@
"type": "object",
"properties": {
"artifactsRepo": { "type": "string" },
"authSecret": { "type": "string" },
"debug": { "type": "bool" },
"pwdCost": { "type": "integer" },
"projects": {
"title": "ProjectsList",
"type": "array",
@@ -30,19 +33,14 @@
"title": "UserDefinition",
"type": "object",
"properties": {
"username": { "type": "string" },
"name": { "type": "string" },
"hashedPwd": { "type": "string" }
},
"required": ["username", "hashedPwd"],
"required": ["name", "hashedPwd"],
"additionalProperties": false
}
},
"tokens": {
"title": "TokensList",
"type": "array",
"items": { "type": "string" }
}
},
"required": ["artifactsRepo", "projects", "users", "tokens"],
"required": ["artifactsRepo", "authSecret", "pwdCost", "projects", "users"],
"additionalProperties": false
}


@@ -1,58 +1,58 @@
import docopt, os, sequtils, tempfile
import cliutils, docopt, os, sequtils, strutils, tempfile, uuids
import strawboss/private/util
import strawboss/configuration
import strawboss/core
import strawboss/server
let SB_VER = "0.2.0"
import strawbosspkg/configuration
import strawbosspkg/core
import strawbosspkg/server
import strawbosspkg/version
proc logProcOutput*(outMsg, errMsg: TaintedString, cmd: string) =
let prefix = if cmd != nil: cmd else: ""
if outMsg != nil: echo prefix & "(stdout): " & outMsg
if errMsg != nil: echo prefix & "(stderr): " & errMsg
let prefix = if cmd.len > 0: cmd & ": " else: ""
if outMsg.len > 0: stdout.writeLine prefix & outMsg
if errMsg.len > 0: stderr.writeLine prefix & errMsg
when isMainModule:
var cfg = loadStrawBossConfig("strawboss.config.json")
if not existsDir(cfg.artifactsRepo):
echo "Artifacts repo (" & cfg.artifactsRepo & ") does not exist. Creating..."
createDir(cfg.artifactsRepo)
cfg.artifactsRepo = expandFilename(cfg.artifactsRepo)
let doc = """
Usage:
strawboss serve
strawboss run <project> <step> [options]
strawboss serve [options]
strawboss run <requestFile> [options]
strawboss hashpwd <pwd>
strawboss api-key <username>
Options
-f --force-rebuild Force a build step to re-run even we have cached
results from building that step before for this
version of the project.
-r --reference <ref> Build the project at this commit reference.
-w --workspace <workspace> Use the given directory as the build workspace.
-c --config-file <cfgFile> Use this config file instead of the default
(strawboss.config.json).
"""
let args = docopt(doc, version = "strawboss v" & SB_VER)
let args = docopt(doc, version = "strawboss v" & SB_VERSION)
let cfgFile = if args["--config-file"]: $args["--config-file"]
else: "strawboss.config.json"
var cfg = loadStrawBossConfig(cfgFile)
cfg.pathToExe = paramStr(0)
if not existsDir(cfg.buildDataDir):
echo "Build data directory (" & cfg.buildDataDir & ") does not exist. Creating..."
createDir(cfg.buildDataDir)
cfg.buildDataDir = expandFilename(cfg.buildDataDir)
echo $args
if args["run"]:
let req = RunRequest(
projectName: $args["<project>"],
stepName: $args["<step>"],
buildRef: if args["--reference"]: $args["--reference"] else: nil,
forceRebuild: args["--force-rebuild"],
workspaceDir: if args["--workspace"]: $args["<workspace>"] else: mkdtemp())
var req: RunRequest
try: req = loadRunRequest($args["<requestFile>"])
except:
echo "strawboss: unable to parse run request (" & $args["<requestFile>"] & ")"
quit(QuitFailure)
try:
let status = core.runStep(cfg, req, logProcOutput)
if status.state == "failed": raiseEx status.details
if req.workspaceDir.len == 0: req.workspaceDir = mkdtemp()
let status = core.run(cfg, req, logProcOutput)
if status.state == BuildState.failed: raiseEx status.details
echo "strawboss: build passed."
except:
echo "strawboss: build FAILED: " & getCurrentExceptionMsg() & "."
@@ -62,3 +62,12 @@ Options
elif args["serve"]: server.start(cfg)
elif args["hashpwd"]:
echo $cfg.pwdCost
let pwd = server.hashPwd($args["<pwd>"], cfg.pwdCost)
echo pwd
echo pwd[0..28]
elif args["api-key"]:
let sessionToken = server.makeApiKey(cfg, $args["<username>"])
echo sessionToken


@@ -1,133 +0,0 @@
import logging, json, os, nre, sequtils, strtabs, tables
import private/util
# Types
#
type
BuildStatus* = object
state*, details*: string
Step* = object
name*, stepCmd*, workingDir*: string
artifacts*, cmdInput*, depends*, expectedEnv*: seq[string]
dontSkip*: bool
ProjectCfg* = object
name*: string
versionCmd*: string
steps*: Table[string, Step]
ProjectDef* = object
cfgFilePath*, defaultBranch*, name*, repo*: string
envVars*: StringTableRef
StrawBossConfig* = object
artifactsRepo*: string
projects*: seq[ProjectDef]
RunRequest* = object
projectName*, stepName*, buildRef*, workspaceDir*: string
forceRebuild*: bool
# internal utils
let nullNode = newJNull()
proc getIfExists(n: JsonNode, key: string): JsonNode =
result = if n.hasKey(key): n[key]
else: nullNode
proc getOrFail(n: JsonNode, key: string, objName: string = ""): JsonNode =
if not n.hasKey(key): raiseEx objName & " missing key " & key
return n[key]
# Configuration parsing code
proc loadStrawBossConfig*(cfgFile: string): StrawBossConfig =
if not existsFile(cfgFile):
raiseEx "strawboss config file not found: " & cfgFile
let jsonCfg = parseFile(cfgFile)
var projectDefs: seq[ProjectDef] = @[]
for pJson in jsonCfg.getIfExists("projects").getElems:
var envVars = newStringTable(modeCaseSensitive)
for k, v in pJson.getIfExists("envVars").getFields: envVars[k] = v.getStr("")
projectDefs.add(
ProjectDef(
cfgFilePath: pJson.getIfExists("cfgFilePath").getStr("strawboss.json"),
defaultBranch: pJson.getIfExists("defaultBranch").getStr("master"),
name: pJson.getOrFail("name", "project definition").getStr,
envVars: envVars,
repo: pJson.getOrFail("repo", "project definition").getStr))
result = StrawBossConfig(
artifactsRepo: jsonCfg.getIfExists("artifactsRepo").getStr("artifacts"),
projects: projectDefs)
proc loadProjectConfig*(cfgFile: string): ProjectCfg =
if not existsFile(cfgFile):
raiseEx "project config file not found: " & cfgFile
let jsonCfg = parseFile(cfgFile)
if not jsonCfg.hasKey("steps"):
raiseEx "project configuration is missing steps definition"
var steps = initTable[string, Step]()
for sName, pJson in jsonCfg.getOrFail("steps", "project configuration").getFields:
steps[sName] = Step(
name: sName,
workingDir: pJson.getIfExists("workingDir").getStr("."),
stepCmd: pJson.getIfExists("stepCmd").getStr("sh"),
depends: pJson.getIfExists("depends").getElems.mapIt(it.getStr),
artifacts: pJson.getIfExists("artifacts").getElems.mapIt(it.getStr),
cmdInput: pJson.getIfExists("cmdInput").getElems.mapIt(it.getStr),
expectedEnv: pJson.getIfExists("expectedEnv").getElems.mapIt(it.getStr),
dontSkip: pJson.getIfExists("dontSkip").getStr("false") != "false")
if steps[sName].stepCmd == "sh" and steps[sName].cmdInput.len == 0:
warn "Step " & sName & " uses 'sh' as its command but has no cmdInput."
result = ProjectCfg(
name: jsonCfg.getOrFail("name", "project configuration").getStr,
versionCmd: jsonCfg.getIfExists("versionCmd").getStr("git describe --tags --always"),
steps: steps)
proc loadBuildStatus*(statusFile: string): BuildStatus =
if not existsFile(statusFile): raiseEx "status file not found: " & statusFile
let jsonObj = parseFile(statusFile)
result = BuildStatus(
state: jsonObj.getOrFail("state", "build status").getStr,
details: jsonObj.getIfExists("details").getStr("") )
proc parseRunRequest*(reqStr: string): RunRequest =
let reqJson = parseJson(reqStr)
result = RunRequest(
projectName: reqJson.getOrFail("projectName", "RunRequest").getStr,
stepName: reqJson.getOrFail("stepName", "RunRequest").getStr,
buildRef: reqJson.getOrFail("buildRef", "RunRequest").getStr,
workspaceDir: reqJson.getOrFail("workspaceDir", "RunRequest").getStr,
forceRebuild: reqJson.getOrFail("forceRebuild", "RunRequest").getBVal)
# TODO: can we use the marshal module for this?
proc `%`*(s: BuildStatus): JsonNode =
result = %* {
"state": s.state,
"details": s.details
}
proc `%`*(req: RunRequest): JsonNode =
result = %* {
"projectName": req.projectName,
"stepName": req.stepName,
"buildRef": req.buildRef,
"workspaceDir": req.workspaceDir,
"forceRebuild": req.forceRebuild
}
proc `$`*(s: BuildStatus): string = result = pretty(%s)
proc `$`*(req: RunRequest): string = result = pretty(%req)


@@ -1,270 +0,0 @@
import logging, nre, os, osproc, sequtils, streams, strtabs, strutils, tables, tempfile
import private/util
import configuration
from posix import link
type
Workspace = ref object ## Data needed by internal build process
artifactsDir*: string ## absolute path to the directory for this version
artifactsRepo*: string ## absolute path to the global artifacts repo
buildRef*: string ## git-style commit reference to the revision we are building
dir*: string ## absolute path to the working directory
env*: StringTableRef ## environment variables for all build processes
openedFiles*: seq[File] ## all files that we have opened that need to be closed
outputHandler*: HandleProcMsgCB ## handler for process output
project*: ProjectCfg ## the project configuration
projectDef*: ProjectDef ## the StrawBoss project definition
status*: BuildStatus ## the current status of the build
statusFile*: string ## absolute path to the build status file
step*: Step ## the step we're building
version*: string ## project version as returned by versionCmd
proc resolveEnvVars(line: string, env: StringTableRef): string =
result = line
for found in line.findAll(re"\$\w+|\$\{[^}]+\}"):
let key = if found[1] == '{': found[2..^2] else: found[1..^1]
if env.hasKey(key): result = result.replace(found, env[key])
proc emitStatus(status: BuildStatus, statusFilePath: string,
outputHandler: HandleProcMsgCB): BuildStatus =
if statusFilePath != nil: writeFile(statusFilePath, $status)
if outputHandler != nil:
outputHandler.sendMsg(status.state & ": " & status.details)
result = status
proc publishStatus(wksp: Workspace, state, details: string) =
let status = BuildStatus(state: state, details: details)
wksp.status = emitStatus(status, wksp.statusFile, wksp.outputHandler)
proc setupProject(wksp: Workspace) =
# Clone the project into the $temp/repo directory
let cloneResult = exec("git", wksp.dir,
["clone", wksp.projectDef.repo, "repo"],
wksp.env, {poUsePath}, wksp.outputHandler)
if cloneResult.exitCode != 0:
raiseEx "unable to clone repo for '" & wksp.projectDef.name & "'"
# Checkout the requested ref
let checkoutResult = exec("git", wksp.dir & "/repo",
["checkout", wksp.buildRef],
wksp.env, {poUsePath}, wksp.outputHandler)
if checkoutResult.exitCode != 0:
raiseEx "unable to checkout ref " & wksp.buildRef &
" for '" & wksp.projectDef.name & "'"
# Find the strawboss project configuration
let projCfgFile = wksp.dir & "/repo/" & wksp.projectDef.cfgFilePath
if not existsFile(projCfgFile):
raiseEx "Cannot find strawboss project configuration in the project " &
"repo (expected at '" & wksp.projectDef.cfgFilePath & "')."
wksp.project = loadProjectConfig(projCfgFile)
# Merge in the project-defined env vars
for k, v in wksp.projectDef.envVars: wksp.env[k] = v
# Get the build version
let versionProc = startProcess(
wksp.project.versionCmd, # command
wksp.dir & "/repo", # working dir
[], # args
wksp.env, # environment
{poUsePath, poEvalCommand}) # options
let versionResult = waitForWithOutput(versionProc, wksp.outputHandler,
wksp.project.versionCmd)
if versionResult.exitCode != 0:
raiseEx "Version command (" & wksp.project.versionCmd &
") returned non-zero exit code."
wksp.outputHandler.sendMsg "Building version " & versionResult.output.strip
wksp.version = versionResult.output.strip
wksp.env["VERSION"] = wksp.version
proc runStep*(wksp: Workspace, step: Step) =
let SB_EXPECTED_VARS = ["VERSION"]
wksp.publishStatus("running",
"running '" & step.name & "' for version " & wksp.version &
" from " & wksp.buildRef)
# Ensure all expected environment variables are present.
for k in (step.expectedEnv & @SB_EXPECTED_VARS):
if not wksp.env.hasKey(k):
raiseEx "step " & step.name & " failed: missing required env variable: " & k
# Ensure that artifacts in steps we depend on are present
# TODO: detect circular-references in dependency trees.
for dep in step.depends:
if not wksp.project.steps.hasKey(dep):
raiseEx step.name & " depends on " & dep &
" but there is no step named " & dep
let depStep = wksp.project.steps[dep]
# Run that step (may get skipped)
runStep(wksp, depStep)
# Add the artifacts directory for the dependent step to our env so that
# further steps can reference it via $<stepname>_DIR
wksp.env[depStep.name & "_DIR"] = wksp.artifactsRepo & "/" &
wksp.project.name & "/" & dep & "/" & wksp.version
# Run the step command, piping in cmdInput
wksp.outputHandler.sendMsg step.name & ": starting stepCmd: " & step.stepCmd
let cmdProc = startProcess(step.stepCmd,
wksp.dir & "/repo/" & step.workingDir,
[], wksp.env, {poUsePath, poEvalCommand})
let cmdInStream = inputStream(cmdProc)
# Replace env variables in step cmdInput as we pipe it in
for line in step.cmdInput: cmdInStream.writeLine(line.resolveEnvVars(wksp.env))
cmdInStream.flush()
cmdInStream.close()
let cmdResult = waitForWithOutput(cmdProc, wksp.outputHandler, step.stepCmd)
if cmdResult.exitCode != 0:
raiseEx "step " & step.name & " failed: step command returned non-zero exit code"
# Gather the output artifacts (if we have any)
wksp.outputHandler.sendMsg "artifacts: " & $step.artifacts
if step.artifacts.len > 0:
for a in step.artifacts:
let artifactPath = a.resolveEnvVars(wksp.env)
let artifactName = artifactPath[(artifactPath.rfind("/")+1)..^1]
try:
wksp.outputHandler.sendMsg "copy " & wksp.dir & "/repo/" & step.workingDir & "/" & artifactPath & " -> " & wksp.artifactsDir & "/" & artifactName
copyFile(wksp.dir & "/repo/" & step.workingDir & "/" & artifactPath,
wksp.artifactsDir & "/" & artifactName)
except:
raiseEx "step " & step.name & " failed: unable to copy artifact " &
artifactPath & ":\n" & getCurrentExceptionMsg()
wksp.publishStatus("complete", "")
proc runStep*(cfg: StrawBossConfig, req: RunRequest,
outputHandler: HandleProcMsgCB = nil): BuildStatus =
result = BuildStatus(
state: "setup",
details: "initializing build workspace")
discard emitStatus(result, nil, outputHandler)
var wksp: Workspace
try:
assert req.workspaceDir.isAbsolute
if not existsDir(req.workspaceDir): createDir(req.workspaceDir)
# Find the project definition
let matching = cfg.projects.filterIt(it.name == req.projectName)
if matching.len == 0: raiseEx "no such project: " & req.projectName
elif matching.len > 1: raiseEx "more than one project named : " & req.projectName
# Read in the existing system environment
var env = loadEnv()
env["GIT_DIR"] = ".git"
# Setup our STDOUT and STDERR files
let stdoutFile = open(req.workspaceDir & "/stdout.log", fmWrite)
let stderrFile = open(req.workspaceDir & "/stderr.log", fmWrite)
let logFilesOH = makeProcMsgHandler(stdoutFile, stderrFile)
wksp = Workspace(
artifactsDir: nil,
artifactsRepo: cfg.artifactsRepo,
buildRef:
if req.buildRef != nil and req.buildRef.len > 0: req.buildRef
else: matching[0].defaultBranch,
dir: req.workspaceDir,
env: env,
openedFiles: @[stdoutFile, stderrFile],
outputHandler: combineProcMsgHandlers(outputHandler, logFilesOH),
project: ProjectCfg(),
projectDef: matching[0],
status: result,
statusFile: req.workspaceDir & "/" & "status.json",
step: Step(),
version: nil)
except:
result = BuildStatus(state: "failed",
details: getCurrentExceptionMsg())
try: discard emitStatus(result, nil, outputHandler)
except: discard ""
try:
# Clone the repo and setup the working environment
wksp.publishStatus("setup",
"cloning project repo and preparing to run '" & req.stepName & "'")
wksp.setupProject()
# Find the requested step
if not wksp.project.steps.hasKey(req.stepName):
raiseEx "no step name '" & req.stepName & "' for " & req.projectName
var step = wksp.project.steps[req.stepName]
# Enforce forceRebuild
if req.forceRebuild: step.dontSkip = true
# Compose the path to the artifacts directory for this step and version
wksp.artifactsDir = wksp.artifactsRepo & "/" & wksp.project.name & "/" &
step.name & "/" & wksp.version
# Have we tried to build this before and are we caching the results?
if existsFile(wksp.artifactsDir & "/status.json") and not step.dontSkip:
let prevStatus = loadBuildStatus(wksp.artifactsDir & "/status.json")
# If we succeeded last time, no need to rebuild
if prevStatus.state == "complete":
wksp.outputHandler.sendMsg(
"Skipping step '" & step.name & "' for version '" &
wksp.version & "': already completed.")
return prevStatus
else:
wksp.outputHandler.sendMsg(
"Rebuilding failed step '" & step.name & "' for version '" &
wksp.version & "'.")
# Make the artifacts directory if it doesn't already exist
if not existsDir(wksp.artifactsDir): createDir(wksp.artifactsDir)
# Link status file and output logs to the artifacts dir
for fn in @["status.json", "stdout.log", "stderr.log"]:
# TODO: roll old files instead of delete them?
if existsFile(wksp.artifactsDir & "/" & fn):
removeFile(wksp.artifactsDir & "/" & fn)
if link(wksp.dir & "/" & fn, wksp.artifactsDir & "/" & fn) != 0:
wksp.outputHandler.sendMsg(nil,
"WARN: could not link " & fn & " to artifacts dir.")
runStep(wksp, step)
result = wksp.status
except:
let msg = getCurrentExceptionMsg()
try:
wksp.publishStatus("failed", msg)
result = wksp.status
except:
result = BuildStatus(state: "failed", details: msg)
try: discard emitStatus(result, nil, outputHandler)
except: discard ""
finally:
if wksp != nil:
for f in wksp.openedFiles:
try: close(f)
except: discard ""


@@ -1,78 +0,0 @@
import os, osproc, streams, strtabs
from posix import kill
type HandleProcMsgCB* = proc (outMsg: TaintedString, errMsg: TaintedString, cmd: string): void
proc sendMsg*(h: HandleProcMsgCB, outMsg: TaintedString, errMsg: TaintedString = nil, cmd: string = "strawboss"): void =
if h != nil: h(outMsg, errMsg, cmd)
proc raiseEx*(reason: string): void =
raise newException(Exception, reason)
proc envToTable*(): StringTableRef =
result = newStringTable()
for k, v in envPairs():
result[k] = v
proc waitForWithOutput*(p: Process, msgCB: HandleProcMsgCB,
procCmd: string = ""):
tuple[output: TaintedString, error: TaintedString, exitCode: int] =
var pout = outputStream(p)
var perr = errorStream(p)
result = (TaintedString"", TaintedString"", -1)
var line = newStringOfCap(120).TaintedString
while true:
if pout.readLine(line):
msgCB.sendMsg(line, nil, procCmd)
result[0].string.add(line.string)
result[0].string.add("\n")
elif perr.readLine(line):
msgCB.sendMsg(nil, line, procCmd)
result[1].string.add(line.string)
result[1].string.add("\n")
else:
result[2] = peekExitCode(p)
if result[2] != -1: break
close(p)
proc exec*(command: string, workingDir: string = "",
args: openArray[string] = [], env: StringTableRef = nil,
options: set[ProcessOption] = {poUsePath},
msgCB: HandleProcMsgCB = nil):
tuple[output: TaintedString, error: TaintedString, exitCode: int]
{.tags: [ExecIOEffect, ReadIOEffect], gcsafe.} =
var p = startProcess(command, workingDir, args, env, options)
result = waitForWithOutput(p, msgCb, command)
proc loadEnv*(): StringTableRef =
result = newStringTable()
for k, v in envPairs():
result[k] = v
proc makeProcMsgHandler*(outSink, errSink: File): HandleProcMsgCB =
result = proc(outMsg, errMsg: TaintedString, cmd: string): void {.closure.} =
let prefix = if cmd != nil: cmd & ": " else: ""
if outMsg != nil: outSink.writeLine(prefix & outMsg)
if errMsg != nil: errSink.writeLine(prefix & errMsg)
proc makeProcMsgHandler*(outSink, errSink: Stream): HandleProcMsgCB =
result = proc(outMsg, errMsg: TaintedString, cmd: string): void {.closure.} =
let prefix = if cmd != nil: cmd & ": " else: ""
if outMsg != nil: outSink.writeLine(prefix & outMsg)
if errMsg != nil: errSink.writeLine(prefix & errMsg)
proc combineProcMsgHandlers*(a, b: HandleProcMsgCB): HandleProcMsgCB =
if a == nil: result = b
elif b == nil: result = a
else:
result = proc(cmd: string, outMsg, errMsg: TaintedString): void =
a(cmd, outMsg, errMsg)
b(cmd, outMsg, errMsg)


@@ -1,38 +0,0 @@
import asyncdispatch, jester, json, osproc, tempfile
import ./configuration, ./core, private/util
settings:
port = Port(8180)
type Worker = object
process*: Process
workingDir*: string
proc spawnWorker(req: RunRequest): Worker =
let dir = mkdtemp()
var args = @["run", req.projectName, req.stepName, "-r", req.buildRef, "-w", dir]
if req.forceRebuild: args.add("-f")
result = Worker(
process: startProcess("strawboss", ".", args, loadEnv(), {poUsePath}),
workingDir: dir)
proc start*(givenCfg: StrawBossConfig): void =
var workers: seq[Worker] = @[]
routes:
get "/api/ping":
resp $(%*"pong"), "application/json"
get "/api/projects":
resp $(%*[]), "application/json"
post "/api/project/@projectName/@stepName/run/@buildRef?":
workers.add(spawnWorker(RunRequest(
projectName: @"projectName",
stepName: @"stepName",
buildRef: if @"buildRef" != "": @"buildRef" else: nil,
forceRebuild: false))) # TODO support this with optional query params
runForever()


@@ -0,0 +1,326 @@
import cliutils, logging, json, os, sequtils, strtabs, strutils, tables, times,
unicode, uuids
from langutils import sameContents
from typeinfo import toAny
from strutils import parseEnum
const ISO_TIME_FORMAT = "yyyy-MM-dd'T'HH:mm:sszzz"
# Types
type
BuildState* {.pure.} = enum
complete, failed, queued, rejected, running, setup, stepComplete
BuildStatus* = object
runId*, details*, version*: string
state*: BuildState
Step* = object
containerImage*, name*, stepCmd*, workingDir*: string
artifacts*, cmdInput*, depends*, expectedEnv*: seq[string]
dontSkip*: bool
ProjectConfig* = object
containerImage*, name*, versionCmd*: string
steps*: Table[string, Step]
ProjectDef* = object
cfgFilePath*, defaultBranch*, name*, repo*: string
envVars*: StringTableRef
RunRequest* = object
runId*: UUID
projectName*, stepName*, buildRef*, workspaceDir*: string
timestamp*: DateTime
forceRebuild*: bool
Run* = object
id*: UUID
request*: RunRequest
status*: BuildStatus
RunLogs* = object
runId*: UUID
stdout*, stderr*: seq[string]
User* = object
name*: string
hashedPwd*: string
UserRef* = ref User
StrawBossConfig* = object
buildDataDir*: string
authSecret*: string
filePath*: string
debug*: bool
logLevel*: Level
pathToExe*: string
port*: int
projects*: seq[ProjectDef]
pwdCost*: int8
users*: seq[UserRef]
maintenancePeriod*: int
# Equality on custom types
proc `==`*(a, b: UserRef): bool = result = a.name == b.name
proc `==`*(a, b: ProjectDef): bool =
if a.envVars.len != b.envVars.len: return false
for k, v in a.envVars:
if not b.envVars.hasKey(k) or a.envVars[k] != b.envVars[k]: return false
return
a.name == b.name and
a.cfgFilePath == b.cfgFilePath and
a.defaultBranch == b.defaultBranch and
a.repo == b.repo
proc `==`*(a, b: StrawBossConfig): bool =
result =
a.buildDataDir == b.buildDataDir and
a.authSecret == b.authSecret and
a.pwdCost == b.pwdCost and
a.port == b.port and
a.maintenancePeriod == b.maintenancePeriod and
a.logLevel == b.logLevel and
sameContents(a.users, b.users) and
sameContents(a.projects, b.projects)
proc `==`*(a, b: RunRequest): bool =
result =
a.runId == b.runId and
a.projectName == b.projectName and
a.stepName == b.stepName and
a.buildRef == b.buildRef and
a.timestamp == b.timestamp and
a.workspaceDir == b.workspaceDir and
a.forceRebuild == b.forceRebuild
# Useful utilities
proc filesMatching*(pat: string): seq[string] = toSeq(walkFiles(pat))
proc raiseEx*(reason: string): void =
raise newException(Exception, reason)
# internal utils
proc getIfExists(n: JsonNode, key: string): JsonNode =
## convenience method to get a key from a JObject or return null
result = if n.hasKey(key): n[key]
else: newJNull()
proc getOrFail(n: JsonNode, key: string, objName: string = ""): JsonNode =
## convenience method to get a key from a JObject or raise an exception
if not n.hasKey(key): raiseEx objName & " missing key '" & key & "'"
return n[key]
# Configuration parsing code
proc parseLogLevel*(level: string): Level =
let lvlStr = "lvl" & toUpperAscii(level[0]) & level[1..^1]
result = parseEnum[Level](lvlStr)
proc parseProjectDef*(pJson: JsonNode): ProjectDef =
var envVars = newStringTable(modeCaseSensitive)
for k, v in pJson.getIfExists("envVars").getFields: envVars[k] = v.getStr("")
result = ProjectDef(
cfgFilePath: pJson.getIfExists("cfgFilePath").getStr("strawboss.json"),
defaultBranch: pJson.getIfExists("defaultBranch").getStr("master"),
name: pJson.getOrFail("name", "project definition").getStr,
envVars: envVars,
repo: pJson.getOrFail("repo", "project definition").getStr)
proc parseStrawBossConfig*(jsonCfg: JsonNode): StrawBossConfig =
var users: seq[UserRef] = @[]
for uJson in jsonCfg.getIfExists("users").getElems:
users.add(UserRef(
name: uJson.getOrFail("name", "user record").getStr,
hashedPwd: uJson.getOrFail("hashedPwd", "user record").getStr))
result = StrawBossConfig(
buildDataDir: jsonCfg.getIfExists("buildDataDir").getStr("build-data"),
authSecret: jsonCfg.getOrFail("authSecret", "strawboss config").getStr,
debug: jsonCfg.getIfExists("debug").getBool(false),
port: int(jsonCfg.getIfExists("port").getInt(8180)),
pwdCost: int8(jsonCfg.getOrFail("pwdCost", "strawboss config").getInt),
projects: jsonCfg.getIfExists("projects").getElems.mapIt(parseProjectDef(it)),
maintenancePeriod: int(jsonCfg.getIfExists("maintenancePeriod").getInt(10000)),
logLevel: parseLogLevel(jsonCfg.getIfExists("logLevel").getStr("info")),
users: users)
proc loadStrawBossConfig*(cfgFile: string): StrawBossConfig =
if not existsFile(cfgFile):
raiseEx "strawboss config file not found: " & cfgFile
result = parseStrawBossConfig(parseFile(cfgFile))
result.filePath = cfgFile
proc loadProjectConfig*(cfgFile: string): ProjectConfig =
if not existsFile(cfgFile):
raiseEx "project config file not found: " & cfgFile
let jsonCfg = parseFile(cfgFile)
if not jsonCfg.hasKey("steps"):
raiseEx "project configuration is missing steps definition"
var steps = initTable[string, Step]()
for sName, pJson in jsonCfg.getOrFail("steps", "project configuration").getFields:
steps[sName] = Step(
name: sName,
workingDir: pJson.getIfExists("workingDir").getStr("."),
stepCmd: pJson.getIfExists("stepCmd").getStr("NOT GIVEN"),
depends: pJson.getIfExists("depends").getElems.mapIt(it.getStr),
artifacts: pJson.getIfExists("artifacts").getElems.mapIt(it.getStr),
cmdInput: pJson.getIfExists("cmdInput").getElems.mapIt(it.getStr),
expectedEnv: pJson.getIfExists("expectedEnv").getElems.mapIt(it.getStr),
containerImage: pJson.getIfExists("containerImage").getStr(""),
dontSkip: pJson.getIfExists("dontSkip").getBool(false))
# cmdInput and stepCmd are related, so we have a conditional defaulting.
# Four possibilities:
if steps[sName].stepCmd == "NOT GIVEN" and steps[sName].cmdInput.len == 0:
# 1. Neither given: default to no-op
steps[sName].stepCmd = "true"
if steps[sName].stepCmd == "NOT GIVEN" and steps[sName].cmdInput.len > 0:
# 2. cmdInput given but not stepCmd: default stepCmd to "sh"
steps[sName].stepCmd = "sh"
# 3. stepCmd given but not cmdInput & 4. both given: use them as-is
result = ProjectConfig(
name: jsonCfg.getOrFail("name", "project configuration").getStr,
containerImage: jsonCfg.getIfExists("containerImage").getStr(""),
versionCmd: jsonCfg.getIfExists("versionCmd").getStr("git describe --tags --always"),
steps: steps)
proc parseBuildStatus*(statusJson: JsonNode): BuildStatus =
result = BuildStatus(
runId: statusJson.getOrFail("runId", "run ID").getStr,
state: parseEnum[BuildState](statusJson.getOrFail("state", "build status").getStr),
details: statusJson.getIfExists("details").getStr("") )
proc loadBuildStatus*(statusFile: string): BuildStatus =
if not existsFile(statusFile): raiseEx "status file not found: " & statusFile
let jsonObj = parseFile(statusFile)
result = parseBuildStatus(jsonObj)
proc parseRunRequest*(reqJson: JsonNode): RunRequest =
result = RunRequest(
runId: parseUUID(reqJson.getOrFail("runId", "RunRequest").getStr),
projectName: reqJson.getOrFail("projectName", "RunRequest").getStr,
stepName: reqJson.getOrFail("stepName", "RunRequest").getStr,
buildRef: reqJson.getOrFail("buildRef", "RunRequest").getStr,
workspaceDir: reqJson.getOrFail("workspaceDir", "RunRequest").getStr,
timestamp: times.parse(reqJson.getOrFail("timestamp", "RunRequest").getStr, ISO_TIME_FORMAT),
forceRebuild: reqJson.getOrFail("forceRebuild", "RunRequest").getBool)
proc loadRunRequest*(reqFilePath: string): RunRequest =
if not existsFile(reqFilePath):
raiseEx "request file not found: " & reqFilePath
parseRunRequest(parseFile(reqFilePath))
proc parseRun*(runJson: JsonNode): Run =
result = Run(
id: parseUUID(runJson.getOrFail("id", "Run").getStr),
request: parseRunRequest(runJson.getOrFail("request", "Run")),
status: parseBuildStatus(runJson.getOrFail("status", "Run")))
# TODO: can we use the marshal module for this?
proc `%`*(s: BuildStatus): JsonNode =
result = %* {
"runId": s.runId,
"state": $s.state,
"details": s.details }
proc `%`*(p: ProjectDef): JsonNode =
result = %* {
"name": p.name,
"cfgFilePath": p.cfgFilePath,
"defaultBranch": p.defaultBranch,
"repo": p.repo }
result["envVars"] = newJObject()
for k, v in p.envVars: result["envVars"][k] = %v
proc `%`*(s: Step): JsonNode =
result = %* {
"name": s.name,
"stepCmd": s.stepCmd,
"workingDir": s.workingDir,
"artifacts": s.artifacts,
"cmdInput": s.cmdInput,
"depends": s.depends,
"expectedEnv": s.expectedEnv,
"dontSkip": s.dontSkip }
if s.containerImage.len > 0:
result["containerImage"] = %s.containerImage
proc `%`*(p: ProjectConfig): JsonNode =
result = %* {
"name": p.name,
"versionCmd": p.versionCmd }
result["steps"] = newJObject()
for name, step in p.steps:
result["steps"][name] = %step
if p.containerImage.len > 0:
result["containerImage"] = %p.containerImage
proc `%`*(req: RunRequest): JsonNode =
result = %* {
"runId": $(req.runId),
"projectName": req.projectName,
"stepName": req.stepName,
"buildRef": req.buildRef,
"workspaceDir": req.workspaceDir,
"forceRebuild": req.forceRebuild,
"timestamp": req.timestamp.format(ISO_TIME_FORMAT) }
proc `%`*(user: User): JsonNode =
result = %* {
"name": user.name,
"hashedPwd": user.hashedPwd }
proc `%`*(cfg: StrawBossConfig): JsonNode =
result = %* {
"buildDataDir": cfg.buildDataDir,
"authSecret": cfg.authSecret,
"debug": cfg.debug,
"port": cfg.port,
"projects": %cfg.projects,
"pwdCost": cfg.pwdCost,
"maintenancePeriod": cfg.maintenancePeriod,
"logLevel": toLowerAscii(($cfg.logLevel)[3]) & ($cfg.logLevel)[4..^1],
"users": %cfg.users }
proc `%`*(run: Run): JsonNode =
result = %* {
"id": $run.id,
"request": %run.request,
"status": %run.status }
proc `%`*(logs: RunLogs): JsonNode =
result = %* {
"runId": $logs.runId,
"stdout": %logs.stdout,
"stderr": %logs.stderr }
proc `$`*(s: BuildStatus): string = result = pretty(%s)
proc `$`*(req: RunRequest): string = result = pretty(%req)
proc `$`*(pd: ProjectDef): string = result = pretty(%pd)
proc `$`*(cfg: StrawBossConfig): string = result = pretty(%cfg)
proc `$`*(run: Run): string = result = pretty(%run)
proc `$`*(logs: RunLogs): string = result = pretty(%logs)

View File

@@ -0,0 +1,618 @@
import cliutils, logging, json, os, ospaths, osproc, sequtils, streams,
strtabs, strutils, tables, tempfile, times, uuids
import ./configuration
import nre except toSeq
from posix import link, realpath
from algorithm import sorted
type
Workspace = ref object ## Data needed by internal build process
buildDataDir*: string ## absolute path to the global build data directory for this project
buildRef*: string ## git-style commit reference to the revision we are building
dir*: string ## absolute path to the working directory
env*: StringTableRef ## environment variables for all build processes
logLevel*: Level ## log level for output messages
openedFiles*: seq[File] ## all files that we have opened that need to be closed
outputHandler*: HandleProcMsgCB ## handler for process output
project*: ProjectConfig ## the project configuration
projectDef*: ProjectDef ## the StrawBoss project definition
runRequest*: RunRequest ## the RunRequest that initiated the current build
status*: BuildStatus ## the current status of the build
step*: Step ## the step we're building
version*: string ## project version as returned by versionCmd
Worker* = object
runId*: UUID
projectName*: string
process*: Process
NotFoundException* = object of Exception
proc newCopy(w: Workspace): Workspace =
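## Create an independent copy of this workspace so that a dependent step can
## be run without mutating the parent step's state (note that the env table
## is copied, not shared).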
var newEnv: StringTableRef = newStringTable()
newEnv[] = w.env[]
result = Workspace(
buildDataDir: w.buildDataDir,
buildRef: w.buildRef,
dir: w.dir,
env: newEnv,
logLevel: w.logLevel,
# workspaces are only responsible for files they have actually opened
openedFiles: @[],
outputHandler: w.outputHandler,
project: w.project,
projectDef: w.projectDef,
runRequest: w.runRequest,
status: w.status,
step: w.step,
version: w.version)
const WKSP_ROOT = "/strawboss/wksp"
const ARTIFACTS_ROOT = "/strawboss/artifacts"
proc execWithOutput(wksp: Workspace, cmd, workingDir: string,
args: openarray[string], env: StringTableRef,
options: set[ProcessOption] = {poUsePath},
msgCB: HandleProcMsgCB = nil):
tuple[output: TaintedString, error: TaintedString, exitCode: int]
{.tags: [ExecIOEffect, ReadIOEffect, RootEffect] .} =
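## Execute a command in the context of a workspace. If neither the step nor
## the project defines a container image the command is run directly on the
## host. Otherwise it is wrapped in `docker run`, with the workspace mounted
## at WKSP_ROOT and the environment passed via a temporary --env-file.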
# Look for a container image to use
let containerImage =
if wksp.step.containerImage.len > 0: wksp.step.containerImage
else: wksp.project.containerImage
if containerImage.len == 0:
return execWithOutput(cmd, workingDir, args, env, options, msgCB)
var fullEnv = newStringTable(modeCaseSensitive)
for k,v in env: fullEnv[k] = v
var fullArgs = @["run", "-w", WKSP_ROOT, "-v", wksp.dir & ":" & WKSP_ROOT ]
if wksp.step.name.len > 0:
for depStep in wksp.step.depends:
# Mount each dependency's artifacts into the container and point the
# corresponding <step>_DIR variable at the in-container path.
fullArgs.add(["-v", wksp.buildDataDir / "artifacts" / depStep /
wksp.version & ":" & ARTIFACTS_ROOT / depStep])
fullEnv[depStep & "_DIR"] = ARTIFACTS_ROOT / depStep
let envFile = mkstemp().name
writeFile(envFile, toSeq(fullEnv.pairs()).mapIt(it[0] & "=" & it[1]).join("\n"))
fullArgs.add(["--env-file", envFile])
fullArgs.add(containerImage)
fullArgs.add(cmd)
echo "Executing docker command: \n\t" & "docker " & $(fullArgs & @args)
return execWithOutput("docker", wksp.dir, fullArgs & @args, fullEnv, options, msgCB)
proc exec(w: Workspace, cmd, workingDir: string, args: openarray[string],
env: StringTableRef, options: set[ProcessOption] = {poUsePath},
msgCB: HandleProcMsgCB = nil): int
{.tags: [ExecIOEffect, ReadIOEffect, RootEffect] .} =
return execWithOutput(w, cmd, workingDir, args, env, options, msgCB)[2]
# Utility methods for Workspace activities
proc sendStatusMsg(oh: HandleProcMsgCB, status: BuildStatus): void =
if not oh.isNil:
oh.sendMsg($status.state & ": " & status.details, "", "strawboss")
proc sendMsg(w: Workspace, msg: TaintedString): void =
w.outputHandler.sendMsg(msg, "", "strawboss")
proc sendMsg(w: Workspace, l: Level, msg: TaintedString): void =
if l >= w.logLevel: w.sendMsg(msg)
proc sendErrMsg(w: Workspace, msg: TaintedString): void =
w.outputHandler.sendMsg("", msg, "strawboss")
proc sendErrMsg(w: Workspace, l: Level, msg: TaintedString): void =
if l >= w.logLevel: w.sendErrMsg(msg)
proc resolveEnvVars(wksp: Workspace, line: string): string =
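## Substitute $VAR and ${VAR} references in `line` with values from the
## workspace environment. For example, with env["VERSION"] set to "1.0.0",
## "app-$VERSION.zip" becomes "app-1.0.0.zip". Unrecognized variable
## references are left as-is.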
result = line
for found in line.findAll(re"\$\w+|\$\{[^}]+\}"):
let key = if found[1] == '{': found[2..^2] else: found[1..^1]
if wksp.env.hasKey(key): result = result.replace(found, wksp.env[key])
wksp.sendMsg(lvlDebug, "Variable substitution: \n\t" & line &
"\n\t" & result)
proc publishStatus(wksp: Workspace, state: BuildState, details: string): void =
## Update the status for a Workspace and publish this status to the
## Workspace's status file and any output message handlers.
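## The status is written to the following locations (names illustrative):
##   <buildDataDir>/runs/<runId>.status.json
##   <buildDataDir>/status/<stepName>/<version>.json  (once a step is set)
##   <buildDataDir>/status/<stepName>/<buildRef>.json (when ref != version)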
wksp.status = BuildStatus(
runId: $wksp.runRequest.runId,
state: state,
details: details,
version: wksp.version)
# Write to our run directory, and to our version status
writeFile(wksp.buildDataDir / "runs" /
$wksp.runRequest.runId & ".status.json", $wksp.status)
# If we have our step we can save status to the step status
if wksp.step.name.len > 0:
let stepStatusDir = wksp.buildDataDir / "status" / wksp.step.name
if not existsDir(stepStatusDir): createDir(stepStatusDir)
writeFile(stepStatusDir / wksp.version & ".json", $wksp.status)
# If we were asked to build a ref that is not the version directly (like
# "master" or something), then let's also save our status under that name.
# We're probably overwriting a prior status, but that's OK.
if wksp.runRequest.buildRef != wksp.version:
writeFile(stepStatusDir / wksp.runRequest.buildRef & ".json",
$wksp.status)
wksp.outputHandler.sendStatusMsg(wksp.status)
proc ensureProjectDirsExist(cfg: StrawBossConfig, p: ProjectDef): void =
for subdir in ["configurations", "runs", "status", "artifacts"]:
let fullPath = cfg.buildDataDir / p.name / subdir
if not existsDir(fullPath):
createDir(fullPath)
# Data and configuration access
proc getProject*(cfg: StrawBossConfig, projectName: string): ProjectDef =
## Get a project definition by name from the service configuration
let candidates = cfg.projects.filterIt(it.name == projectName)
if candidates.len == 0:
raise newException(NotFoundException, "no project named " & projectName)
elif candidates.len > 1:
raise newException(NotFoundException, "multiple projects named " & projectName)
else: result = candidates[0]
proc setProject*(cfg: var StrawBossConfig, projectName: string, newDef: ProjectDef): void =
## Add a project definition to the service configuration
var found = false
for idx in 0..<cfg.projects.len:
if cfg.projects[idx].name == projectName:
cfg.projects[idx] = newDef
found = true
break
if not found: cfg.projects.add(newDef)
proc listVersions*(cfg: StrawBossConfig, projectName: string): seq[string] =
## List the versions that have been built for a project.
let project = cfg.getProject(projectName)
ensureProjectDirsExist(cfg, project)
let versionFiles = filesMatching(
cfg.buildDataDir / project.name / "configurations/*.json")
result = versionFiles.map(proc(s: string): string =
let slashIdx = s.rfind('/')
result = s[(slashIdx + 1)..^6])
proc getBuildStatus*(cfg: StrawBossConfig,
projectName, stepName, buildRef: string): BuildStatus =
let project = cfg.getProject(projectName)
let statusFile = cfg.buildDataDir / project.name / "status" /
stepName / buildRef & ".json"
if not existsFile(statusFile):
raise newException(NotFoundException,
stepName & " has never been built for " & projectName & "@" & buildRef)
result = loadBuildStatus(statusFile)
proc listArtifacts*(cfg: StrawBossConfig,
projectName, stepName, version: string): seq[string] =
## List the artifacts that have been built for a step.
let project = cfg.getProject(projectName)
ensureProjectDirsExist(cfg, project)
let buildStatus = cfg.getBuildStatus(projectName, stepName, version)
if buildStatus.state != BuildState.complete:
raise newException(NotFoundException, "step " & stepName &
" has never been successfully built for " & projectName & "@" & version)
result = filesMatching(
cfg.buildDataDir / project.name / "artifacts" / stepName / version / "*")
.mapIt(it.extractFilename)
proc getArtifactPath*(cfg: StrawBossConfig,
projectName, stepName, version, artifactName: string): string =
let artifacts = cfg.listArtifacts(projectName, stepName, version)
if not artifacts.contains(artifactName):
raise newException(NotFoundException, "no artifact named " &
artifactName & " exists for step " & stepName & " in project " &
projectName & "@" & version)
result = cfg.buildDataDir / projectName / "artifacts" / stepName / version / artifactName
proc existsRun*(cfg: StrawBossConfig, projectName, runId: string): bool =
existsFile(cfg.buildDataDir / projectName / "runs" / runId & ".request.json")
proc getRun*(cfg: StrawBossConfig, projectName, runId: string): Run =
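## Load the persisted request and status for a single run of a project.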
let project = cfg.getProject(projectName)
let runsPath = cfg.buildDataDir / project.name / "runs"
try: result = Run(
id: parseUUID(runId),
request: loadRunRequest(runsPath / runId & ".request.json"),
status: loadBuildStatus(runsPath / runId & ".status.json"))
except: raiseEx "unable to load run information for id " & runId
proc listRuns*(cfg: StrawBossConfig, projectName: string): seq[Run] =
## List the runs that have been performed for a project.
let project = cfg.getProject(projectName)
ensureProjectDirsExist(cfg, project)
let runsPath = cfg.buildDataDir / project.name / "runs"
let reqPaths = filesMatching(runsPath / "*.request.json")
result = reqPaths.map(proc(reqPath: string): Run =
let runId = reqPath[(runsPath.len + 1)..^14]
result = Run(
id: parseUUID(runId),
request: loadRunRequest(reqPath),
status: loadBuildStatus(runsPath / runId & ".status.json")))
proc getLogs*(cfg: StrawBossConfig, projectname, runId: string): RunLogs =
let project = cfg.getProject(projectName)
let runsPath = cfg.buildDataDir / project.name / "runs"
try: result = RunLogs(
runId: parseUUID(runId),
stdout: toSeq(lines(runsPath / runId & ".stdout.log")),
stderr: toSeq(lines(runsPath / runId & ".stderr.log")))
except: raiseEx "unable to load logs for run " & runId
proc getProjectConfig*(cfg: StrawBossConfig,
projectName, version: string): ProjectConfig =
let project = cfg.getProject(projectName)
ensureProjectDirsExist(cfg, project)
# If they didn't give us a version, try to figure out the latest one.
var confFilePath: string
if version.len == 0:
let candidatePaths = filesMatching(
cfg.buildDataDir / project.name / "configurations/*.json")
if candidatePaths.len == 0:
raise newException(NotFoundException,
"no versions of this project have been built")
let modTimes = candidatePaths.mapIt(it.getLastModificationTime)
# Sort newest-first by modification time and take the most recent.
confFilePath = sorted(zip(candidatePaths, modTimes),
proc(a, b: tuple): int = cmp(b.b, a.b))[0].a
# If they did, let's try to load that
else:
confFilePath =
cfg.buildDataDir / project.name / "configurations" / version & ".json"
if not existsFile(confFilePath):
raise newException(NotFoundException,
projectName & " version " & version & " has never been built")
result = loadProjectConfig(confFilePath)
# Internal working methods.
proc setupProject(wksp: Workspace) =
wksp.sendMsg(lvlDebug, "Setting up project.")
# Clone the project into the $temp directory
let cloneArgs = @["clone", wksp.projectDef.repo, wksp.dir]
wksp.sendMsg(lvlDebug, "git " & $cloneArgs)
let cloneResult = exec("git", ".", cloneArgs, wksp.env, {poUsePath},
wksp.outputHandler)
if cloneResult != 0:
raiseEx "unable to clone repo for '" & wksp.projectDef.name & "'"
# Checkout the requested ref
let checkoutArgs = @["checkout", wksp.buildRef]
wksp.sendMsg(lvlDebug, "git " & $checkoutArgs)
let checkoutResult = exec("git", wksp.dir, checkoutArgs,
wksp.env, {poUsePath}, wksp.outputHandler)
if checkoutResult != 0:
raiseEx "unable to checkout ref " & wksp.buildRef &
" for '" & wksp.projectDef.name & "'"
# Find the strawboss project configuration
let projCfgFile = wksp.dir / wksp.projectDef.cfgFilePath
wksp.sendMsg(lvlDebug, "Looking for project configuration at '" & projCfgFile & "'")
if not existsFile(projCfgFile):
raiseEx "Cannot find strawboss project configuration in the project " &
"repo (expected at '" & wksp.projectDef.cfgFilePath & "')."
wksp.project = loadProjectConfig(projCfgFile)
# Merge in the project-defined env vars
for k, v in wksp.projectDef.envVars: wksp.env[k] = v
# Get the build version
let versionResult = execWithOutput(
wksp.project.versionCmd, # command
wksp.dir, # working dir
[], # args
wksp.env, # environment
{poUsePath, poEvalCommand}) # options
if versionResult.exitCode != 0:
raiseEx "Version command (" & wksp.project.versionCmd &
") returned non-zero exit code."
wksp.version = versionResult.output.strip
wksp.env["VERSION"] = wksp.version
proc doStep*(wksp: Workspace, step: Step): BuildStatus =
## Lower-level method to execute a given step within the context of a project
## workspace that is setup and configured. May be called recursively to
## satisfy step dependencies.
wksp.step = step
let artifactsDir = wksp.buildDataDir / "artifacts" / step.name / wksp.version
if not existsDir(artifactsDir): createDir(artifactsDir)
# Have we tried to build this before and are we caching the results?
let statusFilePath = wksp.buildDataDir / "status" / step.name /
wksp.version & ".json"
if existsFile(statusFilePath) and not step.dontSkip:
let prevStatus = loadBuildStatus(statusFilePath)
# If we succeeded last time, no need to rebuild
if prevStatus.state == BuildState.complete:
wksp.publishStatus(BuildState.stepComplete,
"Skipping step '" & step.name & "' for version '" & wksp.version &
"': already completed.")
return wksp.status
else:
wksp.sendMsg(
"Rebuilding failed step '" & step.name & "' for version '" &
wksp.version & "'.")
let SB_EXPECTED_VARS = ["VERSION"]
wksp.publishStatus(BuildState.running,
"running '" & step.name & "' for version " & wksp.version &
" from " & wksp.buildRef)
# Ensure all expected environment variables are present.
for k in (step.expectedEnv & @SB_EXPECTED_VARS):
if not wksp.env.hasKey(k):
raiseEx "step " & step.name & " failed: missing required env variable: " & k
# Ensure that artifacts in steps we depend on are present
# TODO: detect circular-references in dependency trees.
for dep in step.depends:
if not wksp.project.steps.hasKey(dep):
raiseEx step.name & " depends on " & dep &
" but there is no step named " & dep
let depStep = wksp.project.steps[dep]
# Run that step (may get skipped)
let runStatus = doStep(core.newCopy(wksp), depStep)
if not (runStatus.state == BuildState.stepComplete):
raiseEx "dependent step failed: " & depStep.name
wksp.sendMsg(lvlDebug, "dependent step '" & depStep.name &
"'completed, resuming '" & wksp.step.name & "'")
# Add the artifacts directory for the dependent step to our env so that
# further steps can reference it via $<stepname>_DIR
wksp.env[depStep.name & "_DIR"] = wksp.buildDataDir / "artifacts" /
dep / wksp.version
# Run the step command, piping in cmdInput
let stepCmd = wksp.resolveEnvVars(step.stepCmd)
let cmdName = if stepCmd.rfind("/") >= 0: stepCmd[(stepCmd.rfind("/") + 1)..^1]
else: stepCmd
wksp.sendMsg step.name & ": starting stepCmd: " & stepCmd
let cmdProc = startProcess(stepCmd,
wksp.dir / step.workingDir, [], wksp.env, {poUsePath, poEvalCommand})
let cmdInStream = inputStream(cmdProc)
# Replace env variables in step cmdInput as we pipe it in
for line in step.cmdInput: cmdInStream.writeLine(wksp.resolveEnvVars(line))
cmdInStream.flush()
cmdInStream.close()
let cmdResult = waitFor(cmdProc, wksp.outputHandler, cmdName)
if cmdResult != 0:
raiseEx "step " & step.name & " failed: step command returned non-zero exit code"
# Gather the output artifacts (if we have any)
wksp.sendMsg "artifacts: " & $step.artifacts
if step.artifacts.len > 0:
for a in step.artifacts:
let artifactPath = wksp.resolveEnvVars(a)
let artifactName = artifactPath[(artifactPath.rfind("/")+1)..^1]
try:
wksp.sendMsg "copy " &
wksp.dir / step.workingDir / artifactPath & " -> " &
artifactsDir / artifactName
copyFileWithPermissions(wksp.dir / step.workingDir / artifactPath,
artifactsDir / artifactName)
except:
raiseEx "step " & step.name & " failed: unable to copy artifact " &
artifactPath & ":\n" & getCurrentExceptionMsg()
wksp.publishStatus(BuildState.stepComplete, "step " & step.name & " complete")
result = wksp.status
proc run*(cfg: StrawBossConfig, req: RunRequest,
outputHandler: HandleProcMsgCB = nil): BuildStatus =
## Execute a RunRequest given the StrawBoss configuration. This is the main
## entrypoint to running a build step.
result = BuildStatus(
runId: $req.runId,
state: BuildState.setup,
details: "initializing build workspace",
version: "")
outputHandler.sendStatusMsg(result)
var wksp: Workspace
try:
# Find the project definition
let projectDef = cfg.getProject(req.projectName)
# Make sure the build data directories for this project exist.
ensureProjectDirsExist(cfg, projectDef)
# Update our run status
let runDir = cfg.buildDataDir / projectDef.name / "runs"
writeFile(runDir / $req.runId & ".status.json", $result)
# Read in the existing system environment
var env = loadEnv()
env["GIT_DIR"] = ".git"
# Make sure we have a workspace directory
assert req.workspaceDir.isAbsolute
if not existsDir(req.workspaceDir): createDir(req.workspaceDir)
# Setup our STDOUT and STDERR files
let stdoutFile = open(runDir / $req.runId & ".stdout.log", fmWrite)
let stderrFile = open(runDir / $req.runId & ".stderr.log", fmWrite)
let logFilesOH = makeProcMsgHandler(stdoutFile, stderrFile)
wksp = Workspace(
buildDataDir: cfg.buildDataDir / projectDef.name,
buildRef:
if req.buildRef.len > 0: req.buildRef
else: projectDef.defaultBranch,
dir: req.workspaceDir,
env: env,
logLevel: cfg.logLevel,
openedFiles: @[stdoutFile, stderrFile],
outputHandler: combineProcMsgHandlers(outputHandler, logFilesOH),
project: ProjectConfig(),
projectDef: projectDef,
runRequest: req,
status: result,
step: Step(),
version: "")
except:
when not defined(release): echo getCurrentException().getStackTrace()
result = BuildStatus(runId: $req.runId, state: BuildState.failed,
details: getCurrentExceptionMsg(), version: "")
try: outputHandler.sendStatusMsg(result)
except: discard ""
return
try:
# Clone the repo and setup the working environment
wksp.publishStatus(BuildState.setup,
"cloning project repo and preparing to run '" & req.stepName & "'")
wksp.setupProject()
# Update our cache of project configurations.
# TODO: what happens if this fails?
copyFileWithPermissions(
wksp.dir / wksp.projectDef.cfgFilePath,
wksp.buildDataDir / "configurations" / wksp.version & ".json")
# Find the requested step
if not wksp.project.steps.hasKey(req.stepName):
raiseEx "no step name '" & req.stepName & "' for " & req.projectName
var step = wksp.project.steps[req.stepName]
if req.forceRebuild: step.dontSkip = true
var buildStatus = doStep(wksp, step)
if buildStatus.state == BuildState.stepComplete:
buildStatus.state = BuildState.complete
wksp.publishStatus(buildStatus.state, "all steps complete")
result = wksp.status
except:
when not defined(release): echo getCurrentException().getStackTrace()
let msg = getCurrentExceptionMsg()
try:
wksp.publishStatus(BuildState.failed, msg)
result = wksp.status
except:
result = BuildStatus(runId: $req.runId, state: BuildState.failed,
details: msg, version: "")
try: outputHandler.sendStatusMsg(result)
except: discard ""
finally:
if wksp != nil:
# Close open files
for f in wksp.openedFiles:
try: close(f)
except: discard ""
proc spawnWorker*(cfg: StrawBossConfig, req: RunRequest):
tuple[status: BuildStatus, worker: Worker] =
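## Persist the run request, record it as queued, and launch a separate
## strawboss process (`strawboss run <reqFile> -c <cfgFile>`) to execute it,
## returning the queued status and a handle to the worker process.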
# Find the project definition (will throw appropriate exceptions)
let projectDef = cfg.getProject(req.projectName)
let runDir = cfg.buildDataDir / projectDef.name / "runs"
let reqFile = runDir / $req.runId & ".request.json"
let statusFile = runDir / $req.runId & ".status.json"
try:
# Make sure the build data directories for this project exist.
ensureProjectDirsExist(cfg, projectDef)
# Save the run request
writeFile(reqFile, $req)
# Write the initial build status (queued).
let queuedStatus = BuildStatus(
runId: $req.runId,
state: BuildState.queued,
details: "request queued for execution",
version: "")
writeFile(statusFile, $queuedStatus)
var args = @["run", reqFile, "-c", cfg.filePath]
debug "Launching worker: " & cfg.pathToExe & " " & args.join(" ")
let worker = Worker(
runId: req.runId,
projectName: projectDef.name,
process: startProcess(cfg.pathToExe, ".", args, loadEnv(), {poUsePath}))
result = (queuedStatus, worker)
except:
let exMsg = "run request rejected: " & getCurrentExceptionMsg()
try:
writeFile(statusFile,
$(BuildStatus(runId: $req.runId, state: BuildState.rejected,
details: exMsg, version: "")))
except: discard ""
raiseEx exMsg

View File

@@ -0,0 +1,469 @@
import asyncdispatch, bcrypt, cliutils, jester, json, jwt, logging, md5,
options, os, osproc, sequtils, strutils, tempfile, times, unittest, uuids
from mimetypes import getMimeType
from asyncfile import openAsync, readToStream, close
from asyncnet import send
from re import re, find
from timeutils import trimNanoSec
import ./configuration, ./core, ./version
type
Session = object
user*: UserRef
issuedAt*, expires*: Time
#const ISO_TIME_FORMAT = "yyyy-MM-dd'T'HH:mm:ss"
const JSON = "application/json"
proc newSession*(user: UserRef): Session =
result = Session(
user: user,
issuedAt: getTime().local.trimNanoSec.toTime,
expires: daysForward(7).trimNanoSec.toTime)
template halt(code: HttpCode,
headers: RawHeaders,
content: string): typed =
## Immediately replies with the specified response. This means any further
## code will not be executed after calling this template in the current
## route.
bind TCActionSend, newHttpHeaders
result[0] = CallbackAction.TCActionSend
result[1] = code
result[2] = some(headers)
result[3] = content
result.matched = true
break allRoutes
template jsonResp(code: HttpCode, details: string = "", headers: RawHeaders = @{:} ) =
halt(
code,
headers & @{"Content-Type": JSON},
$(%* {
"statusCode": code.int,
"status": $code,
"details": details
})
)
template json500Resp(ex: ref Exception, details: string = ""): void =
when not defined(release): debug ex.getStackTrace()
error details & ":\n" & ex.msg
jsonResp(Http500)
proc toJWT*(cfg: StrawBossConfig, session: Session): string =
## Make a JWT token for this session.
var jwt = JWT(
header: JOSEHeader(alg: HS256, typ: "jwt"),
claims: toClaims(%*{
"sub": session.user.name,
"iat": session.issuedAt.toUnix.int,
"exp": session.expires.toUnix.int }))
jwt.sign(cfg.authSecret)
result = $jwt
proc fromJWT*(cfg: StrawBossConfig, strTok: string): Session =
## Validate a given JWT and extract the session data.
let jwt = toJWT(strTok)
var secret = cfg.authSecret
if not jwt.verify(secret): raiseEx "Unable to verify auth token."
jwt.verifyTimeClaims()
# Find the user record (if authenticated)
let username = jwt.claims["sub"].node.str
let users = cfg.users.filterIt(it.name == username)
if users.len != 1: raiseEx "Could not find session user."
result = Session(
user: users[0],
issuedAt: fromUnix(jwt.claims["iat"].node.num),
expires: fromUnix(jwt.claims["exp"].node.num))
proc extractSession(cfg: StrawBossConfig, request: Request): Session =
## Helper to extract a session from a request.
# Find the auth header
if not request.headers.hasKey("Authorization"):
raiseEx "No auth token."
# Read and verify the JWT token
let headerVal = request.headers["Authorization"]
if not headerVal.startsWith("Bearer "):
raiseEx "Invalid Authentication type (only 'Bearer' is supported)."
result = fromJWT(cfg, headerVal[7..^1])
proc hashPwd*(pwd: string, cost: int8): string =
let salt = genSalt(cost)
result = hash(pwd, salt)
proc validatePwd*(u: UserRef, givenPwd: string): bool =
let salt = u.hashedPwd[0..28] # a bcrypt hash embeds its parameters and salt in the first 29 characters
result = compare(u.hashedPwd, hash(givenPwd, salt))
proc makeAuthToken*(cfg: StrawBossConfig, uname, pwd: string): string =
## Given a username and pwd, validate the combination and generate a JWT
## token string.
if uname.len == 0 or pwd.len == 0:
raiseEx "fields 'username' and 'password' required"
# find the user record
let users = cfg.users.filterIt(it.name == uname)
if users.len != 1: raiseEx "invalid username or password"
let user = users[0]
if not validatePwd(user, pwd): raiseEx "invalid username or password"
let session = newSession(user)
result = toJWT(cfg, session)
proc makeApiKey*(cfg: StrawBossConfig, uname: string): string =
## Given a username, make an API token (JWT token string that does not
## expire). Note that this does not validate the username/pwd combination. It
## is not intended to be exposed publicly via the API, but to serve as a
## utility function for an administrator to set up an unsupervised account
## (git access, for example).
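##
## A minimal usage sketch (assuming a user named "ci-bot" exists in
## cfg.users; the resulting key should be treated like a password):
##   let apiKey = cfg.makeApiKey("ci-bot")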
if uname.len == 0: raiseEx "no username given"
# find the user record
let users = cfg.users.filterIt(it.name == uname)
if users.len != 1: raiseEx "invalid username"
let session = Session(
user: users[0],
issuedAt: getTime(),
expires: daysForward(365 * 1000).toTime())
result = toJWT(cfg, session)
template checkAuth() =
## Check this request for authentication and authorization information.
## Injects the session into the running context. If the request is not
## authorized, this template returns an appropriate 401 response.
var session {.inject.}: Session
try: session = extractSession(cfg, request)
except:
debug "Auth failed: " & getCurrentExceptionMsg()
jsonResp(Http401, "Unauthorized", @{"WWW-Authenticate": "Bearer"})
proc start*(cfg: StrawBossConfig): void =
var stopFuture = newFuture[void]()
var workers: seq[Worker] = @[]
settings:
port = Port(cfg.port)
appName = "/api"
routes:
get "/version":
resp($(%("strawboss v" & SB_VERSION)), JSON)
post "/auth-token":
var uname, pwd: string
try:
let jsonBody = parseJson(request.body)
uname = jsonBody["username"].getStr
pwd = jsonBody["password"].getStr
except: jsonResp(Http400)
try:
let authToken = makeAuthToken(cfg, uname, pwd)
resp($(%authToken), JSON)
except:
if cfg.debug: echo getStackTrace()
jsonResp(Http401, getCurrentExceptionMsg())
get "/verify-auth":
checkAuth()
resp(Http200, $(%*{ "username": session.user.name }), JSON)
get "/projects":
## List project summaries (ProjectDefs only)
checkAuth()
resp($(%cfg.projects), JSON)
post "/projects":
## Create a new project definition
checkAuth()
# TODO
jsonResp(Http501)
get "/project/@projectName":
## Return a project's configuration, as well as its versions.
checkAuth()
# Make sure we know about that project
var projDef: ProjectDef
try: projDef = cfg.getProject(@"projectName")
except:
try: raise getCurrentException()
except NotFoundException:
jsonResp(Http404, getCurrentExceptionMsg())
except:
let msg = "unable to load project definition for project " & @"projectName"
json500Resp(getCurrentException(), msg)
var projConf: ProjectConfig
try: projConf = getProjectConfig(cfg, @"projectName", "")
except: discard ""
let respJson = newJObject()
respJson["definition"] = %projDef
respJson["versions"] = %listVersions(cfg, @"projectName")
if projConf.name.len > 0:
respJson["latestConfig"] = %projConf
resp(pretty(respJson), JSON)
get "/project/@projectName/versions":
## Get a list of all versions that we have built
checkAuth()
try: resp($(%listVersions(cfg, @"projectName")), JSON)
except:
try: raise getCurrentException()
except NotFoundException:
jsonResp(Http404, getCurrentExceptionMsg())
except:
let msg = "unable to list versions for project " & @"projectName"
json500Resp(getCurrentException(), msg)
get "/project/@projectName/version/@version?":
## Get a detailed project record including step definitions (ProjectConfig).
checkAuth()
# Make sure we know about that project
try: resp($(%getProjectConfig(cfg, @"projectName", @"version")), JSON)
except: jsonResp(Http404, getCurrentExceptionMsg())
get "/project/@projectName/runs":
## List all runs
checkAuth()
try: resp($(%listRuns(cfg, @"projectName")), JSON)
except: jsonResp(Http404, getCurrentExceptionMsg())
get "/project/@projectName/runs/active":
## List all currently active runs
checkAuth()
try:
let activeRuns = workers
.filterIt(it.process.running and it.projectName == @"projectName")
.mapIt(cfg.getRun(@"projectName", $it.runId))
resp($(%activeRuns), JSON)
except NotFoundException:
jsonResp(Http404, getCurrentExceptionMsg())
except:
json500Resp(getCurrentException(), "problem loading active runs")
get "/project/@projectName/run/@runId":
## Details for a specific run
checkAuth()
# Make sure we know about that project
try: discard cfg.getProject(@"projectName")
except: jsonResp(Http404, getCurrentExceptionMsg())
if not existsRun(cfg, @"projectName", @"runId"):
jsonResp(Http404, "no such run for project")
try: resp($getRun(cfg, @"projectName", @"runId"), JSON)
except:
json500Resp(getCurrentException(),
"unable to load run details for project " & @"projectName" &
" run " & @"runId")
get "/project/@projectName/run/@runId/logs":
## Get logs from a specific run
checkAuth()
try: discard cfg.getProject(@"projectName")
except:
jsonResp(Http404, getCurrentExceptionMsg())
if not existsRun(cfg, @"projectName", @"runId"):
jsonResp(Http404, "no such run for project")
try: resp($getLogs(cfg, @"projectName", @"runId"))
except:
json500Resp(getCurrentException(),
"unable to load run logs for " & @"projectName" & " run " & @"runId")
get "/project/@projectName/step/@stepName/artifacts/@version":
## Get the list of artifacts that were built for a step at a given version.
checkAuth()
debug "Matched artifacts list request: " & $(%*{
"project": @"projectName",
"step": @"stepName",
"version": @"version"
})
try: resp($(%listArtifacts(cfg, @"projectName", @"stepName", @"version")), JSON)
except:
try: raise getCurrentException()
except NotFoundException:
jsonResp(Http404, getCurrentExceptionMsg())
except:
json500Resp(getCurrentException(), "unable to list artifacts for " &
@"projectName" & ":" & @"stepName" & "@" & @"buildRef")
get "/project/@projectName/step/@stepName/artifact/@version/@artifactName":
## Get a specific artifact that was built.
checkAuth()
var artifactPath: string
try: artifactPath = getArtifactPath(cfg,
@"projectName", @"stepName", @"version", @"artifactName")
except:
try: raise getCurrentException()
except NotFoundException:
jsonResp(Http404, getCurrentExceptionMsg())
except:
json500Resp(getCurrentException(), "unable to check artifact path for " &
@"projectName" & ":" & @"stepName" & "@" & @"version")
enableRawMode
debug "Preparing: " & artifactPath
let fileSize = getFileSize(artifactPath)
let mimetype = request.settings.mimes.getMimetype(artifactPath.splitFile.ext[1 .. ^1])
if fileSize < 10_000_000: # 10 mb
var file = readFile(artifactPath)
var hashed = getMD5(file)
# If the user has a cached version of this file and it matches our
# version, let them use it
if request.headers.hasKey("If-None-Match") and request.headers["If-None-Match"] == hashed:
resp(Http304)
else:
resp(Http200, [
("Content-Disposition", "; filename=\"" & @"artifactName" & "\""),
("Content-Type", mimetype),
("ETag", hashed )], file)
else:
let headers = @{
"Content-Disposition": "; filename=\"" & @"artifactName" & "\"",
"Content-Type": mimetype,
"Content-Length": $fileSize
}
request.sendHeaders(Http200, headers)
var fileStream = newFutureStream[string]("sendStaticIfExists")
var file = openAsync(artifactPath, fmRead)
# Let `readToStream` write file data into fileStream in the
# background.
asyncCheck file.readToStream(fileStream)
# Read values from the stream as they become available and send each
# chunk on to the client until the stream is exhausted.
while true:
let (hasValue, value) = await fileStream.read()
if hasValue: request.send(value)
else: break
file.close()
get "/project/@projectName/step/@stepName/status/@buildRef":
## Get detailed information about the status of a step (assuming it has been built)
checkAuth()
try: resp($cfg.getBuildStatus(@"projectName", @"stepName", @"buildRef"), JSON)
except:
try: raise getCurrentException()
except NotFoundException: jsonResp(Http404, getCurrentExceptionMsg())
except:
json500Resp(getCurrentException(), "unable to load the build state for " &
@"projectName" & ":" & @"stepName" & "@" & @"buildRef")
#get "/project/@projectName/step/@stepName/status/@buildRef.svg":
## Get an image representing the status of a build
## TODO: how do we want to handle auth for this? Unlike
#checkAuth(): if not authed: return true
post "/project/@projectName/step/@stepName/run/@buildRef?":
# Kick off a run
checkAuth()
let runRequest = RunRequest(
runId: genUUID(),
projectName: @"projectName",
stepName: @"stepName",
buildRef: if @"buildRef" != "": @"buildRef" else: "",
timestamp: getTime().local,
forceRebuild: false) # TODO support this with optional query params
# TODO: instead of immediately spawning a worker, add the request to a
# queue to be picked up by a worker. Allows capping the number of worker
# processes, distributing, etc.
try:
let (status, worker) = spawnWorker(cfg, runRequest)
workers.add(worker)
resp($Run(
id: runRequest.runId,
request: runRequest,
status: status), JSON)
except:
try: raise getCurrentException()
except NotFoundException: jsonResp(Http404, getCurrentExceptionMsg())
except: jsonResp(Http400, getCurrentExceptionMsg())
post "/service/debug/stop":
if not cfg.debug: jsonResp(Http404)
else:
let shutdownFut = sleepAsync(100)
shutdownFut.callback = proc(): void = complete(stopFuture)
resp($(%"shutting down"), JSON)
get re".*":
jsonResp(Http404, "URL [" & request.path & "] is not present on this server.")
post re".*":
jsonResp(Http404)
proc performMaintenance(cfg: StrawBossConfig): void =
# Prune workers
workers = workers.filterIt(it.process.running())
debug "Performing maintanance: " & $len(workers) & " active workers after pruning."
let fut = sleepAsync(cfg.maintenancePeriod)
fut.callback =
proc(): void =
callSoon(proc(): void = performMaintenance(cfg))
info "StrawBoss is bossing people around."
callSoon(proc(): void = performMaintenance(cfg))
waitFor(stopFuture)

View File

@@ -0,0 +1,2 @@
const SB_VERSION* = "0.5.1"

View File

@@ -0,0 +1,9 @@
[Unit]
Description=StrawBoss build server.
[Service]
Type=simple
User=strawboss
WorkingDirectory=/home/strawboss
ExecStart=/home/strawboss/strawboss
Restart=on-failure

View File

@@ -0,0 +1,18 @@
{
"name": "dummy-project",
"versionCmd": "git describe --all --always",
"containerImage": "ubuntu",
"steps": {
"build": {
"containerImage": "alpine",
"depends": ["test"],
"workingDir": "dir1",
"stepCmd": "cust-build",
"artifacts": ["bin1", "doc1"],
"expectedEnv": ["VAR1"],
"dontSkip": true,
"cmdInput": ["test", "this"]
},
"test": { }
}
}

View File

@@ -0,0 +1,20 @@
{
"artifactsRepo": "artifacts",
"authSecret": "change me",
"debug": true,
"users": [
{ "name": "bob@builder.com", "hashedPwd": "$2a$11$lVZ9U4optQMhzPh0E9A7Yu6XndXblUF3gCa.zmEvJy4F.4C4718b." },
{ "name": "sam@sousa.com", "hashedPwd": "testvalue" }
],
"port": 8180,
"pwdCost": 11,
"projects": [
{ "name": "dummy-project",
"repo": "/non-existent/dir",
"cfgFilePath": "strawhat.json",
"defaultBranch": "deploy",
"envVars": { "VAR1": "value" }
},
{ "name": "test-project",
"repo": "" } ]
}

View File

@@ -0,0 +1,5 @@
{
"runId": "90843e0c-6113-4462-af33-a89ff9731031",
"state": "failed",
"details": "some very good reason"
}

View File

@@ -0,0 +1,19 @@
import tempfile, times, unittest, untar
from langutils import sameContents
import ../testutil
import ../../../main/nim/strawbosspkg/configuration
let cfgFilePath = "src/test/json/strawboss.config.json"
let cfg = loadStrawBossConfig(cfgFilePath)
let TIMEOUT = 2.minutes
suite "strawboss core":
# Suite setup: extract test project
let testProjTempDir = mkdtemp()
let testProjTarFile = newTarFile("src/test/test-project.tar.gz")
let testProjName = "test-project"
testProjTarFile.extract(testProjTempDir)

View File

@@ -0,0 +1,250 @@
import cliutils, httpclient, json, os, osproc, sequtils, strutils, tempfile,
times, unittest, untar, uuids
from langutils import sameContents
from algorithm import sorted
import ../testutil
import ../../../main/nim/strawbosspkg/configuration
import ../../../main/nim/strawbosspkg/core
let apiBase = "http://localhost:8180/api"
let cfgFilePath = "src/test/json/strawboss.config.json"
let cfg = loadStrawBossConfig(cfgFilePath)
let TIMEOUT = 2.minutes
# Util template intended for use when manually reviewing a test case.
# Inserting this into a test case will prevent it from cleaning up its
# working files and will echo the command to start StrawBoss using that
# test's configuration and working files.
template keepEnv(): untyped =
preserveEnv = true
echo "artifacts dir: " & tempBuildDataDir
echo "strawboss serve -c " & tempCfgPath
suite "strawboss server":
# Suite setup: extract test project
let testProjTempDir = mkdtemp()
let testProjTarFile = newTarFile("src/test/test-project.tar.gz")
let testProjName = "test-project"
testProjTarFile.extract(testProjTempDir)
# per-test setup: spin up a fresh strawboss instance
setup:
let tempBuildDataDir = mkdtemp()
let (_, tempCfgPath) = mkstemp()
var preserveEnv = false
# copy our test config
var newCfg = cfg
newCfg.buildDataDir = tempBuildDataDir
# update the repo string for the extracted test project
var testProjDef = newCfg.getProject(testProjName)
testProjDef.repo = testProjTempDir
newCfg.setProject(testProjName, testProjDef)
# save the updated config and start the strawboss instance using it
writeFile(tempCfgPath, $newCfg)
let serverProcess = startProcess("./strawboss", ".",
@["serve", "-c", tempCfgPath], loadEnv(), {poUsePath})
# give the server time to spin up
sleep(200)
teardown:
discard newAsyncHttpClient().post(apiBase & "/service/debug/stop")
if not preserveEnv:
removeDir(tempBuildDataDir)
removeFile(tempCfgPath)
# give the server time to spin down but kill it after that
sleep(200)
if serverProcess.running: kill(serverProcess)
test "handle missing project configuration":
let http = newAuthenticatedHttpClient(apiBase, "bob@builder.com", "password")
let resp = http.get(apiBase & "/projects/" & cfg.projects[0].name)
check resp.status.startsWith("404")
test "gives 404 when no versions built":
let http = newAuthenticatedHttpClient(apiBase, "bob@builder.com", "password")
let resp = http.get(apiBase & "/projects/" & testProjName & "/versions")
check resp.status.startsWith("404")
test "GET /api/project/@projectName/versions":
let cachedConfsDir = tempBuildDataDir & "/" & testProjName & "/configurations"
let expectedVersions = @["alpha", "beta", "1.0.0", "1.0.1"]
# Touch configuration files
createDir(cachedConfsDir)
for v in expectedVersions:
var f: File
check open(f, cachedConfsDir & "/" & v & ".json", fmWrite)
close(f)
let http = newAuthenticatedHttpClient(apiBase, "bob@builder.com", "password")
let resp = http.get(apiBase & "/project/" & testProjName & "/versions")
let returnedVersions = parseJson(resp.body).getElems.mapIt(it.getStr)
check sameContents(expectedVersions, returnedVersions)
test "run a successful build with artifacts":
let http = newAuthenticatedHttpClient(apiBase, "bob@builder.com", "password")
let resp = http.post(apiBase & "/project/" & testProjName & "/step/build/run/0.2.1")
check resp.status.startsWith("200")
# Check that the run was queued
let queuedRun = parseRun(parseJson(resp.body))
check queuedRun.status.state == BuildState.queued
# Wait for the build to complete
let completedRun = http.waitForBuild(apiBase, testProjName, $queuedRun.id)
# check that the run directory, run request, status, and output logs exist
let runsDir = tempBuildDataDir & "/" & testProjName & "/runs"
let runId = $completedRun.id
check existsDir(runsDir)
for suffix in [".request.json", ".status.json", ".stdout.log", ".stderr.log"]:
check existsFile(runsDir & "/" & runId & suffix)
# check that the project directory has been created in the artifacts repo
let runArtifactsDir = tempBuildDataDir & "/" & testProjName & "/artifacts/build/0.2.1"
check existsDir(runArtifactsDir)
# check that the build step status file has been created
let statusFile = tempBuildDataDir & "/" & testProjName & "/status/build/0.2.1.json"
check fileExists(statusFile)
# check that the status is complete
var status = loadBuildStatus(statusFile)
check status.state == BuildState.complete
# check that the artifacts we expect are present
let binFile = runArtifactsDir & "/test_project"
check existsFile(binFile)
test "run a multi-step build":
let http = newAuthenticatedHttpClient(apiBase, "bob@builder.com", "password")
# Run the "test" step (depends on "build")
var resp = http.post(apiBase & "/project/" & testProjname & "/step/test/run/0.2.1")
check resp.status.startsWith("200")
let queuedRun = parseRun(parseJson(resp.body))
let completedRun = http.waitForBuild(apiBase, testProjName, $queuedRun.id)
# there should be successful status files for both the build and test steps
for step in [("build", BuildState.stepComplete), ("test", BuildState.complete)]:
let statusFile = tempBuildDataDir & "/" & testProjName & "/status/" & step[0] & "/0.2.1.json"
check fileExists(statusFile)
let status = loadBuildStatus(statusFile)
check status.state == step[1]
test "run a build in docker":
let http = newAuthenticatedHttpClient(apiBase, "bob@builder.com", "password")
# Run the "build-docker" step
var resp = http.post(apiBase & "/project/" & testProjName & "/step/build-docker/run/0.3.0")
check resp.status.startsWith("200")
let queuedRun = parseRun(parseJson(resp.body))
check queuedRun.status.state == BuildState.queued
# Wait for the build to complete
let completedRun = http.waitForBuild(apiBase, testProjName, $queuedRun.id)
# check that the run directory, run request, status, and output logs exist
let runsDir = tempBuildDataDir & "/" & testProjName & "/runs"
let runId = $completedRun.id
check existsDir(runsDir)
for suffix in [".request.json", ".status.json", ".stdout.log", ".stderr.log"]:
check existsFile(runsDir & "/" & runId & suffix)
# check that the project directory has been created in the artifacts repo
let runArtifactsDir = tempBuildDataDir & "/" & testProjName & "/artifacts/build-docker/0.3.0"
check existsDir(runArtifactsDir)
# check that the build step status file has been created
let statusFile = tempBuildDataDir & "/" & testProjName & "/status/build-docker/0.3.0.json"
check fileExists(statusFile)
# check that the status is complete
var status = loadBuildStatus(statusFile)
check status.state == BuildState.complete
# check that the artifacts we expect are present
let binFile = runArtifactsDir & "/test_project"
check existsFile(binFile)
test "run a multi-step docker-based build":
let http = newAuthenticatedHttpClient(apiBase, "bob@builder.com", "password")
# Run the "test" step (depends on "build")
var resp = http.post(apiBase & "/project/" & testProjname & "/step/test-docker/run/0.3.0")
check resp.status.startsWith("200")
let queuedRun = parseRun(parseJson(resp.body))
let completedRun = http.waitForBuild(apiBase, testProjName, $queuedRun.id)
# there should be successful status files for both the build and test steps
for step in [("build-docker", BuildState.stepComplete), ("test-docker", BuildState.complete)]:
let statusFile = tempBuildDataDir & "/" & testProjName & "/status/" & step[0] & "/0.3.0.json"
check fileExists(statusFile)
let status = loadBuildStatus(statusFile)
check status.state == step[1]
# TODO
#test "already completed steps should not be rebuilt":
# let http = newAuthenticatedHttpClient(apibase, "bob@builder.com", "password")
# let runArtifactsDir = tempBuildDataDir & "/" & testProjName & "/artifacts/build/0.2.1"
# let exeModTime = getLastModificationTime(runArtifactsDir & "/test_project")
# Run the "build" step
# Kick off a build that depends on "build" (which was run in the last test)
test "kick off multiple runs and check the list of active runs via the API":
let http = newAuthenticatedHttpClient(apiBase, "bob@builder.com", "password")
# Kick off multiple runs of the "long-running" job
let queuedRuns = toSeq((1..3)).map(proc (idx: int): Run =
let resp = http.post(apiBase & "/project/" & testProjName & "/step/long-running/run/0.3.1")
check resp.status.startsWith("200")
return parseRun(parseJson(resp.body)))
# Collect run ids.
let runIds = queuedRuns.mapIt($(it.id)).sorted(cmpIgnoreCase)
# Check on the runs
let getActiveResp = http.get(apiBase & "/project/" & testProjName & "/runs/active")
check getActiveResp.status.startsWith("200")
let activeRuns = parseJson(getActiveResp.body).getElems().mapIt(parseRun(it))
let activeRunIds = activeRuns.mapIt($(it.id)).sorted(cmpIgnoreCase)
# Make sure we see all runs in the active state.
check runIds == activeRunIds
let completedRuns = runIds.map(proc (runId: string): Run =
return http.waitForBuild(apiBase, testProjName, runId))
# Make sure all are completed and all are accounted for
check completedRuns.allIt(it.status.state == BuildState.complete)
check completedRuns.mapIt($(it.id)).sorted(cmpIgnoreCase) == runIds
# Check that there are no more active runs
let getActiveResp2 = http.get(apiBase & "/project/" & testProjName & "/runs/active")
let remainingActiveRuns = parseJson(getActiveResp2.body).getElems().mapIt(parseRun(it))
check remainingActiveRuns.len == 0
# Last-chance catch to kill the server in case some test err'ed and didn't
# reach its teardown handler
discard newAsyncHttpClient().post(apiBase & "/service/debug/stop")
# Also, delete the extracted test project "source" repo
removeDir(testProjTempDir)

View File

@@ -0,0 +1,3 @@
import unittest
import ./functional/tserver

View File

@@ -0,0 +1,4 @@
import unittest
import ./unit/tserver
import ./unit/tconfiguration

src/test/nim/testutil.nim Normal file
View File

@@ -0,0 +1,46 @@
import httpclient, json, os, strutils, times
import ../../main/nim/strawbosspkg/core
import ../../main/nim/strawbosspkg/configuration
proc newAuthenticatedHttpClient*(apiBase, uname, pwd: string): HttpClient =
result = newHttpClient()
let authResp = result.post(apiBase & "/auth-token", $(%*{"username": uname, "password": pwd}))
assert authResp.status.startsWith("200")
result.headers = newHttpHeaders({"Authorization": "Bearer " & parseJson(authResp.body).getStr})
proc waitForBuild*(client: HttpClient, apiBase, projectName, runId: string,
expectedState = BuildState.complete,
failedState = BuildState.failed,
timeout = 10): Run =
let startTime = epochTime()
var run: Run
#echo "Waiting for '" & $expectedState & "' from run:\n\t" &
# apiBase & "/project/" & projectName & "/run/" & runId
while true:
var curElapsed = epochTime() - startTime
#echo "Checking (" & $curElapsed & " has passed)."
if curElapsed > toFloat(timeout):
raise newException(Exception, "Timeout exceeded waiting for build.")
let resp = client.get(apiBase & "/project/" & projectName & "/run/" & runId)
#echo "Received resp:\n\n" & $resp.status & "\n\n" & $resp.body
if not resp.status.startsWith("200"):
raise newException(IOError, "Unable to retrieve status. Received response: " & resp.body)
run = parseRun(parseJson(resp.body))
if run.status.state == failedState:
raise newException(IOError, "Run transitioned to failed state '" & $failedState & "'")
if run.status.state == expectedState:
return run
sleep(200)

View File

@@ -0,0 +1,152 @@
import json, strtabs, times, tables, unittest, uuids
from langutils import sameContents
from timeutils import trimNanoSec
import ../../../main/nim/strawbosspkg/configuration
suite "load and save configuration objects":
# suite setup & common data
let testProjDefStr = """{ "name": "dummy-project", "repo":
"/non-existent/dir",
"cfgFilePath": "strawhat.json",
"defaultBranch": "deploy",
"envVars": { "VAR1": "value" } }"""
let testProjDef = ProjectDef(
name: "dummy-project",
repo: "/non-existent/dir",
cfgFilePath: "strawhat.json",
defaultBranch: "deploy",
envVars: newStringTable("VAR1", "value", modeCaseInsensitive))
test "parseRunRequest":
let rr1 = RunRequest(
runId: genUUID(),
projectName: testProjDef.name,
stepName: "build",
buildRef: "master",
workspaceDir: "/no-real/dir",
timestamp: getTime().local.trimNanoSec,
forceRebuild: true)
let rrStr = $rr1
let rr2 = parseRunRequest(parseJson(rrStr))
check rr1 == rr2
test "parseProjectDef":
let pd = parseProjectDef(parseJson(testProjDefStr))
check:
pd.name == "dummy-project"
pd.repo == "/non-existent/dir"
pd.cfgFilePath == "strawhat.json"
pd.defaultBranch == "deploy"
pd.envVars.len == 1
pd.envVars.hasKey("VAR1")
pd.envVars["VAR1"] == "value"
test "ProjectDef ==":
let pd1 = parseProjectDef(parseJson(testProjDefStr))
check pd1 == testProjDef
test "ProjectDef != (name)":
var pd1 = testProjDef
pd1.name = "different"
check pd1 != testProjDef
test "ProjectDef != (repo)":
var pd1 = testProjDef
pd1.repo = "different"
check pd1 != testProjDef
test "ProjectDef != (cfgFilePath)":
var pd1 = testProjDef
pd1.cfgFilePath = "different"
check pd1 != testProjDef
test "ProjectDef != (defaultBranch)":
var pd1 = testProjDef
pd1.defaultBranch = "different"
check pd1 != testProjDef
test "loadStrawBossConfig":
let cfg = loadStrawBossConfig("src/test/json/strawboss.config.json")
let expectedUsers = @[UserRef(name: "bob@builder.com", hashedPwd: "testvalue"),
UserRef(name: "sam@sousa.com", hashedPwd: "testvalue")]
let expectedProjects = @[
ProjectDef(name: "dummy-project",
repo: "/non-existent/dir",
defaultBranch: "deploy",
cfgFilePath: "strawhat.json",
envVars: newStringTable("VAR1", "value", modeCaseSensitive)),
ProjectDef(name: "test-project",
repo: "",
defaultBranch: "master",
cfgFilePath: "strawboss.json",
envVars: newStringTable(modeCaseSensitive))]
check:
cfg.buildDataDir == "build-data"
cfg.authSecret == "change me"
cfg.pwdCost == 11
sameContents(expectedUsers, cfg.users)
sameContents(expectedProjects, cfg.projects)
test "loadProjectConfig":
let pc = loadProjectConfig("src/test/json/dummy-project.config.json")
check:
pc.name == "dummy-project"
pc.versionCmd == "git describe --all --always"
pc.containerImage == "ubuntu"
pc.steps.len == 2
# Explicitly set properties
pc.steps["build"].name == "build"
pc.steps["build"].dontSkip == true
pc.steps["build"].stepCmd == "cust-build"
pc.steps["build"].workingDir == "dir1"
pc.steps["build"].containerImage == "alpine"
sameContents(pc.steps["build"].artifacts, @["bin1", "doc1"])
sameContents(pc.steps["build"].depends, @["test"])
sameContents(pc.steps["build"].expectedEnv, @["VAR1"])
sameContents(pc.steps["build"].cmdInput, @["test", "this"])
# Step with defaulted properties
pc.steps["test"].name == "test"
pc.steps["test"].dontSkip == false
pc.steps["test"].stepCmd == "true"
pc.steps["test"].workingDir == "."
pc.steps["test"].containerImage.len == 0
sameContents(pc.steps["test"].artifacts, @[])
sameContents(pc.steps["test"].depends, @[])
sameContents(pc.steps["test"].expectedEnv, @[])
sameContents(pc.steps["test"].cmdInput, @[])
test "serialze StrawBossConfig to/from string":
let cfg = loadStrawBossConfig("src/test/json/strawboss.config.json")
let cfgStr = $cfg
check cfg == parseStrawBossConfig(parseJson(cfgStr))
test "%step":
let step = Step(
name: "build", stepCmd: "true", workingDir: "dirA",
artifacts: @[], depends: @["compile"], cmdInput: @[],
expectedEnv: @["CWD", "TERM"], dontSkip: true)
let stepJS = %step
for k in @["name", "stepCmd", "workingDir", "artifacts", "cmdInput",
"depends", "expectedEnv", "dontSkip"]:
check stepJS.hasKey(k)
test "loadBuildStatus":
let st = loadBuildStatus("src/test/json/test-status.json")
check:
st.runId == "90843e0c-6113-4462-af33-a89ff9731031"
st.state == BuildState.failed
st.details == "some very good reason"

View File

@@ -0,0 +1,94 @@
import asyncdispatch, cliutils, httpclient, json, os, osproc, sequtils,
strutils, times, unittest
from langutils import sameContents
import ../testutil
import ../../../main/nim/strawbosspkg/configuration
import ../../../main/nim/strawbosspkg/server
import ../../../main/nim/strawbosspkg/version
let apiBase = "http://localhost:8180/api"
let cfgFilePath = "src/test/json/strawboss.config.json"
let cfg = loadStrawBossConfig(cfgFilePath)
let testuser = UserRef( # note: needs to correspond to an actual user
name: "bob@builder.com",
hashedPwd: "$2a$11$lVZ9U4optQMhzPh0E9A7Yu6XndXblUF3gCa.zmEvJy4F.4C4718b.")
suite "strawboss server":
# suite setup code
let serverProcess = startProcess("./strawboss", ".",
@["serve", "-c", cfgFilePath], loadEnv(), {poUsePath})
let http = newHttpClient()
# give the server time to spin up
sleep(100)
## UNIT TESTS
test "validate hashed pwd":
check validatePwd(testuser, "password")
test "detect invalid pwds":
check(not validatePwd(testuser, "Password"))
test "make and extract a JWT token from a session":
let session = newSession(testuser)
let tok = toJWT(cfg, session)
check fromJWT(cfg, tok) == session
test "version":
let resp = http.get(apiBase & "/version")
check:
resp.status.startsWith("200")
resp.body == "\"strawboss v" & SB_VERSION & "\""
test "fail auth":
let resp = http.post(apiBase & "/auth-token",
$(%*{"username": "bob@builder.com", "password": "notpassword"}))
check resp.status.startsWith("401")
test "auth":
let resp = http.post(apiBase & "/auth-token",
$(%*{"username": "bob@builder.com", "password": "password"}))
check resp.status.startsWith("200")
test "verify valid auth token":
let authHttp = newAuthenticatedHttpClient(apiBase, "bob@builder.com", "password")
let resp = authHttp.get(apiBase & "/verify-auth")
check resp.status.startsWith("200")
test "verify fails when no auth token is given":
let resp = http.get(apiBase & "/verify-auth")
check resp.status.startsWith("401")
test "verify fails when invalid auth token is given":
let http1 = newHttpClient()
http1.headers = newHttpHeaders({"Authorization": "Bearer nope"})
let resp = http1.get(apiBase & "/verify-auth")
check resp.status.startsWith("401")
test "fail to get projects when not authenticated":
let resp = http.get(apiBase & "/projects")
check resp.status.startsWith("401")
test "get projects":
let authHttp = newAuthenticatedHttpClient(apiBase, "bob@builder.com", "password")
let resp = authHttp.get(apiBase & "/projects")
check resp.status.startsWith("200")
let projects: seq[ProjectDef] = parseJson(resp.body).getElems.mapIt(parseProjectDef(it))
check sameContents(projects, cfg.projects)
# suite tear-down
# give the server time to spin down but kill it after that
discard newAsyncHttpClient().post(apiBase & "/service/debug/stop")
sleep(100)
if serverProcess.running: kill(serverProcess)

src/test/test-project Submodule

@@ -0,0 +1 @@
Subproject commit ab883bd9602a1373347a23c8bee4ed28dd475aec

Binary file not shown.

src/util/bash/client.sh Executable file
View File

@@ -0,0 +1,20 @@
#!/bin/bash
host="${STRAWBOSS_HOST:-localhost:8180}"
if [ $# -eq 1 ]; then
url="$1"
method="GET"
data=""
elif [ $# -eq 2 ]; then
method="$1"
url="$2"
data=""
else
method="$1"
url="$2"
data="$3"
fi
curl -X "$method" -H "Authorization: Bearer $(cat token.txt)" "http://${host}/api/$url" -d "$data"
echo ""
#echo "curl -X \"$method\" -H \"Authorization: Bearer $(cat token.txt)\" \"localhost:8180/api/$url\" | jq . "

View File

@@ -1,7 +1,11 @@
{
"artifactsRepo": "artifacts",
"buildDataDir": "build-data",
"debug": true,
"users": [],
"tokens": [],
"authSecret": "change me",
"pwdCost": 11,
"maintenancePeriod": 5000,
"logLevel": "info",
"projects": [
{ "name": "new-life-intro-band",
"repo": "/home/jdb/projects/new-life-introductory-band" },

View File

@@ -1,7 +1,7 @@
# Package
bin = @["strawboss"]
version = "0.2.0"
version = "0.5.1"
author = "Jonathan Bernard"
description = "My personal continious integration worker."
license = "MIT"
@@ -9,5 +9,43 @@ srcDir = "src/main/nim"
# Dependencies
requires @["nim >= 0.16.1", "docopt >= 0.1.0", "tempfile", "jester"]
requires @["nim >= 0.19.0", "docopt >= 0.6.8", "isaac >= 0.1.3", "tempfile", "jester >= 0.4.1", "bcrypt",
"untar", "uuids >= 0.1.10", "jwt"]
# Hacky to point to a specific hash. But there is some bug building in the
# docker image we use to build the project with the next version. It adds an
# ifdef branch to support libssl 1.1 but for some reason that ifdef is set
# wrong and it tries to build against the 1.1 API even though the image only
# has the 1.0 API. I'm crossing my fingers and hoping that our base image
# supports libssl 1.1 before I need to update this library.
#requires "https://github.com/yglukhov/nim-jwt#549aa1eb13b8ddc0c6861d15cc2cc5b52bcbef01"
requires "https://git.jdb-labs.com/jdb/nim-lang-utils.git >= 0.4.0"
requires "https://git.jdb-labs.com/jdb/nim-cli-utils.git >= 0.6.0"
requires "https://git.jdb-labs.com/jdb/nim-time-utils.git >= 0.4.0"
# Tasks
task functest, "Runs the functional test suite.":
exec "nimble build"
exec "nim c -r src/test/nim/run_functional_tests.nim"
task unittest, "Runs the unit test suite.":
exec "nimble build"
exec "nim c -r src/test/nim/run_unit_tests.nim"
task test, "Runs both the unit and functional test suites.":
exec "nimble build"
echo "Building test suites..."
exec "nim c src/test/nim/run_unit_tests.nim"
exec "nim c src/test/nim/run_functional_tests.nim"
echo "\nRunning unit tests."
echo "-------------------"
exec "src/test/nim/run_unit_tests"
echo "\nRunning functional tests."
echo "-------------------------"
exec "src/test/nim/run_functional_tests"
task dist, "Creates distributable package.":
exec "nimble build"
mkdir "dist"
exec "cp strawboss strawboss.config.json example.json dist/."

strawboss.projectdef.json Normal file
View File

@@ -0,0 +1,37 @@
{
"name": "strawboss",
"containerImage": "nimlang/nim:0.19.0",
"steps": {
"compile": {
"artifacts": ["strawboss"],
"stepCmd": "nimble build"
},
"unittest": {
"depends": ["compile"],
"stepCmd": "/bin/bash",
"cmdInput": [
"cp $compile_DIR/strawboss .",
"nimble install --depsOnly",
"nim c -r src/test/nim/run_unit_tests"
]
},
"functest": {
"depends": ["compile"],
"stepCmd": "/bin/bash",
"cmdInput": [
"cp $compile_DIR/strawboss .",
"nimble install --depsOnly",
"nim c -r src/test/nim/run_functional_tests"
]
},
"build": {
"artifacts": ["strawboss-$VERSION.zip"],
"depends": ["compile", "unittest", "functest"],
"stepCmd": "/bin/bash",
"cmdInput": [
"cp $compile_DIR/strawboss .",
"zip strawboss-$VERSION.zip strawboss strawboss.config.json example.json src/main/systemd/strawboss.service"
]
}
}
}

test-spec.txt Normal file
View File

@@ -0,0 +1,11 @@
Run a build. Look for:
- Run request archived
- Output logs archived with the run request
- Artifacts archived in the build-data directory.
- Configuration for that version archived in configurations directory.
- Status for that version archived in the status directory
Run the build again for the same project and build ref:
- Build should be skipped.
- Run request should be archived.