* Migrate all MySQL tests to the new form
* Only dump SQL if MYSQL_TEST is on
* Removing parallel until we get rid of this code
* Move TestMain to an actual _test file
* A little experiment with tmpfs to speed up the db
* Let's make sure the dump.sql file is also in RAM
- Update names/roles of users in `make e2e-setup`.
- Update test SSO user info.
- Add Cypress commands for seeding users/Teams.
- Stub Cypress tests for team/tier matrix.
Instead of synchronously updating the seen_time column for a host on each update, batch these updates and write them together every second.
This results in a ~33% reduction in MySQL CPU usage in a local test with 4,000 simulated hosts and MySQL running in Docker.
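A minimal sketch of the batching idea, assuming an in-memory map flushed by a one-second ticker; the `seenTimeBatcher` type, its methods, and the `hosts(id, seen_time)` schema are illustrative, not Fleet's actual implementation:

```go
package main

import (
	"context"
	"database/sql"
	"sync"
	"time"
)

// seenTimeBatcher collects seen_time updates in memory and flushes them to
// MySQL once per second instead of issuing one UPDATE per check-in.
type seenTimeBatcher struct {
	mu   sync.Mutex
	seen map[uint]time.Time // host ID -> last seen time
	db   *sql.DB
}

// MarkSeen records the latest seen time for a host without touching the database.
func (b *seenTimeBatcher) MarkSeen(hostID uint, t time.Time) {
	b.mu.Lock()
	b.seen[hostID] = t
	b.mu.Unlock()
}

// Run flushes pending updates every second until the context is cancelled.
func (b *seenTimeBatcher) Run(ctx context.Context) {
	ticker := time.NewTicker(time.Second)
	defer ticker.Stop()
	for {
		select {
		case <-ctx.Done():
			b.flush()
			return
		case <-ticker.C:
			b.flush()
		}
	}
}

// flush swaps out the pending map under the lock, then writes the batch.
func (b *seenTimeBatcher) flush() {
	b.mu.Lock()
	pending := b.seen
	b.seen = make(map[uint]time.Time)
	b.mu.Unlock()

	for hostID, t := range pending {
		// Assumed schema: hosts(id, seen_time). Errors would be logged or
		// retried in a real implementation; ignored here to keep the sketch short.
		_, _ = b.db.Exec(`UPDATE hosts SET seen_time = ? WHERE id = ?`, t, hostID)
	}
}
```

Flushing the whole batch in a single statement would cut round trips further, but even one UPDATE per host per second avoids issuing a write on every check-in, which is where the CPU savings come from.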
Improves MySQL test time (on my 2020 MBP) from ~125s to ~18s.
- Use separate databases for each test to allow parallelization (see the sketch after this list).
- Run migrations only once at the beginning of tests, then reload the generated schema.
- Add `--innodb-file-per-table=OFF` for ~20% additional speedup.
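A rough sketch of what per-test database setup can look like under this scheme; the helper name, credentials, DSN, and `schema.sql` dump file are assumptions, not the actual test code:

```go
package testutil

import (
	"database/sql"
	"fmt"
	"os"
	"strings"
	"testing"

	_ "github.com/go-sql-driver/mysql"
)

// createTestDB creates a fresh database named after the current test so tests
// can run in parallel without stepping on each other's data. The schema is
// loaded from a pre-generated dump rather than re-running all migrations.
func createTestDB(t *testing.T) *sql.DB {
	t.Helper()

	name := "test_" + strings.ToLower(strings.ReplaceAll(t.Name(), "/", "_"))

	// Hypothetical credentials/address for the local MySQL container.
	root, err := sql.Open("mysql", "root:toor@tcp(127.0.0.1:3306)/")
	if err != nil {
		t.Fatal(err)
	}
	defer root.Close()

	if _, err := root.Exec("DROP DATABASE IF EXISTS " + name); err != nil {
		t.Fatal(err)
	}
	if _, err := root.Exec("CREATE DATABASE " + name); err != nil {
		t.Fatal(err)
	}

	// multiStatements is needed to execute the multi-statement schema dump.
	db, err := sql.Open("mysql",
		fmt.Sprintf("root:toor@tcp(127.0.0.1:3306)/%s?multiStatements=true", name))
	if err != nil {
		t.Fatal(err)
	}

	// Load the schema dump generated once (after migrations) in TestMain.
	schema, err := os.ReadFile("schema.sql")
	if err != nil {
		t.Fatal(err)
	}
	if _, err := db.Exec(string(schema)); err != nil {
		t.Fatal(err)
	}

	t.Cleanup(func() { db.Close() })
	return db
}
```

Loading the dumped schema is much cheaper than replaying every migration per test, which is what makes the once-per-run migration plus reload approach pay off.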
- Add `--dev` flag that sets default flag values. This simplifies the invocation of Fleet in a development environment.
- Change defaults in docker-compose to use `fleet` in place of `kolide`.
- Skip prompt in `prepare db` when `--dev` specified.
- Update developer documentation.
Updates to MySQL configuration in docker-compose.yml may require
existing development containers and volumes to be deleted (this will
delete data in MySQL):
```shell
docker-compose rm -sf
docker volume rm fleet_mysql-persistent-volume
```
Closes #170
* Use YAML anchors to avoid repeating config blocks
* Use docker volumes to persist data for mysql
* Allow setting `FLEET_SERVER` (fixes #2127) when using the docker-compose file to spin up multiple osquery clients
A new datastore interface is needed for buffering incoming distributed query results to be sent to the client. This PR attempts to define and implement that interface.
It is intended that the ReadChannel() method be used by the goroutine that will push query results down a websocket to the client. Passing the results through this channel will allow that goroutine to perform a select on both the channel and the websocket, in order to properly handle IO.
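A rough sketch of what such an interface could look like; `ReadChannel()` comes from the description above, while the struct fields, the `WriteResult` method, and the signatures are assumptions:

```go
package datastore

// DistributedQueryResult is one host's rows for a live query campaign.
// Field names here are illustrative.
type DistributedQueryResult struct {
	DistributedQueryCampaignID uint
	HostID                     uint
	Rows                       []map[string]string
}

// QueryResultStore buffers incoming distributed query results until the
// goroutine serving the websocket is ready to send them to the client.
type QueryResultStore interface {
	// WriteResult buffers a result received from a host.
	WriteResult(result DistributedQueryResult) error
	// ReadChannel returns a channel that delivers buffered results for the
	// given campaign to the goroutine pushing them over the websocket.
	ReadChannel(campaignID uint) (<-chan DistributedQueryResult, error)
}
```

Because `ReadChannel()` returns a receive-only channel, the websocket goroutine can `select` on it alongside websocket reads in a single loop, handling both IO directions cleanly.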
* No more hard deletes
* scaffolding for password reset endpoint
* Ensure password reset state is accounted for in VC checks
* password reset endpoints and data structures
* ability to change password with reset token
* smtp server connection pool management
* stubbing out the sending of the email
* adding mailhog via docker
* HTML emails with configurable host name
* fixing typo in the comments
* Fixing merge which undid DatabaseError replacement
* documentation in the readme
* webpack shortcut for components
* removing a sneaky merge line that snuck in
* temporary email content api
* tests for password reset flow
* fixing go vet
* comments and making all db use `&value` rather than `reference`
* more correct usage of the errors library and moving email sending to its own method
* using the wrong error
* fixing email mock object error
* less incorrect error usage
* rebasing and merging
* http constants for status code
* using ParseAndValidateJSON instead of BindJSON
* validate instead of binding in struct tags
* NewFromError instead of New
This commit vendors in all of our dependencies using
[GoDep](https://github.com/tools/godep). We are forgoing a vendor/ folder to avoid checking deps into the repo.
Note: never manually modify `Godeps/Godeps.json`; this file is generated dynamically by the godep CLI.
Common Actions:
To add a new package foo/bar, do this:
1. Run `go get foo/bar`
1. Edit your code to import foo/bar.
1. Run `godep save` (or `godep save ./...`).
To update a package from your `$GOPATH`, do this:
1. Run `go get -u foo/bar`
1. Run `godep update foo/bar`. (You can use the `...` wildcard, for example
`godep update foo/...`).
This commit adds a Dockerfile and updates the docker-compose.yml with local mounting so that you can stand up a consistent dev environment. Please view the project README for more information.