Make Your Company Data Driven. Connect to any data source, easily visualize, dashboard and share your data.

re:dash is our take on freeing the data within our company in a way that will better fit our culture and usage patterns.

Prior to re:dash, we tried to use traditional BI suites and discovered a set of bloated, technically challenged and slow tools/flows. What we were looking for was a more hacker'ish way to look at data, so we built one.

re:dash was built to allow fast and easy access to billions of records that we process and collect using Amazon Redshift (a "petabyte scale data warehouse" that "speaks" PostgreSQL). Today re:dash has support for querying multiple databases, including: Redshift, Google BigQuery, PostgreSQL, MySQL, Graphite, Presto, Google Spreadsheets, Cloudera Impala, Hive, and custom scripts.
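
Under the hood, support for each backend comes from a small, pluggable query-runner adapter that takes a query string and returns a JSON-serializable result set. The sketch below is only a hypothetical illustration of that pattern; the class and method names (BaseQueryRunner, run_query) and the result layout shown here are assumptions and may not match the actual interface shipped in the redash package.

```python
# Hypothetical sketch of a pluggable query runner; names and result layout
# are illustrative assumptions, not the actual re:dash interface.
import json


class BaseQueryRunner(object):
    """Minimal stand-in for a shared query runner base class."""

    def __init__(self, configuration):
        self.configuration = configuration

    def run_query(self, query):
        raise NotImplementedError()


class ExampleSQLRunner(BaseQueryRunner):
    """Runs a query against a fictional SQL backend and returns JSON results."""

    def run_query(self, query):
        # A real runner would open a connection using self.configuration,
        # execute `query`, and convert the cursor output into columns/rows.
        columns = [{"name": "id", "type": "integer", "friendly_name": "id"},
                   {"name": "value", "type": "integer", "friendly_name": "value"}]
        rows = [{"id": 1, "value": 42}]
        data = json.dumps({"columns": columns, "rows": rows})
        error = None
        return data, error


if __name__ == "__main__":
    runner = ExampleSQLRunner({"host": "example.invalid"})
    print(runner.run_query("SELECT 1"))
```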

re:dash consists of two parts:

  1. Query Editor: think of it as JSFiddle for SQL queries. It lets you share data across the organization in an open way, by sharing both the dataset and the query that generated it. This way everyone can peer review not only the resulting dataset but also the process that produced it. Queries can also be forked to generate new datasets and reach new insights (see the API sketch after this list for pulling a shared result programmatically).
  2. Dashboards/Visualizations: once you have a dataset, you can create different visualizations out of it, and then combine several visualizations into a single dashboard. Currently it supports charts, pivot tables, and cohorts.
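
Shared queries and their latest results are also reachable over re:dash's HTTP API, so a dataset published in the Query Editor can be pulled into scripts or notebooks. The snippet below is a minimal sketch, assuming a hypothetical query id and API key; the endpoint path and response layout shown here are assumptions that may differ between versions.

```python
# Minimal sketch: fetch the latest results of a shared query over HTTP.
# The base URL, query id, and API key are placeholders; the endpoint path
# and response layout are assumptions and may vary between re:dash versions.
import requests

REDASH_URL = "http://demo.redash.io"  # your re:dash instance
QUERY_ID = 123                        # id of a saved query (hypothetical)
API_KEY = "your-api-key-here"         # API key (hypothetical)

response = requests.get(
    "{}/api/queries/{}/results.json".format(REDASH_URL, QUERY_ID),
    params={"api_key": API_KEY},
)
response.raise_for_status()

# Assumed payload shape: the latest cached result's columns and rows.
result = response.json()
for row in result["query_result"]["data"]["rows"]:
    print(row)
```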

re:dash is a work in progress, with rough edges and some way to go before it fulfills its full potential. The Query Editor is quite solid, but the visualizations need more work to enrich them and make them more user friendly.

Demo

[Screenshots]

You can try out the demo instance: http://demo.redash.io/ (login with any Google account).

Getting Started

Getting help

Roadmap

TBD.

Reporting Bugs and Contributing Code

  • Want to report a bug or request a feature? Please open an issue.
  • Want to help us build re:dash? Fork the project and make a pull request. We need all the help we can get!

License

See LICENSE file.