Data Flare

Developer documentation

Running the tests

  1. Spin up an Elasticsearch instance locally, for example with Docker:
docker run -p 9200:9200 -p 9300:9300 -e discovery.type=single-node elasticsearch:7.1.0
  2. Run the tests as usual with sbt:
sbt test
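If you prefer docker-compose for managing the local Elasticsearch instance, the `docker run` command above translates into a service definition along these lines (a hypothetical `docker-compose.yml`, not part of the repository):

```yaml
# Hypothetical docker-compose.yml mirroring the docker run command above:
# single-node Elasticsearch 7.1.0 with the HTTP and transport ports exposed.
version: "3"
services:
  elasticsearch:
    image: elasticsearch:7.1.0
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
      - "9300:9300"
```

Start it with `docker-compose up -d` before running `sbt test`.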

Publishing with sbt-sonatype

https://github.com/xerial/sbt-sonatype

To publish a new version, run:

python3 mini-cross-build.py
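The cross-build script itself is not shown here, but sbt cross-building is conventionally driven by the `crossScalaVersions` setting in `build.sbt`. A hypothetical fragment, with placeholder version numbers, might look like:

```scala
// Hypothetical build.sbt fragment (illustrative versions only):
// sbt will build and publish against each listed Scala version
// when a task is prefixed with "+", e.g. `sbt +publishSigned`.
crossScalaVersions := Seq("2.11.12", "2.12.10")
```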

Documentation creation and publishing

Sources for documentation are in the docs-sources folder.

To update the documentation from the docs-sources folder run:

sbt docs/mdoc
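mdoc compiles and evaluates the Scala code fenced inside the markdown sources, which keeps the examples in sync with the published API. A hypothetical page under docs-sources might contain:

````markdown
# Example page

```scala mdoc
// This block is compiled (and run) by `sbt docs/mdoc`;
// its output is rendered into the generated documentation.
val total = 1 + 1
```
````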

To update API docs run:

sbt docs/unidoc

To run the documentation site locally run:

cd website && yarn start

The `sbt docs/mdoc` step checks that the Scala code in the documentation compiles and makes any required variable substitutions.

Changes are automatically published to GitHub Pages when code is merged to master. However, if you wish to publish to GitHub Pages locally, run:

cd website && GITHUB_USER=xxxx CURRENT_BRANCH=xxxx USE_SSH=true yarn run publish-gh-pages