Fix make test for running integration tests locally (dbt-labs#591)
* Remove venv from CircleCI

* Utilize latest minor version of Python 3.9 for CircleCI (rather than pinned patch version)

* Align local environment variables with CircleCI

* Ignore changes related to running integration tests

* Move the make file to the project root

* Refactor make commands to run integration tests

* Update instructions for running tests

* Implementation guidelines

* Switch order of testing all models vs. a single model in the instructions
dbeatty10 authored Jun 13, 2022
1 parent 471a838 commit 67ef54b
Showing 13 changed files with 175 additions and 112 deletions.
12 changes: 8 additions & 4 deletions .circleci/config.yml
@@ -5,7 +5,7 @@ jobs:

integration-postgres:
docker:
- image: cimg/python:3.9.9
- image: cimg/python:3.9
- image: cimg/postgres:9.6
environment:
POSTGRES_USER: root
@@ -18,6 +18,7 @@ jobs:

steps:
- checkout
- run: pip install --pre dbt-postgres -r dev-requirements.txt
- run:
name: "Run Functional Tests - Postgres"
command: ./run_functional_test.sh postgres
@@ -29,9 +30,10 @@

integration-redshift:
docker:
- image: cimg/python:3.9.9
- image: cimg/python:3.9
steps:
- checkout
- run: pip install --pre dbt-redshift -r dev-requirements.txt
- run:
name: "Run Functional Tests - Redshift"
command: ./run_functional_test.sh redshift
@@ -43,9 +45,10 @@

integration-snowflake:
docker:
- image: cimg/python:3.9.9
- image: cimg/python:3.9
steps:
- checkout
- run: pip install --pre dbt-snowflake -r dev-requirements.txt
- run:
name: "Run Functional Tests - Snowflake"
command: ./run_functional_test.sh snowflake
@@ -59,9 +62,10 @@
environment:
BIGQUERY_SERVICE_KEY_PATH: "/home/circleci/bigquery-service-key.json"
docker:
- image: cimg/python:3.9.9
- image: cimg/python:3.9
steps:
- checkout
- run: pip install --pre dbt-bigquery -r dev-requirements.txt
- run:
name: "Set up credentials"
command: echo $BIGQUERY_SERVICE_ACCOUNT_JSON > ${HOME}/bigquery-service-key.json
25 changes: 11 additions & 14 deletions CONTRIBUTING.md
@@ -3,11 +3,12 @@
`dbt-utils` is open source software. It is what it is today because community members have opened issues, provided feedback, and [contributed to the knowledge loop](https://www.getdbt.com/dbt-labs/values/). Whether you are a seasoned open source contributor or a first-time committer, we welcome and encourage you to contribute code, documentation, ideas, or problem statements to this project.

1. [About this document](#about-this-document)
2. [Getting the code](#getting-the-code)
3. [Setting up an environment](#setting-up-an-environment)
4. [Testing dbt-utils](#testing)
5. [Adding CHANGELOG Entry](#adding-changelog-entry)
6. [Submitting a Pull Request](#submitting-a-pull-request)
1. [Getting the code](#getting-the-code)
1. [Setting up an environment](#setting-up-an-environment)
1. [Implementation guidelines](#implementation-guidelines)
1. [Testing dbt-utils](#testing)
1. [Adding CHANGELOG Entry](#adding-changelog-entry)
1. [Submitting a Pull Request](#submitting-a-pull-request)

## About this document

@@ -52,16 +53,12 @@ These are the tools used in `dbt-utils` development and testing:

A deep understanding of these tools is not required to effectively contribute to `dbt-utils`, but we recommend checking out the attached documentation if you're interested in learning more about each one.

#### Virtual environments
## Implementation guidelines

We strongly recommend using virtual environments when developing code in `dbt-utils`. We recommend creating this virtualenv
in the root of the `dbt-utils` repository. To create a new virtualenv, run:
```sh
python3 -m venv env
source env/bin/activate
```

This will create and activate a new Python virtual environment.
Ensure that changes will work on "non-core" adapters by:
- dispatching any new macro(s) so non-core adapters can also use them (e.g. [the `star()` source](https://github.com/fishtown-analytics/dbt-utils/blob/master/macros/sql/star.sql))
- using the `limit_zero()` macro in place of the literal string: `limit 0`
- using `dbt_utils.type_*` macros instead of explicit datatypes (e.g. `dbt_utils.type_timestamp()` instead of `TIMESTAMP`)

## Testing

24 changes: 24 additions & 0 deletions Makefile
@@ -0,0 +1,24 @@
.DEFAULT_GOAL:=help

.PHONY: test
test: ## Run the integration tests.
@./run_test.sh $(target) $(models) $(seeds)

.PHONY: dev
dev: ## Installs dbt-* packages in develop mode along with development dependencies.
@\
echo "Install dbt-$(target)..."; \
pip install --upgrade pip setuptools; \
pip install --pre "dbt-$(target)" -r dev-requirements.txt;

.PHONY: setup-db
setup-db: ## Setup Postgres database with docker-compose for system testing.
@\
docker-compose up --detach postgres

.PHONY: help
help: ## Show this help message.
@echo 'usage: make [target]'
@echo
@echo 'targets:'
	@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
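The `help` target relies on a common self-documenting Makefile pattern: `grep` collects every `target: ## description` line from the Makefile, and `awk` splits each into an aligned two-column listing. A simplified, standalone sketch of that pipeline (greedy regex, ANSI color codes omitted; the sample lines are illustrative):

```shell
# Two sample Makefile lines, piped through a simplified version of the
# grep/awk pipeline behind `make help`.
printf 'test: ## Run the integration tests.\ndev: ## Install dev dependencies.\n' \
  | grep -E '^[a-zA-Z_-]+:.*## ' \
  | awk 'BEGIN {FS = ":.*## "}; {printf "%-30s %s\n", $1, $2}'
```

Each target name is left-aligned in a 30-character column with its description beside it.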
30 changes: 3 additions & 27 deletions docker-compose.yml
@@ -1,32 +1,8 @@
version: "3.7"
services:

dbt:
image: circleci/python:3.6.3-stretch
depends_on:
- ${TARGET}
env_file: "./integration_tests/.env/${TARGET}.env"
entrypoint: "/repo/run_test.sh ${TARGET} ${MODELS} ${SEEDS}"
working_dir: /repo
volumes:
- ".:/repo"

postgres:
image: circleci/postgres:9.6.5-alpine-ram
image: cimg/postgres:9.6
environment:
- POSTGRES_USER=root
ports:
- "5432:5432"

# dummy container, since snowflake is a managed service
snowflake:
image: circleci/python:3.6.3-stretch
entrypoint: "/bin/true"

# dummy container, since bigquery is a managed service
bigquery:
image: circleci/python:3.6.3-stretch
entrypoint: "/bin/true"

# dummy container, since redshift is a managed service
redshift:
image: circleci/python:3.6.3-stretch
entrypoint: "/bin/true"
3 changes: 2 additions & 1 deletion integration_tests/.env/bigquery.env
@@ -1 +1,2 @@
GCLOUD_SERVICE_KEY_PATH=
BIGQUERY_SERVICE_KEY_PATH=
BIGQUERY_TEST_DATABASE=
10 changes: 5 additions & 5 deletions integration_tests/.env/postgres.env
@@ -1,5 +1,5 @@
CI_DBT_HOST=postgres
CI_DBT_USER=root
CI_DBT_PASS=''
CI_DBT_PORT=5432
CI_DBT_DBNAME=circle_test
POSTGRES_TEST_HOST=localhost
POSTGRES_TEST_USER=root
POSTGRES_TEST_PASS=''
POSTGRES_TEST_PORT=5432
POSTGRES_TEST_DBNAME=circle_test
9 changes: 5 additions & 4 deletions integration_tests/.env/redshift.env
@@ -1,4 +1,5 @@
CI_REDSHIFT_DBT_HOST=
CI_REDSHIFT_DBT_USER=
CI_REDSHIFT_DBT_PASS=
CI_REDSHIFT_DBT_DBNAME=
REDSHIFT_TEST_HOST=
REDSHIFT_TEST_USER=
REDSHIFT_TEST_PASS=
REDSHIFT_TEST_DBNAME=
REDSHIFT_TEST_PORT=
12 changes: 6 additions & 6 deletions integration_tests/.env/snowflake.env
@@ -1,6 +1,6 @@
CI_SNOWFLAKE_DBT_ACCOUNT=
CI_SNOWFLAKE_DBT_USER=
CI_SNOWFLAKE_DBT_PASS=
CI_SNOWFLAKE_DBT_ROLE=
CI_SNOWFLAKE_DBT_DATABASE=
CI_SNOWFLAKE_DBT_WAREHOUSE=
SNOWFLAKE_TEST_ACCOUNT=
SNOWFLAKE_TEST_USER=
SNOWFLAKE_TEST_PASSWORD=
SNOWFLAKE_TEST_ROLE=
SNOWFLAKE_TEST_DATABASE=
SNOWFLAKE_TEST_WAREHOUSE=
2 changes: 2 additions & 0 deletions integration_tests/.gitignore
@@ -2,3 +2,5 @@
target/
dbt_modules/
logs/
.env/
profiles.yml
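The two new ignore patterns keep local credentials (the `.env/` files) and the locally copied `profiles.yml` out of version control. A throwaway-repo sketch of how those patterns behave (assumes `git` is available; paths are illustrative):

```shell
# Build a scratch repo mirroring the integration_tests ignore setup,
# then ask git whether the sensitive files are ignored.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
mkdir -p integration_tests/.env
printf 'target/\ndbt_modules/\nlogs/\n.env/\nprofiles.yml\n' > integration_tests/.gitignore
touch integration_tests/profiles.yml integration_tests/.env/postgres.env
git check-ignore -q integration_tests/profiles.yml && echo "profiles.yml ignored"
git check-ignore -q integration_tests/.env/postgres.env && echo ".env ignored"
```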
14 changes: 0 additions & 14 deletions integration_tests/Makefile

This file was deleted.

115 changes: 100 additions & 15 deletions integration_tests/README.md
@@ -1,23 +1,90 @@
### Run the integration tests
### Overview
1. Prerequisites
1. Configure credentials
1. Setup Postgres (optional)
1. Setup virtual environment
1. Installation for development
1. Run the integration tests
1. Run tests
1. Creating a new integration test

### Prerequisites
- python3
- Docker

### Configure credentials
Edit the env file for your TARGET in `integration_tests/.env/[TARGET].env`.

Load the environment variables:
```shell
set -a; source integration_tests/.env/[TARGET].env; set +a
```

To run the integration tests on your local machine, like they will get run in the CI (using CircleCI), you can do the following:
or, more specifically:
```shell
set -a; source integration_tests/.env/postgres.env; set +a
```
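`set -a` switches on the shell's `allexport` option, so every variable assigned while the env file is being sourced is automatically exported to child processes (such as `dbt` and `pytest`); `set +a` switches it back off. A minimal sketch with a throwaway env file (illustrative values, not real credentials):

```shell
# Write a throwaway env file.
cat > /tmp/demo.env <<'EOF'
POSTGRES_TEST_HOST=localhost
POSTGRES_TEST_PORT=5432
EOF

# Source it with allexport on, then switch allexport back off.
set -a; . /tmp/demo.env; set +a

# Child processes now inherit the variables.
sh -c 'echo "$POSTGRES_TEST_HOST:$POSTGRES_TEST_PORT"'   # prints localhost:5432
```

Without `set -a`, the variables would be visible in the current shell but not exported to the commands the tests spawn.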

Assuming you are in the `integration_tests` folder,
### Setup Postgres (optional)

```bash
make test target=[postgres|redshift|...] [models=...] [seeds=...]
```
Docker and `docker-compose` are both used in testing. Specific instructions for your OS can be found [here](https://docs.docker.com/get-docker/).

Postgres offers the easiest way to test most `dbt-utils` functionality today. Its tests are the fastest to run, and the easiest to set up. To run the Postgres integration tests, you'll have to do one extra step of setting up the test database:

```shell
make setup-db
```
or, alternatively:
```shell
docker-compose up --detach postgres
```

### Setup virtual environment

We strongly recommend using virtual environments when developing code in `dbt-utils`, and we suggest creating the virtualenv in the root of the `dbt-utils` repository. To create a new virtualenv, run:
```shell
python3 -m venv env
source env/bin/activate
```

This will create and activate a new Python virtual environment.
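Activation works by prepending the virtualenv's `bin/` directory to `PATH`, so `python` and `pip` resolve inside the environment rather than system-wide. A quick way to confirm, using a throwaway environment (the path is illustrative):

```shell
# Create and activate a throwaway virtualenv, then check where
# `python` resolves; `deactivate` restores the original PATH.
python3 -m venv /tmp/dbt-utils-demo-env
. /tmp/dbt-utils-demo-env/bin/activate
command -v python    # resolves to /tmp/dbt-utils-demo-env/bin/python
deactivate
```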

### Installation for development

First make sure that you set up your virtual environment as described above. Also ensure you have the latest version of pip installed with `pip install --upgrade pip`. Next, install `dbt-core` (and its dependencies) with:

```shell
make dev target=[postgres|redshift|...]
# or
pip install --pre dbt-[postgres|redshift|...] -r dev-requirements.txt
```

or, more specifically:

```bash
make test target=postgres models=sql.test_star seeds=sql.data_star
```
```shell
make dev target=postgres
# or
pip install --pre dbt-postgres -r dev-requirements.txt
```

or, to test against all targets:
### Run the integration tests

To run all the integration tests on your local machine like they will get run in the CI (using CircleCI):

```shell
make test target=postgres
```

```bash
make test-all [models=...] [seeds=...]
```
or, to run tests for a single model:
```shell
make test target=[postgres|redshift|...] [models=...] [seeds=...]
```

or, more specifically:

```shell
make test target=postgres models=sql.test_star seeds=sql.data_star
```

Specifying `models=` and `seeds=` is optional; however, _if_ you specify `seeds`, you also have to specify `models`.
@@ -26,6 +93,17 @@ Where possible, targets are being run in docker containers (this works for Postg

### Creating a new integration test

#### Set up profiles
Do either one of the following:
1. Use `DBT_PROFILES_DIR`
```shell
cp integration_tests/ci/sample.profiles.yml integration_tests/profiles.yml
export DBT_PROFILES_DIR=$(cd integration_tests && pwd)
```
2. Use `~/.dbt/profiles.yml`
- Copy contents from `integration_tests/ci/sample.profiles.yml` into `~/.dbt/profiles.yml`.

#### Add your integration test
This directory contains an example dbt project which tests the macros in the `dbt-utils` package. An integration test typically involves making (1) a new seed file, (2) a new model file, and (3) a generic test to assert the anticipated behaviour.

For an example integration test, check out the tests for the `get_url_parameter` macro:
@@ -35,13 +113,20 @@
3. [Model to test the macro](https://github.com/fishtown-analytics/dbt-utils/blob/master/integration_tests/models/web/test_urls.sql)
4. [A generic test to assert the macro works as expected](https://github.com/fishtown-analytics/dbt-utils/blob/master/integration_tests/models/web/schema.yml#L2)


Once you've added all of these files, you should be able to run:
Assuming you are in the `integration_tests` folder,
```shell
dbt deps --target {your_target}
dbt seed --target {your_target}
dbt run --target {your_target} --model {your_model_name}
dbt test --target {your_target} --model {your_model_name}
```
```shell
$ dbt deps
$ dbt seed
$ dbt run --model {your_model_name}
$ dbt test --model {your_model_name}
```
Alternatively:
```shell
dbt deps --target {your_target}
dbt build --target {your_target} --select +{your_model_name}
```
If the tests all pass, then you're good to go! All tests will be run automatically when you create a PR against this repo.
10 changes: 0 additions & 10 deletions run_functional_test.sh
@@ -1,13 +1,3 @@
#!/bin/bash
VENV="venv/bin/activate"

if [[ ! -f $VENV ]]; then
python3 -m venv venv
. $VENV

pip install --upgrade pip setuptools
pip install --pre "dbt-$1" -r dev-requirements.txt
fi

. $VENV
python3 -m pytest tests/functional --profile $1
