Feature: Export the report in a SonarQube-specific format. #67
Comments
Hey @RalfKow, we were thinking about generating a coverage report in some more standard formats, e.g. Cobertura (#39) or even SonarQube, but we haven't found time for this so far. However, I believe you'd be interested in generating issues rather than coverage reports, right? Do I understand correctly that you'd like the columns with no documentation to be marked in the SQL file where the model is described? Or did you mean something else?
Hey @sweco, thanks for your reply. From my perspective, these are two independent things. In general, I would love to have tooling for SQL/dbt similar to what I have in Python. Let's take Python as an example.
For SQL files I basically have
In the end, I would like to join the coverage reports of Python and SQL to calculate the coverage of the whole project.
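Joining two reports into a project-wide number could be done by pooling the raw covered/total counts rather than averaging the two percentages (averaging would weight a small SQL codebase the same as a large Python one). A minimal sketch of that calculation — the function name and the flat covered/total inputs are illustrative, not part of dbt-coverage:

```python
def combined_coverage(covered_a: int, total_a: int,
                      covered_b: int, total_b: int) -> float:
    """Combine two coverage reports (e.g. Python and SQL) into one ratio
    by pooling the covered/total counts instead of averaging percentages."""
    total = total_a + total_b
    return (covered_a + covered_b) / total if total else 0.0
```

For example, 80/100 covered in Python and 30/50 in SQL pools to 110/150, not the 70% a naive average of 80% and 60% would give.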
I see, so you'd like to be able to generate a coverage report in a different format (SonarQube), right? If that's the case, it might indeed make sense to do. 😊 However, we won't have time for it in the near future, but we're open to contributions. It might be sufficient to add a method that exports the data in the particular format, similarly to `dbt_coverage/__init__.py`, line 353 (commit 5926212).
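The suggested approach — a new export method sitting alongside the existing ones — could be sketched as below. This uses a minimal stand-in class: the real `CoverageReport` in `dbt_coverage/__init__.py` has more fields and methods, and `to_sonarqube_json` is a hypothetical name, not an existing API:

```python
import json


class CoverageReport:
    """Minimal stand-in for dbt-coverage's CoverageReport class,
    reduced to the fields needed to illustrate a new exporter."""

    def __init__(self, covered: int, total: int):
        self.covered = covered
        self.total = total

    def to_dict(self) -> dict:
        # Mirrors the idea of the existing dict export.
        return {
            "covered": self.covered,
            "total": self.total,
            "coverage": self.covered / self.total if self.total else 0.0,
        }

    def to_sonarqube_json(self) -> str:
        # Hypothetical new exporter added next to the existing ones;
        # the real implementation would emit SonarQube's expected schema.
        return json.dumps(self.to_dict(), indent=2)
```

The point is only the pattern: each output format is one method that walks the same report data, so adding SonarQube support shouldn't require touching the coverage computation itself.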
General situation
I work with dbt and Python. I use SonarQube to check my Python code, but I can't check the SQL created with dbt. Is there an option to generate a report according to SonarQube's required standards? https://docs.sonarsource.com/sonarqube/9.9/analyzing-source-code/importing-external-issues/generic-issue-import-format/
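For reference, the linked SonarQube 9.9 docs describe a JSON payload along the lines built below. Field names follow that documentation page; the `engineId`, `ruleId`, message, and file path are placeholder values chosen for illustration:

```python
import json

# Minimal payload in SonarQube's generic issue import format
# (per the 9.9 docs linked above); ids and paths are placeholders.
payload = {
    "issues": [
        {
            "engineId": "dbt-coverage",
            "ruleId": "undocumented-column",
            "severity": "MINOR",
            "type": "CODE_SMELL",
            "primaryLocation": {
                "message": "Column amount has no documentation",
                "filePath": "models/orders.sql",
                "textRange": {"startLine": 1},
            },
        }
    ]
}

print(json.dumps(payload, indent=2))
```

Such a file would then be handed to the scanner via the external-issues report property described on that docs page.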