
Finalize second batch of Resources tests #83

Merged: 2 commits into master from resources-batch2, Feb 28, 2018

Conversation

@seav (Contributor) commented on Feb 17, 2018

Proposed changes in this pull request

Add automated functional tests for the remaining Resources test cases as described in the Functional Test Cases spreadsheet:

  • Update package version to 0.6.0.
  • Add a resources util "mixin" class to hold common util methods (a rough sketch of the idea follows this list).
  • Add 8 tests that collectively verify 18 test cases.
  • Update 1 existing test to better match its test script.
  • Add 3 test files for spatial resources.
  • Add 11 test fixtures for dummy resources for test case #A26.
  • Delete obsolete tests and test methods from Outreachy project.
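
For illustration, the "resources util mixin" mentioned above would hold Selenium helpers shared across the Resources test classes. The sketch below is only an assumed outline; the class name, method names, locators, and the self.wd attribute are hypothetical and not taken from the actual cadasta-test code.

    # Assumed sketch of a shared util mixin for the Resources functional tests.
    # Everything here (names, locators, the self.wd WebDriver attribute) is illustrative.
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support import expected_conditions as EC
    from selenium.webdriver.support.ui import WebDriverWait


    class ResourcesUtilMixin:
        """Common helpers mixed into Resources test classes."""

        def wait_for_resource_list(self, timeout=10):
            # Wait until the resources table has rendered on the page.
            return WebDriverWait(self.wd, timeout).until(
                EC.presence_of_element_located(
                    (By.CSS_SELECTOR, 'table.resource-list')))

        def get_resource_titles(self):
            # Collect the visible resource titles from the current list page.
            cells = self.wd.find_elements(
                By.XPATH, '//table[contains(@class, "resource-list")]//td[1]')
            return [cell.text for cell in cells]

Test classes would then inherit this mixin alongside whatever base functional-test class the suite already uses, so the 8 new tests can share setup and lookup code instead of repeating it.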

When should this PR be merged?

When convenient.

Risks

No risks foreseen. This PR should only affect QA, not actual platform functionality.

Follow-up actions

Release, then update cadasta-platform.

Checklist (for reviewing)

General

  • Is this PR explained thoroughly? All code changes must be accounted for in the PR description.
  • Is the PR labeled correctly? It should have the migration label if a new migration is added.
  • Is the risk level assessment sufficient? The risks section should contain all risks that might be introduced with the PR and which actions we need to take to mitigate these risks. Possible risks are database migrations, new libraries that need to be installed or changes to deployment scripts.

Functionality

  • Are all requirements met? Compare implemented functionality with the requirements specification.
  • Does the UI work as expected? There should be no JavaScript errors in the console; all resources should load. There should be no unexpected errors. Deliberately try to break the feature to find out if there are corner cases that are not handled.

Code

  • Do you fully understand the introduced changes to the code? If not ask for clarification, it might uncover ways to solve a problem in a more elegant and efficient way.
  • Does the PR introduce any inefficient database requests? Use the debug server to check for duplicate requests.
  • Are all necessary strings marked for translation? All strings that are exposed to users via the UI must be marked for translation.

Tests

  • Are there sufficient test cases? Ensure that all components are tested individually; models, forms, and serializers should be tested in isolation even if a test for a view covers these components.
  • If this is a bug fix, are tests for the issue in place? There must be a test case for the bug to ensure the issue won’t regress. Make sure that the tests break without the new code to fix the issue.
  • If this is a new feature or a significant change to an existing feature, has the manual testing spreadsheet been updated with instructions for manual testing?

Security

  • Confirm this PR doesn't commit any keys, passwords, tokens, usernames, or other secrets.
  • Are all UI and API inputs run through forms or serializers?
  • Are all external inputs validated and sanitized appropriately?
  • Does all branching logic have a default case?
  • Does this solution handle outliers and edge cases gracefully?
  • Are all external communications secured and restricted to SSL?

Documentation

  • Are changes to the UI documented in the platform docs? If this PR introduces new platform site functionality or changes existing ones, the changes must be documented in the Cadasta Platform Documentation.
  • Are changes to the API documented in the API docs? If this PR introduces new API functionality or changes existing ones, the changes must be documented in the API docs.
  • Are reusable components documented? If this PR introduces components that are relevant to other developers (for instance a mixin for a view or a generic form) they should be documented in the Wiki.

@oliverroick (Member) left a comment


One test fails consistently when using CADASTA_TEST_WEBDRIVER=Firefox.
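
(CADASTA_TEST_WEBDRIVER selects which browser the Selenium suite drives. The exact wiring lives in the cadasta-test harness; the fixture below is only an assumed sketch of how such an environment variable could be read, with the default browser chosen arbitrarily.)

    # Assumed sketch only; the real cadasta-test conftest.py may differ.
    import os

    import pytest
    from selenium import webdriver


    @pytest.fixture
    def wd():
        # Pick the browser from the CADASTA_TEST_WEBDRIVER environment variable.
        # Defaulting to Chrome here is an arbitrary assumption.
        name = os.environ.get('CADASTA_TEST_WEBDRIVER', 'Chrome')
        driver = webdriver.Firefox() if name == 'Firefox' else webdriver.Chrome()
        yield driver
        driver.quit()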

@@ -566,6 +851,66 @@ def test_file_with_nonacceptable_mime_type_cannot_be_uploaded(
'//*[contains(@class, "file-well")]'
'//*[normalize-space()="File type not allowed."]')

@pytest.mark.uploads
def test_multiple_existing_resources_from_different_pages_can_be_attached(
@oliverroick (Member):

This test fails with selenium.common.exceptions.WebDriverException: Message: Failed to decode response from marionette.

Following that there's another error during tear down (ERROR at teardown of TestAttaching.test_multiple_existing_resources_from_different_pages_can_be_attached): selenium.common.exceptions.SessionNotCreatedException: Message: Tried to run command without establishing a connection

@seav (Contributor, Author):

This can be solved by increasing the VM's RAM to 3GB. When I ran the test with the default 2GB, I consistently got the WebDriverException error; it went away once I increased the size to 3GB. I noticed this because the test didn't fail when run on Travis, and this is the longest test we have (so far) because of the need to set up and then clean up 11 resource files, which implies that the test needs more computing resources. I don't know why no out-of-memory error message or anything similar appears.

@oliverroick (Member) commented on Feb 21, 2018:

As discussed in our call, try to use fixtures instead of uploading all the files.

@seav (Contributor, Author):

I've added a commit to the branch that removes the 11 dummy resources, replaces them with 11 test fixture objects of the resources.resource model, and updates the test to match.
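
For context, Django fixtures serialize objects keyed by app_label.model, so each of the 11 entries would roughly take the shape sketched below. The pk and field names here are assumptions made for illustration, not the project's actual schema.

    # Assumed shape of a single fixture entry for the resources.resource model;
    # field names and values are illustrative only.
    dummy_resource_fixture = {
        'model': 'resources.resource',
        'pk': 'dummy-resource-01',  # pk type and value are assumptions
        'fields': {
            'name': 'Dummy resource 01',
            'project': 'resources-test-project',  # hypothetical related project
            'original_file': 'dummy_resource_01.txt',
        },
    }

Loading the dummy resources as fixtures avoids driving 11 file uploads through the browser, which is what made the test slow and memory-hungry in the first place.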

@oliverroick merged commit f78a33d into master on Feb 28, 2018
@oliverroick deleted the resources-batch2 branch on Feb 28, 2018, 20:39