
Migrate "Robots.txt" module to V4 report #235

Closed

ChristopherJennings opened this issue Apr 25, 2019 · 2 comments
Labels

groomed (Well-defined and ready to be worked on), help wanted (This is an opportunity for anyone to contribute), report migration

Comments

@ChristopherJennings
Contributor

Overview

We should migrate the Robots.txt module to the new V4 report format.

Definition of done

  • Code and scripts are ported to the KenticoInspector.Reports project in a dedicated folder for the report.
  • Scripts are referenced via constants in a static class.
  • Scripts are refactored to return simple results that are mappable to simple classes.
  • Report logic is covered by a unit test for a clean result.
  • Report logic is covered by unit tests for all known dirty results.
  • Useful, non-specific logic is abstracted to services or helpers.

Note: The Class/Table Validation report (and its tests) is a good, simple example of the main concepts; a rough sketch of the pattern is below.
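
For anyone picking this up, here is a minimal sketch of the pattern the checklist describes. All folder, class, and constant names below are illustrative only, not the actual project layout:

```csharp
// Hypothetical shape only; the folder, class, and constant names are examples.
namespace KenticoInspector.Reports.RobotsConfigurationSummary
{
    // Scripts referenced via constants in a static class
    public static class Scripts
    {
        public const string BaseDirectory = "RobotsConfigurationSummary/Scripts/";

        public const string GetSiteInformation = BaseDirectory + "GetSiteInformation.sql";
    }

    // A simple class that script (or HTTP check) results can be mapped onto
    public class RobotsTxtResult
    {
        public string SiteUrl { get; set; }

        public bool RobotsTxtFound { get; set; }
    }
}
```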

Additional Details

  • New name: Robots.txt Configuration Summary
  • Tags: Information, SEO
ChristopherJennings added the groomed, help wanted, and report migration labels Apr 25, 2019
ChristopherJennings added this to the 4.0 milestone Apr 25, 2019
@kentico-anthonym
Member

Looking for guidance on how to handle checking robots.txt for MVC sites specifically.
Are end users expected to keep two saved connections, one for the admin site and another for the MVC site, or should we handle that in the Report.cs file?

The InstanceService.GetInstanceDetails method returns an InstanceDetails object with a Sites property containing site objects with the information needed to target a site's presentation URL, if one exists. My question is whether this should be handled automatically in code, or should the customer be expected to configure KInspector with the correct site URL for this particular report?

For the first scenario, a few assumptions have to be made (rough sketch below):

  • We would make the arbitrary decision of choosing a specific site (if more than one exists) to request the robots.txt file from.
  • If the site is an MVC site, assume it handles requests to a /robots.txt endpoint.

The second scenario would require end users to specify which site URL they will be accessing, in addition to the admin URL.
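
For illustration, scenario 1 could look roughly like this. This is a fragment only; the site object's property names (e.g. PresentationUrl) are assumptions rather than the actual model:

```csharp
// (assumes: using System; using System.Linq; using System.Net.Http;)
// Sketch of scenario 1: pick a site automatically and probe its robots.txt.
var instanceDetails = instanceService.GetInstanceDetails(instance);
var site = instanceDetails.Sites.FirstOrDefault(); // arbitrary pick if more than one site exists

if (site?.PresentationUrl != null)
{
    var robotsUri = new Uri(new Uri(site.PresentationUrl), "/robots.txt");

    using (var httpClient = new HttpClient())
    {
        var response = await httpClient.GetAsync(robotsUri);
        var robotsTxtFound = response.IsSuccessStatusCode; // any 2xx counts as "found"
        // ...map robotsTxtFound onto the report's simple result class
    }
}
```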

@ChristopherJennings
Contributor Author

Right now, let's just assume that one connection equals one site. If they are connecting to an MVC site, I would expect them to enter that URL as the site URL.

Ultimately, we should enable a single connection for multi-site and MVC scenarios, but we'll have to extend the connection and/or report logic to better support multi-instance and MVC setups in the future.

For now though, I'd just check for the robots.txt on the provided site URL.

At some point in the future we'll want to handle this more cleanly. I've started a new issue to track this (#259).
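
To make that concrete, the check against the provided site URL could be as simple as this sketch (method, class, and parameter names are placeholders, not the actual report API):

```csharp
// Minimal sketch of the agreed approach: request robots.txt from the provided
// site URL and report whether it was found.
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class RobotsTxtCheck
{
    // siteUrl is whatever URL the user entered for the connection
    public static async Task<bool> RobotsTxtExistsAsync(string siteUrl)
    {
        var robotsUri = new Uri(new Uri(siteUrl), "/robots.txt");

        using (var httpClient = new HttpClient())
        {
            var response = await httpClient.GetAsync(robotsUri);
            return response.IsSuccessStatusCode; // any 2xx counts as "robots.txt present"
        }
    }
}
```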
