
cloudflare_bot_management resource disabling un-managed feature "Block AI Scrapers and Crawlers" #3673

Open
3 tasks done
pnatel opened this issue Aug 17, 2024 · 3 comments
Labels
kind/bug Categorizes issue or PR as related to a bug. service/bot_management Categorizes issue or PR as related to the Bot Management service. triage/accepted Indicates an issue or PR is ready to be actively worked on.

Comments


pnatel commented Aug 17, 2024

Confirmation

  • This is a bug with an existing resource and is not a feature request or enhancement. Feature requests should be submitted with Cloudflare Support or your account team.
  • I have searched the issue tracker and my issue isn't already found.
  • I have replicated my issue using the latest version of the provider and it is still present.

Terraform and Cloudflare provider version

Terraform v1.9.2
on darwin_arm64

  • provider registry.terraform.io/cloudflare/cloudflare v4.39.0
  • provider registry.terraform.io/hashicorp/hcp v0.94.1

Affected resource(s)

resource "cloudflare_bot_management"

Terraform configuration files

resource "cloudflare_bot_management" "bot_management" {
  enable_js              = true
  fight_mode             = true
  zone_id                = var.cloudflare_zone_id
  auto_update_model      = true
  suppress_session_score = false
}

Link to debug output

N/A

Panic output

N/A

Expected output

When updating the resource "cloudflare_bot_management", it should only change configuration within its own scope.

Actual output

The previously UI-activated "Block AI Scrapers and Crawlers" option gets disabled after updating the resource "cloudflare_bot_management".
[screenshot attached]

Steps to reproduce

  1. Have the resource "cloudflare_bot_management" previously applied.
  2. Manually enable the "Block AI Scrapers and Crawlers" option in the dashboard (there is no resource attribute to enable this via Terraform).
  3. Apply a change to the resource "cloudflare_bot_management" (update any optional attribute).
  4. Observe that the "Block AI Scrapers and Crawlers" option is now disabled (unwanted).
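For step 3, a minimal sketch of a change that triggers the behaviour, relative to the configuration above — any optional attribute works; here `suppress_session_score` is flipped:

```hcl
# Flipping any optional attribute forces an update of the
# cloudflare_bot_management resource; after the apply, the UI-enabled
# "Block AI Scrapers and Crawlers" toggle is observed to be off.
resource "cloudflare_bot_management" "bot_management" {
  enable_js              = true
  fight_mode             = true
  zone_id                = var.cloudflare_zone_id
  auto_update_model      = true
  suppress_session_score = true # changed from false to trigger an update
}
```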

Additional factoids

I believe the "Block AI Scrapers and Crawlers" option should be manageable through the resource "cloudflare_bot_management"; if that is not intended, the resource should at least not interfere with the option when it has been enabled in the UI.
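Until the option is exposed in the provider schema, one possible stopgap (a sketch under stated assumptions, not a tested fix) is to re-apply the UI setting through the Cloudflare API after each update of the resource. The zone-level `bot_management` endpoint exists, but the `ai_bots_protection` field name and accepted value below are assumptions and should be verified against the current API documentation; the `hashicorp/null` provider must also be declared:

```hcl
# Hedged workaround sketch: after every change to the bot-management
# resource, call the Cloudflare API to re-enable the AI-scraper block.
# ASSUMPTION: the PUT body field is named "ai_bots_protection" and
# accepts "block"; check the current API docs before relying on this.
resource "null_resource" "reenable_ai_block" {
  triggers = {
    bot_management = cloudflare_bot_management.bot_management.id
  }

  provisioner "local-exec" {
    # CLOUDFLARE_API_TOKEN is expected in the environment.
    command = <<-EOT
      curl -s -X PUT \
        "https://api.cloudflare.com/client/v4/zones/${var.cloudflare_zone_id}/bot_management" \
        -H "Authorization: Bearer $CLOUDFLARE_API_TOKEN" \
        -H "Content-Type: application/json" \
        --data '{"ai_bots_protection": "block"}'
    EOT
  }
}
```

This only papers over the drift; the proper fix is for the provider to read and preserve (or manage) the field so updates do not silently reset it.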

References

No response

@pnatel pnatel added kind/bug Categorizes issue or PR as related to a bug. needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. labels Aug 17, 2024

Community Note

Voting for Prioritization

  • Please vote on this issue by adding a 👍 reaction to the original post to help the community and maintainers prioritize this request.
  • Please do not leave "+1" or other comments that do not add relevant new information or questions; they generate extra noise for issue followers and do not help prioritize the request.

Volunteering to Work on This Issue

  • If you are interested in working on this issue, please leave a comment.
  • If this would be your first contribution, please review the contribution guide.


Thank you for reporting this issue! For maintainers to dig into issues, all reports must include the entirety of the TF_LOG=DEBUG output. The only parts that should be redacted are your user credentials in the X-Auth-Key, X-Auth-Email and Authorization HTTP headers. Details such as zone or account identifiers are not considered sensitive but can be redacted if you are very cautious. This log file provides additional context from Terraform, the provider, and the Cloudflare API that helps in debugging issues. Without it, maintainers are very limited in what they can do, and diagnosis efforts may be hampered.

This issue has been marked with triage/needs-information and is unlikely to receive maintainer attention until the log file is provided, making this a complete bug report.

1 similar comment

@github-actions github-actions bot added triage/needs-information Indicates an issue needs more information in order to work on it. and removed needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. labels Aug 17, 2024
@jacobbednarz jacobbednarz added triage/accepted Indicates an issue or PR is ready to be actively worked on. service/bot_management Categorizes issue or PR as related to the Bot Management service. and removed triage/needs-information Indicates an issue needs more information in order to work on it. labels Aug 29, 2024
Projects
None yet
Development

No branches or pull requests

2 participants