Exam CAS-004
Question 106

A development team created a mobile application that contacts a company's back-end APIs housed in a PaaS environment. The APIs have been experiencing high processor utilization due to scraping activities. The security engineer needs to recommend a solution that will prevent and remedy the behavior.

Which of the following would BEST safeguard the APIs? (Choose two.)

A. Bot protection
B. OAuth 2.0
C. Input validation
D. Autoscaling endpoints
E. Rate limiting
F. CSRF protection

    Correct Answer: A, E

    To safeguard the APIs from scraping activities that lead to high processor utilization, two effective measures can be implemented. Bot protection helps in detecting and blocking automated bots that attempt to scrape data, thereby reducing unnecessary load on the APIs. Rate limiting controls the number of requests allowed from a single client within a specific timeframe, which restricts excessive scraping activities and mitigates high processor utilization. These two measures directly address the underlying problem of scraping activities.
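To make the rate-limiting concept above concrete, here is a minimal sketch of a fixed-window limiter keyed by client identifier. It is a generic Python illustration rather than any particular PaaS feature; the window length and request ceiling are arbitrary values chosen for the example.

```python
import time
from collections import defaultdict

# Illustrative values only, not recommendations.
WINDOW_SECONDS = 60
MAX_REQUESTS = 100

# client_id (e.g., API key or source IP) -> [window_start, request_count]
_counters = defaultdict(lambda: [0.0, 0])

def allow_request(client_id: str) -> bool:
    """Return True if the client is within its per-window request budget."""
    now = time.time()
    window_start, count = _counters[client_id]
    if now - window_start >= WINDOW_SECONDS:
        # Start a new window for this client.
        _counters[client_id] = [now, 1]
        return True
    if count < MAX_REQUESTS:
        _counters[client_id][1] = count + 1
        return True
    # Budget exhausted: the API would typically respond with HTTP 429 (Too Many Requests).
    return False
```

In practice this logic usually lives in an API gateway or WAF policy rather than application code, but the principle is the same: cap requests per client per time window so scrapers cannot drive up processor utilization.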

Discussion
RevZig67 (Options: AE)

I think the answer is A, E. Bot protection and rate limiting are used to deter scrapers.

Dassler

A for sure; this website itself uses bot protection (a CAPTCHA) to stop web crawlers. I am more inclined towards D than E, but E is possible.

Mr_BuCk3th34D (Options: AE)

Although I might agree that OAuth 2.0 could be an answer as well, since it can help with rate limiting by accepting only authorized traffic, this is not as specific as it should be for the proposed scenario. Bot protection is a security measure that helps prevent automated scraping activities by detecting and blocking malicious bots that attempt to access the APIs. This can help reduce the processor utilization on the APIs and prevent scraping activities from affecting the performance of the system. Rate limiting is a security measure that limits the number of requests that can be made to an API within a given time period. By implementing rate limiting, the security engineer can help prevent scraping activities that may cause high processor utilization on the APIs.

BiteSize (Options: AE)

Bot protection and rate limiting will ensure that bots are not able to leverage the APIs via scraping, and rate limiting will also help keep API usage within normal bounds. CSRF protection will not mitigate the API abuse, input validation doesn't deal with this problem, and autoscaling endpoints have to do with the phones, not the servers; also, that would give the scrapers MORE resources, not prevent the behavior. Source: verifying each answer against ChatGPT, my experience, other test banks, a written book, and weighing the discussion from all users to create a 100% accurate guide for myself before I take the exam. (It isn't easy because of the time needed, but it is doing my due diligence.)

cyspec (Options: AE)

Key word is scraping. Autoscaling doesn't address the underlying problem, which is unauthorised usage.

fb2fcb1 (Options: AE)

A. Bot protection; E. Rate limiting. High processor utilization due to scraping activities suggests that the APIs are being accessed more frequently than normal, potentially by bots. Bot protection (A) can help identify and block malicious bots, thus protecting the APIs from excessive use and scraping activities. Rate limiting (E) is another technique that can help in this scenario: it involves limiting the number of API requests from a particular client within a specified time period, thus preventing any single client from overloading the APIs. While OAuth 2.0 (B) is a protocol for authorization and CSRF protection (F) protects against cross-site request forgery attacks, they aren't specifically useful in mitigating API scraping and high processor utilization. Input validation (C) is always a good practice, but it doesn't necessarily protect against the described problem. Autoscaling endpoints (D) could help manage load, but this does not stop the root cause of the problem, which is the scraping activity.
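For a sense of what "bot protection" means mechanically, here is a rough sketch of one heuristic a bot-mitigation layer might apply before a request reaches the API. Real products combine many signals (CAPTCHA challenges, TLS/browser fingerprinting, behavioral scoring); the header check and the blocklist fragments below are simplified assumptions for illustration only.

```python
# Simplified illustration of a request-screening heuristic; commercial bot-protection
# services use far richer signals (CAPTCHA, fingerprinting, behavioral analysis).
SUSPICIOUS_AGENT_FRAGMENTS = ("python-requests", "scrapy", "curl", "wget")  # assumed examples

def looks_like_bot(headers: dict) -> bool:
    """Flag requests whose headers resemble common scraping clients."""
    user_agent = headers.get("User-Agent", "").lower()
    if not user_agent:
        return True  # a legitimate mobile client would normally send a User-Agent
    return any(fragment in user_agent for fragment in SUSPICIOUS_AGENT_FRAGMENTS)
```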

kycugu (Options: AE)

A. Bot protection; E. Rate limiting. Why? Bot protection and rate limiting are security measures that can help to prevent and reduce scraping activities. OAuth 2.0 is an authorization protocol, input validation is the process of validating user input, autoscaling endpoints is a process of scaling resources up and down, and CSRF protection is a technique for preventing cross-site request forgery.

dangerelchulo (Options: DE)

Rate limiting is already doing the preventing; now you need something to remediate in case they find a way around it, so autoscaling remediates the issue. Autoscaling is an out-of-the-box feature that monitors your workloads and dynamically adjusts capacity to maintain steady and predictable performance at the lowest possible cost.

23169fd (Options: AE)

A. Bot protection: To detect and block automated scraping activities. E. Rate limiting: To control the number of requests made to the API, thus preventing abuse and managing traffic effectively.

tirajvid (Options: DE)

Guys, there are two points: detect and then remedy. So it has to be D (Autoscaling endpoints) and E (Rate limiting).

carrotpie (Options: BE)

A, B, and E are all valid IMO, at least according to this post: https://thysniu.medium.com/protecting-your-apis-from-web-scrapers-9f645a865732. The problem with A is that the naming is not correct; there is no such term as "bot protection" in CompTIA's official resources. But there is a mention of OAuth in the chapter about APIs. So I'd go for B, E.

kycugu (Options: DE)

To safeguard the APIs from scraping activities and prevent high processor utilization, the following options would be the most effective: E (Rate limiting) involves setting a maximum number of requests that can be made to the API within a certain time period, which can help prevent scraping that involves making a large number of requests in a short time. D (Autoscaling endpoints) involves dynamically adjusting the number of instances of the API based on the workload, so that the API can handle a high volume of requests without experiencing high processor utilization.

EZPASS (Options: AE)

I'm also leaning towards AE

[Removed] (Options: AE)

A and E make the most sense since you want to stop bots from using your API and you want to limit the rate of access per customer.

[Removed]

The question also wants you to prevent and remedy the behavior. Taking that into consideration, the answer could easily be D, E or A, E.

Protocol0

The problem I have with D is that you may not need the additional scaling if you get the scraping under control. It's not a solution to just throw more resources without addressing the underlying problem.

beanbag (Options: AB)

I suppose the high resource utilization is a result of the scraping activities from web crawlers. Once bot protection (A) is in place, resource utilization will be managed. I would then choose B (OAuth 2.0), so that only authenticated traffic is processed.
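For completeness, here is a minimal sketch of what "only process authorized traffic" with OAuth 2.0 might look like on the API side: validating a bearer access token before serving the request. This assumes the PyJWT library and an RS256-signed token; the key, audience, and issuer values are placeholders, not anything from the scenario.

```python
import jwt  # PyJWT; assumed to be available in the API's environment

PUBLIC_KEY = "..."         # placeholder: the authorization server's signing key
EXPECTED_AUDIENCE = "..."  # placeholder: this API's identifier
EXPECTED_ISSUER = "..."    # placeholder: the authorization server's issuer URL

def is_authorized(authorization_header: str) -> bool:
    """Accept a request only if it carries a valid OAuth 2.0 bearer token."""
    if not authorization_header.startswith("Bearer "):
        return False
    token = authorization_header.split(" ", 1)[1]
    try:
        jwt.decode(
            token,
            PUBLIC_KEY,
            algorithms=["RS256"],
            audience=EXPECTED_AUDIENCE,
            issuer=EXPECTED_ISSUER,
        )
        return True
    except jwt.InvalidTokenError:
        return False
```

Note that this authorizes clients; on its own it does not throttle an authorized client that scrapes aggressively, which is why the discussion keeps pairing it with rate limiting.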

AlexJacobson (Options: BE)

What do you do to prevent bots from scraping publicly accessible APIs? How to prevent scraping:
- Monitor your logs & traffic patterns; limit access if you see unusual activity.
- Require registration & login.
- Block access from cloud hosting and scraping service IP addresses.
- Make your error message nondescript if you do block.
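As a rough illustration of the "monitor your logs & traffic patterns" point above, the sketch below counts requests per source IP in an access log and flags addresses whose volume exceeds a chosen threshold. The log format (source IP as the first whitespace-separated field) and the threshold are assumptions; in a PaaS environment this would normally be done with the platform's own analytics or a WAF rather than a script.

```python
from collections import Counter

# Illustrative threshold; tune to the API's normal traffic profile.
REQUEST_THRESHOLD = 1000

def flag_heavy_clients(log_path: str) -> list[str]:
    """Return source IPs whose request count exceeds the threshold."""
    counts = Counter()
    with open(log_path) as log_file:
        for line in log_file:
            fields = line.split()
            if fields:
                counts[fields[0]] += 1  # assumes the source IP is the first field
    return [ip for ip, count in counts.items() if count > REQUEST_THRESHOLD]
```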