Professional Cloud Architect Exam Questions

Professional Cloud Architect Exam - Question 15


Your application needs to process credit card transactions. You want the smallest scope of Payment Card Industry (PCI) compliance without compromising the ability to analyze transactional data and trends relating to which payment methods are used.

How should you design your architecture?

Correct Answer: A

To minimize the scope of PCI compliance while still allowing for the analysis of transactional data and trends, creating a tokenizer service and storing only tokenized data is the most effective approach. Tokenization replaces sensitive data such as credit card numbers with unique tokens that do not carry any exploitable value if breached. This reduces the scope of PCI compliance to just the tokenization service, thereby limiting the need to safeguard the entire system. Additionally, the tokenized data can still be used for analysis in terms of transaction trends and payment methods without compromising security.
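The approach described above can be sketched in code. This is a minimal illustrative sketch, not production PCI code: the `Tokenizer` class, its in-memory vault, and the truncated HMAC token format are all assumptions chosen for demonstration. The point is that only the tokenizer and its vault hold real PANs, while analytics runs entirely on tokenized records.

```python
import hmac
import hashlib
import secrets

class Tokenizer:
    """Illustrative tokenizer service (hypothetical, not production PCI code).

    Uses a keyed HMAC so the same card number always maps to the same token.
    Only this class's key and vault hold real PANs, so only this component
    falls under PCI scope.
    """
    def __init__(self, key=None):
        self._key = key or secrets.token_bytes(32)
        self._vault = {}  # token -> PAN; lives only inside the PCI boundary

    def tokenize(self, pan: str) -> str:
        token = hmac.new(self._key, pan.encode(), hashlib.sha256).hexdigest()[:16]
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the payment processor, inside the PCI boundary, calls this.
        return self._vault[token]

# The analytics side sees only tokens plus non-sensitive fields.
tok = Tokenizer()
transactions = [
    {"card_token": tok.tokenize("4111111111111111"), "method": "visa", "amount": 42.00},
    {"card_token": tok.tokenize("4111111111111111"), "method": "visa", "amount": 13.50},
    {"card_token": tok.tokenize("5500005555555559"), "method": "mastercard", "amount": 9.99},
]

# Trend analysis on tokenized data: total spend per payment method.
totals = {}
for t in transactions:
    totals[t["method"]] = totals.get(t["method"], 0) + t["amount"]
print(totals)  # {'visa': 55.5, 'mastercard': 9.99}
```

Note that the payment-method and amount fields are never tokenized, which is why trend analysis still works on the stored data.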

Discussion

17 comments
AD2AD4 (Option: A)
May 28, 2020

Final decision: go with Option A. I have done a PCI DSS audit for my project, and that's the best-suited case. 100% sure: use tokenized data instead of the actual card number.

Musk
Jun 13, 2020

But with A you cannot extract statistics. That is the second requirement.

Musk
Jul 31, 2020

Thinking about it more, I think you can, because you are only tokenizing the sensitive data, not the transaction type.

Arimaverick
Jan 6, 2021

Analyzing transactions does not require the credit card number, I guess; only the transaction amount or balance is needed. We do something similar with transactional data that has tokenized PII, so the card number can be tokenized. So the answer should be A.

RitwickKumar
Aug 17, 2022

You can, because the generated token for a given credit card would generally be the same (although some approaches produce a different token for the same sensitive input each time). The only thing you won't know is the actual card number, which is not required for trend analysis. When the analysis involves referential integrity, tokenization becomes more challenging, but once data is tokenized correctly you should be able to perform any kind of analysis.
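As a concrete illustration of the deterministic case described here, a keyed HMAC yields the same token for the same PAN every time, so referential integrity (for example, joining months of transaction data on the card token) survives tokenization. The key, data, and field names below are hypothetical.

```python
import hmac
import hashlib

KEY = b"demo-key"  # hypothetical; a real service would keep this in a KMS/HSM

def card_token(pan: str) -> str:
    # Deterministic: the same PAN always yields the same token,
    # so tokenized datasets can still be joined on the card.
    return hmac.new(KEY, pan.encode(), hashlib.sha256).hexdigest()[:16]

jan = [{"card": card_token("4111111111111111"), "amount": 10.0}]
feb = [{"card": card_token("4111111111111111"), "amount": 25.0}]

# Referential integrity after tokenization: find repeat customers by token.
jan_cards = {row["card"] for row in jan}
repeat_customers = [row for row in feb if row["card"] in jan_cards]
print(len(repeat_customers))  # 1
```

A fully random (non-deterministic) tokenizer would break this join, which is the trade-off mentioned above.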

AzureDP900
Oct 16, 2022

I agree. A is the best option

omermahgoub (Option: A)
Dec 20, 2022

To minimize the scope of Payment Card Industry (PCI) compliance while still allowing for the analysis of transactional data and trends related to payment methods, you should consider using a tokenizer service and storing only tokenized data, as described in option A. Tokenization is a process of replacing sensitive data, such as credit card numbers, with unique, randomly-generated tokens that cannot be used for fraudulent purposes. By using a tokenizer service and storing only tokenized data, you can reduce the scope of PCI compliance to only the tokenization service, rather than the entire application. This can help minimize the amount of sensitive data that needs to be protected and reduce the overall compliance burden.

oxfordcommaa
Jan 21, 2023

man, this is an amazing answer. props

ccpmad
Jun 9, 2024

Thanks to ChatGPT. Are you seriously saying that?

Saxena_Vibhor
Dec 29, 2023

Nicely explained, thanks.

KjChen (Option: A)
Nov 10, 2022

https://cloud.google.com/architecture/tokenizing-sensitive-cardholder-data-for-pci-dss

vincy2202 (Option: A)
Dec 24, 2021

A is the correct answer https://cloud.google.com/architecture/tokenizing-sensitive-cardholder-data-for-pci-dss#a_service_for_handling_sensitive_information

haroldbenites (Option: A)
Dec 3, 2021

Go for A

holerina (Option: A)
Sep 20, 2022

The correct answer is A: use tokenization.

sjmsummer (Option: A)
Jan 15, 2022

I chose A. But why is C not good?

ryzior (Option: A)
Apr 3, 2022

I think it should be A and C; the paper clearly states that proper network segmentation is still required to separate the vault and token servers from the rest of a flat network.

Nirca (Option: A)
Apr 20, 2022

The mandatory piece in PCI is having an encryption/decryption system. Data must not be stored as-is with the PAN. So A IS A MUST. The rest are nice to have.

abirroy (Option: A)
Sep 15, 2022

Correct answer A

BiddlyBdoyng (Option: A)
Sep 25, 2022

B appears the most thorough, but the question asks for the smallest compliance scope, and network segmentation is not a must. Tokenization is simpler. C is similar to B: more than required. D and E do not address the problem.

minmin2020 (Option: A)
Oct 13, 2022

A. Create a tokenizer service and store only tokenized data

andreavale (Option: A)
Nov 5, 2022

ok for A

nocrush (Option: A)
Jul 26, 2023

A is my best option

heretolearnazure (Option: A)
Aug 22, 2023

Tokenizing is the best way to protect PCI information.

devinss (Option: B)
Sep 1, 2023

Not sure why everyone is 100% agreed on A. To limit PCI DSS scope, data handling should be done in a separate project with very limited access; only in that project should tokenization be done and the results made available for analytics. The first requirement, however, is isolating the payment and tokenization code in a separate project. The answer should be B.

hzaoui (Option: A)
Jan 12, 2024

A is correct