
Professional Data Engineer Exam - Question 194


An online brokerage company requires a high volume trade processing architecture. You need to create a secure queuing system that triggers jobs. The jobs will run in Google Cloud and call the company's Python API to execute trades. You need to efficiently implement a solution. What should you do?

Correct Answer: A

To build a secure, efficient queuing system that triggers jobs calling the company's Python API to execute trades, use a Pub/Sub push subscription to trigger a Cloud Function. Pub/Sub absorbs high-volume message ingestion, the push subscription delivers each message as soon as it arrives, and the Cloud Function executes the trade by calling the Python API. Because both services are serverless, the design offers low latency, scales automatically, and keeps costs proportional to load.
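As a rough illustration of this approach, a Pub/Sub-triggered Cloud Function might look like the Python sketch below. The function name, endpoint URL, and payload shape are assumptions for illustration; they are not part of the question.

```python
import base64
import json

import requests  # third-party; would go in the function's requirements.txt

# Hypothetical endpoint for the company's internal Python trade API.
TRADE_API_URL = "https://trade-api.internal.example.com/v1/trades"


def execute_trade(event, context):
    """Background Cloud Function triggered by a Pub/Sub message.

    event["data"] is the base64-encoded message body published to the
    trade topic; context carries event metadata such as the event ID.
    """
    order = json.loads(base64.b64decode(event["data"]).decode("utf-8"))

    # Forward the order to the company's trade API.
    response = requests.post(TRADE_API_URL, json=order, timeout=10)

    # Raising on a failed call makes this invocation fail, so Pub/Sub can
    # redeliver the message if retries are enabled on the function.
    response.raise_for_status()
```

Publishing a message to the trade topic then invokes the function once per order, with Pub/Sub handling buffering and redelivery.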

Discussion

15 comments
lucaluca1982Option: A
Mar 26, 2023

A and D are both good. I go for A because we have high volume, and A is easy to scale and optimizes cost.

TNT87Option: A
Sep 8, 2022

Ans A https://cloud.google.com/functions/docs/calling/pubsub#deployment

GCPCloudArchitectUserOption: A
Nov 10, 2022

Because the trading platform requires secure transmission to the queuing system. If you use Cloud Composer, then we need some other job to trigger Composer … would that be the Cloud Composer API or a Cloud Function?

musumusuOption: A
Feb 19, 2023

Answer A: assume the company wants to buy immediately, in the same second, when a stock goes up or down. Market data is connected to Pub/Sub as a sink; each message is then immediately pushed to the subscriber (a Cloud Function), which calls their Python API (an internal application) to make the purchase.
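To illustrate the producer side of that flow, a minimal publisher using the google-cloud-pubsub client library might look like this sketch; the project ID, topic name, and order fields are hypothetical.

```python
import json

from google.cloud import pubsub_v1

# "my-project" and "trade-orders" are hypothetical names.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "trade-orders")

# A hypothetical trade order; the real message schema is up to the company.
order = {"symbol": "GOOG", "side": "BUY", "quantity": 10}

# Pub/Sub messages are raw bytes; publish() returns a future that
# resolves to the server-assigned message ID.
future = publisher.publish(topic_path, json.dumps(order).encode("utf-8"))
print(f"Published message {future.result()}")
```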

AWSandeepOption: A
Sep 3, 2022

A. Use a Pub/Sub push subscription to trigger a Cloud Function to pass the data to the Python API.

zellckOption: A
Nov 28, 2022

A is the answer.

squishy_fishyOption: D
Oct 23, 2023

The answer is D. At work we use solution A (Cloud Functions) for low volumes of Pub/Sub messages, and solution D (Composer) for high volumes.

soichirokawaOption: A
Sep 4, 2022

A might be enough. Cloud Composer would be overkill.

YorelNationOption: A
Sep 6, 2022

A, because D has stupidly high latency.

kajitsuOption: D
Jul 1, 2024

D is the answer.

PhuocTOption: A
Sep 2, 2022

A makes more sense to me.

ducc
Sep 3, 2022

Composer supports exception handling and retries for complex pipelines. D might be correct.
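For context on that point, task-level retries in Composer come from Airflow. A minimal DAG sketch with retry settings might look like this; the DAG ID, task, and retry values are illustrative assumptions.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def call_trade_api(**kwargs):
    # Placeholder for the call to the company's Python trade API.
    pass


with DAG(
    dag_id="trade_pipeline",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,             # triggered externally, not on a schedule
    catchup=False,
    default_args={
        "retries": 3,                   # rerun a failed task up to 3 times
        "retry_delay": timedelta(seconds=30),
    },
) as dag:
    PythonOperator(
        task_id="execute_trade",
        python_callable=call_trade_api,
    )
```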

duccOption: D
Sep 3, 2022

D is the approach Google recommends, IMO.

squishy_fishy
Oct 23, 2023

I agree. At work we use solution A (Cloud Functions) for low volumes of Pub/Sub messages, and Composer for high volumes.

nwkOption: A
Sep 5, 2022

Vote A; I can't see the need for Composer.

TNT87
Sep 29, 2022

https://cloud.google.com/functions/docs/calling/pubsub

AzureDP900Option: A
Jan 2, 2023

A. Use a Pub/Sub push subscription to trigger a Cloud Function to pass the data to the Python API.