
Google Professional-Cloud-Security-Engineer Google Cloud Certified - Professional Cloud Security Engineer Exam Practice Test

Google Cloud Certified - Professional Cloud Security Engineer Questions and Answers

Question 1

An organization is migrating from their current on-premises productivity software systems to G Suite. Some network security controls were in place that were mandated by a regulatory body in their region for their previous on-premises system. The organization’s risk team wants to ensure that network security controls are maintained and effective in G Suite. A security architect supporting this migration has been asked to ensure that network security controls are in place as part of the new shared responsibility model between the organization and Google Cloud.

What solution would help meet the requirements?

Options:

A.

Ensure that firewall rules are in place to meet the required controls.

B.

Set up Cloud Armor to ensure that network security controls can be managed for G Suite.

C.

Network security is a built-in solution and is Google Cloud’s responsibility for SaaS products like G Suite.

D.

Set up an array of Virtual Private Cloud (VPC) networks to control network security as mandated by the relevant regulation.

Question 2

Your company’s detection and response team requires break-glass access to the Google Cloud organization in the event of a security investigation. At the end of each day, all security group membership is removed. You need to automate user provisioning to a Cloud Identity security group. You have created a service account to provision group memberships. Your solution must follow Google-recommended practices and comply with the principle of least privilege. What should you do?

Options:

A.

In Google Workspace, grant the service account client ID access to the scope, https://www.googleapis.com/auth/admin.directory.group, by using domain-wide delegation, and use a service account key.

B.

In Google Workspace, grant the service account client ID access to the scope, https://www.googleapis.com/auth/admin.directory.group, by using domain-wide delegation. Use Application Default Credentials with the resource-attached service account.

C.

In Google Workspace, grant the Groups Editor role to the service account. Enable the Cloud Identity API. Use a service account key.

D.

In Google Workspace, grant the Groups Editor role to the service account, enable the Cloud Identity API, and use Application Default Credentials with the resource-attached service account.
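
Options B and D rely on Application Default Credentials from a resource-attached service account rather than a downloaded key. The sketch below illustrates that pattern against the Cloud Identity Groups API; the group resource name and member email are hypothetical and would come from your break-glass tooling.

```python
import google.auth
from googleapiclient.discovery import build

# Application Default Credentials resolve to the service account attached to the
# underlying resource (Compute Engine, Cloud Run, Cloud Functions, ...), so no
# service account key file is stored or distributed.
credentials, project_id = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-identity.groups"]
)

# Cloud Identity Groups API client built from the discovery document.
service = build("cloudidentity", "v1", credentials=credentials)

# Hypothetical group ID and member email.
membership = {
    "preferredMemberKey": {"id": "responder@example.com"},
    "roles": [{"name": "MEMBER"}],
}
response = (
    service.groups()
    .memberships()
    .create(parent="groups/01234567abcdefg", body=membership)
    .execute()
)
print(response)
```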

Question 3

Your company is using GSuite and has developed an application meant for internal usage on Google App Engine. You need to make sure that an external user cannot gain access to the application even when an employee’s password has been compromised.

What should you do?

Options:

A.

Enforce 2-factor authentication in GSuite for all users.

B.

Configure Cloud Identity-Aware Proxy for the App Engine Application.

C.

Provision user passwords using GSuite Password Sync.

D.

Configure Cloud VPN between your private network and GCP.
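
Option B refers to Cloud Identity-Aware Proxy. When IAP fronts an App Engine app, the backend can additionally verify the signed header IAP adds to each request; the snippet below is a minimal sketch of that check (the expected audience string is an assumption you would copy from the IAP settings page).

```python
from google.auth.transport import requests
from google.oauth2 import id_token

def verify_iap_jwt(iap_jwt: str, expected_audience: str) -> str:
    """Verify the X-Goog-IAP-JWT-Assertion header and return the user email."""
    decoded = id_token.verify_token(
        iap_jwt,
        requests.Request(),
        audience=expected_audience,
        certs_url="https://www.gstatic.com/iap/verify/public_key",
    )
    return decoded["email"]

# Hypothetical audience for an App Engine app: /projects/PROJECT_NUMBER/apps/PROJECT_ID
# user_email = verify_iap_jwt(request.headers["X-Goog-IAP-JWT-Assertion"],
#                             "/projects/123456789/apps/my-internal-app")
```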

Question 4

You run applications on Cloud Run. You already enabled container analysis for vulnerability scanning. However, you are concerned about the lack of control on the applications that are deployed. You must ensure that only trusted container images are deployed on Cloud Run.

What should you do?

Choose 2 answers

Options:

A.

Enable Binary Authorization on the existing Kubernetes cluster.

B.

Set the organization policy constraint constraints/run.allowedBinaryAuthorizationPolicies to the list of allowed Binary Authorization policy names.

C.

Set the organization policy constraint constraints/compute.trustedImageProjects to the list of projects that contain the trusted container images.

D.

Enable Binary Authorization on the existing Cloud Run service.

E.

Use Cloud Run breakglass to deploy an image that meets the Binary Authorization policy by default.

Question 5

A large financial institution is moving its Big Data analytics to Google Cloud Platform. They want to have maximum control over the encryption process of data stored at rest in BigQuery.

What technique should the institution use?

Options:

A.

Use Cloud Storage as a federated Data Source.

B.

Use a Cloud Hardware Security Module (Cloud HSM).

C.

Customer-managed encryption keys (CMEK).

D.

Customer-supplied encryption keys (CSEK).

Question 6

Which Google Cloud service should you use to enforce access control policies for applications and resources?

Options:

A.

Identity-Aware Proxy

B.

Cloud NAT

C.

Google Cloud Armor

D.

Shielded VMs

Question 7

A customer deploys an application to App Engine and needs to check for Open Web Application Security Project (OWASP) vulnerabilities.

Which service should be used to accomplish this?

Options:

A.

Cloud Armor

B.

Google Cloud Audit Logs

C.

Cloud Security Scanner

D.

Forseti Security

Question 8

Your organization is rolling out a new continuous integration and delivery (CI/CD) process to deploy infrastructure and applications in Google Cloud. Many teams will use their own instances of the CI/CD workflow. It will run on Google Kubernetes Engine (GKE). The CI/CD pipelines must be designed to securely access Google Cloud APIs.

What should you do?

Options:

A.

• 1. Create a dedicated service account for the CI/CD pipelines.

• 2. Run the deployment pipelines in a dedicated node pool in the GKE cluster.

• 3. Use the service account that you created as the identity for the nodes in the pool to authenticate to the Google Cloud APIs.

B.

• 1. Create service accounts for each deployment pipeline.

• 2. Generate private keys for the service accounts.

• 3. Securely store the private keys as Kubernetes secrets accessible only by the pods that run the specific deployment pipeline.

C.

• 1. Create individual service accounts for each deployment pipeline.

• 2. Add an identifier for the pipeline in the service account naming convention.

• 3. Ensure each pipeline runs on dedicated pods.

• 4. Use Workload Identity to map a deployment pipeline pod to a service account.

D.

• 1. Create two service accounts, one for the infrastructure and one for the application deployment.

• 2. Use Workload Identity to let the pods run the two pipelines and authenticate with the service accounts.

• 3. Run the infrastructure and application pipelines in separate namespaces.

Question 9

An engineering team is launching a web application that will be public on the internet. The web application is hosted in multiple GCP regions and will be directed to the respective backend based on the URL request.

Your team wants to avoid exposing the application directly on the internet and wants to deny traffic from a specific list of malicious IP addresses.

Which solution should your team implement to meet these requirements?

Options:

A.

Cloud Armor

B.

Network Load Balancing

C.

SSL Proxy Load Balancing

D.

NAT Gateway

Question 10

An application running on a Compute Engine instance needs to read data from a Cloud Storage bucket. Your team does not allow Cloud Storage buckets to be globally readable and wants to ensure the principle of least privilege.

Which option meets the requirement of your team?

Options:

A.

Create a Cloud Storage ACL that allows read-only access from the Compute Engine instance’s IP address and allows the application to read from the bucket without credentials.

B.

Use a service account with read-only access to the Cloud Storage bucket, and store the credentials to the service account in the config of the application on the Compute Engine instance.

C.

Use a service account with read-only access to the Cloud Storage bucket to retrieve the credentials from the instance metadata.

D.

Encrypt the data in the Cloud Storage bucket using Cloud KMS, and allow the application to decrypt the data with the KMS key.
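
Option C describes letting the application obtain credentials for the attached service account from the instance metadata server (Application Default Credentials) rather than storing a key in its config. A minimal sketch, with a hypothetical bucket and object name:

```python
from google.cloud import storage

# On Compute Engine, the client library obtains an access token for the attached
# service account from the metadata server; no credentials are stored on disk.
client = storage.Client()

bucket = client.bucket("example-app-data")   # hypothetical bucket
blob = bucket.blob("reports/latest.csv")     # hypothetical object
data = blob.download_as_bytes()
print(f"Read {len(data)} bytes using the instance's service account")
```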

Question 11

Your company requires the security and network engineering teams to identify all network anomalies within and across VPCs, internal traffic from VMs to VMs, traffic between end locations on the internet and VMs, and traffic between VMs to Google Cloud services in production. Which method should you use?

Options:

A.

Define an organization policy constraint.

B.

Configure packet mirroring policies.

C.

Enable VPC Flow Logs on the subnet.

D.

Monitor and analyze Cloud Audit Logs.

Question 12

Your organization uses BigQuery to process highly sensitive, structured datasets. Following the "need to know" principle, you need to create the Identity and Access Management (IAM) design to meet the needs of these users:

• Business user: must access curated reports.

• Data engineer: must administer the data lifecycle in the platform.

• Security operator: must review user activity on the data platform.

What should you do?

Options:

A.

Configure Data Access logs for the BigQuery service, and grant the Project Viewer role to security operators.

B.

Generate a CSV data file based on the business user's needs, and send the data to their email addresses.

C.

Create curated tables in a separate dataset and assign the role roles/bigquery.dataViewer.

D.

Set row-based access control based on the "region" column, and filter the record from the United States for data engineers.

Question 13

You want to prevent users from accidentally deleting a Shared VPC host project. Which organization-level policy constraint should you enable?

Options:

A.

compute.restrictSharedVpcHostProjects

B.

compute.restrictXpnProjectLienRemoval

C.

compute.restrictSharedVpcSubnetworks

D.

compute.sharedReservationsOwnerProjects

Question 14

You are on your company's development team. You noticed that your web application hosted in staging on GKE dynamically includes user data in web pages without first properly validating the inputted data. This could allow an attacker to execute gibberish commands and display arbitrary content in a victim user's browser in a production environment.

How should you prevent and fix this vulnerability?

Options:

A.

Use Cloud IAP based on IP address or end-user device attributes to prevent and fix the vulnerability.

B.

Set up an HTTPS load balancer, and then use Cloud Armor for the production environment to prevent the potential XSS attack.

C.

Use Web Security Scanner to validate the usage of an outdated library in the code, and then use a secured version of the included library.

D.

Use Web Security Scanner in staging to simulate an XSS injection attack, and then use a templating system that supports contextual auto-escaping.

Question 15

A customer deployed an application on Compute Engine that takes advantage of the elastic nature of cloud computing.

How can you work with Infrastructure Operations Engineers to best ensure that Windows Compute Engine VMs are up to date with all the latest OS patches?

Options:

A.

Build new base images when patches are available, and use a CI/CD pipeline to rebuild VMs, deploying incrementally.

B.

Federate a Domain Controller into Compute Engine, and roll out weekly patches via Group Policy Object.

C.

Use Deployment Manager to provision updated VMs into new serving Instance Groups (IGs).

D.

Reboot all VMs during the weekly maintenance window and allow the StartUp Script to download the latest patches from the internet.

Question 16

While migrating your organization’s infrastructure to GCP, a large number of users will need to access the GCP Console. The Identity Management team already has a well-established way to manage your users and wants to keep using your existing Active Directory or LDAP server along with the existing SSO password.

What should you do?

Options:

A.

Manually synchronize the data in Google domain with your existing Active Directory or LDAP server.

B.

Use Google Cloud Directory Sync to synchronize the data in Google domain with your existing Active Directory or LDAP server.

C.

Users sign in directly to the GCP Console using the credentials from your on-premises Kerberos compliant identity provider.

D.

Users sign in using OpenID (OIDC) compatible IdP, receive an authentication token, then use that token to log in to the GCP Console.

Question 17

You need to use Cloud External Key Manager to create an encryption key to encrypt specific BigQuery data at rest in Google Cloud. Which steps should you do first?

Options:

A.

1. Create or use an existing key with a unique uniform resource identifier (URI) in your Google Cloud project.

2. Grant your Google Cloud project access to a supported external key management partner system.

B.

1. Create or use an existing key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS).

2. In Cloud KMS, grant your Google Cloud project access to use the key.

C.

1. Create or use an existing key with a unique uniform resource identifier (URI) in a supported external key management partner system.

2. In the external key management partner system, grant access for this key to use your Google Cloud project.

D.

1. Create an external key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS).

2. In Cloud KMS, grant your Google Cloud project access to use the key.

Question 18

Your company plans to move most of its IT infrastructure to Google Cloud. They want to leverage their existing on-premises Active Directory as an identity provider for Google Cloud. Which two steps should you take to integrate the company’s on-premises Active Directory with Google Cloud and configure access management? (Choose two.)

Options:

A.

Use Identity Platform to provision users and groups to Google Cloud.

B.

Use Cloud Identity SAML integration to provision users and groups to Google Cloud.

C.

Install Google Cloud Directory Sync and connect it to Active Directory and Cloud Identity.

D.

Create Identity and Access Management (IAM) roles with permissions corresponding to each Active Directory group.

E.

Create Identity and Access Management (IAM) groups with permissions corresponding to each Active Directory group.

Question 19

Your company is using Cloud Dataproc for its Spark and Hadoop jobs. You want to be able to create, rotate, and destroy symmetric encryption keys used for the persistent disks used by Cloud Dataproc. Keys can be stored in the cloud.

What should you do?

Options:

A.

Use the Cloud Key Management Service to manage the data encryption key (DEK).

B.

Use the Cloud Key Management Service to manage the key encryption key (KEK).

C.

Use customer-supplied encryption keys to manage the data encryption key (DEK).

D.

Use customer-supplied encryption keys to manage the key encryption key (KEK).

Question 20

What are the steps to encrypt data using envelope encryption?

Options:

A.

Generate a data encryption key (DEK) locally.

Use a key encryption key (KEK) to wrap the DEK. Encrypt data with the KEK.

Store the encrypted data and the wrapped KEK.

B.

Generate a key encryption key (KEK) locally.

Use the KEK to generate a data encryption key (DEK). Encrypt data with the DEK.

Store the encrypted data and the wrapped DEK.

C.

Generate a data encryption key (DEK) locally.

Encrypt data with the DEK.

Use a key encryption key (KEK) to wrap the DEK. Store the encrypted data and the wrapped DEK.

D.

Generate a key encryption key (KEK) locally.

Generate a data encryption key (DEK) locally. Encrypt data with the KEK.

Store the encrypted data and the wrapped DEK.
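
As an illustration of the DEK/KEK flow the options describe, the sketch below generates a DEK locally, encrypts the data with the DEK, wraps the DEK with a KEK held in Cloud KMS, and stores the ciphertext together with the wrapped DEK. Project, key ring, and key names are hypothetical.

```python
from cryptography.fernet import Fernet
from google.cloud import kms

# 1. Generate a data encryption key (DEK) locally.
dek = Fernet.generate_key()

# 2. Encrypt the data with the DEK.
plaintext = b"sensitive payload"
encrypted_data = Fernet(dek).encrypt(plaintext)

# 3. Wrap the DEK with a key encryption key (KEK) that never leaves Cloud KMS.
client = kms.KeyManagementServiceClient()
kek_name = client.crypto_key_path(
    "my-project", "us-central1", "my-key-ring", "my-kek"  # hypothetical names
)
wrapped_dek = client.encrypt(request={"name": kek_name, "plaintext": dek}).ciphertext

# 4. Store the encrypted data and the wrapped DEK; discard the plaintext DEK.
record = {"ciphertext": encrypted_data, "wrapped_dek": wrapped_dek}
```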

Question 21

Your company's users access data in a BigQuery table. You want to ensure they can only access the data during working hours.

What should you do?

Options:

A.

Assign a BigQuery Data Viewer role along with an IAM condition that limits the access to specified working hours.

B.

Configure Cloud Scheduler so that it triggers a Cloud Functions instance that modifies the organizational policy constraints for BigQuery during the specified working hours.

C.

Assign a BigQuery Data Viewer role to a service account that adds and removes the users daily during the specified working hours.

D.

Run a gsutil script that assigns a BigQuery Data Viewer role, and remove it only during the specified working hours.
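
Option A pairs the BigQuery Data Viewer role with an IAM condition. Conditions are written in Common Expression Language; the sketch below shows what such a binding could look like in a policy (the group name, time zone, and hours are assumptions):

```python
# Hypothetical IAM policy binding restricting access to 09:00-17:00, Monday-Friday.
working_hours_binding = {
    "role": "roles/bigquery.dataViewer",
    "members": ["group:analysts@example.com"],
    "condition": {
        "title": "working-hours-only",
        "description": "Allow access only during business hours",
        "expression": (
            'request.time.getHours("America/New_York") >= 9 && '
            'request.time.getHours("America/New_York") < 17 && '
            'request.time.getDayOfWeek("America/New_York") >= 1 && '
            'request.time.getDayOfWeek("America/New_York") <= 5'
        ),
    },
}
```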

Question 22

A company is deploying their application on Google Cloud Platform. Company policy requires long-term data to be stored using a solution that can automatically replicate data over at least two geographic places.

Which Storage solution are they allowed to use?

Options:

A.

Cloud Bigtable

B.

Cloud BigQuery

C.

Compute Engine SSD Disk

D.

Compute Engine Persistent Disk

Question 23

You are the Security Admin in your company. You want to synchronize all security groups that have an email address from your LDAP directory in Cloud IAM.

What should you do?

Options:

A.

Configure Google Cloud Directory Sync to sync security groups using LDAP search rules that have “user email address” as the attribute to facilitate one-way sync.

B.

Configure Google Cloud Directory Sync to sync security groups using LDAP search rules that have “user email address” as the attribute to facilitate bidirectional sync.

C.

Use a management tool to sync the subset based on the email address attribute. Create a group in the Google domain. A group created in a Google domain will automatically have an explicit Google Cloud Identity and Access Management (IAM) role.

D.

Use a management tool to sync the subset based on group object class attribute. Create a group in the Google domain. A group created in a Google domain will automatically have an explicit Google Cloud Identity and Access Management (IAM) role.

Question 24

When creating a secure container image, which two items should you incorporate into the build if possible? (Choose two.)

Options:

A.

Ensure that the app does not run as PID 1.

B.

Package a single app as a container.

C.

Remove any unnecessary tools not needed by the app.

D.

Use public container images as a base image for the app.

E.

Use many container image layers to hide sensitive information.

Question 25

A security audit uncovered several inconsistencies in your project's Identity and Access Management (IAM) configuration. Some service accounts have overly permissive roles, and a few external collaborators have more access than necessary. You need to gain detailed visibility into changes to IAM policies, user activity, service account behavior, and access to sensitive projects. What should you do?

Options:

A.

Enable Metrics Explorer in Cloud Monitoring to follow the service account authentication events, and build alerts based on them.

B.

Use Cloud Audit Logs. Create log export sinks to send these logs to a security information and event management (SIEM) solution for correlation with other event sources.​

C.

Configure Google Cloud Functions to be triggered by changes to IAM policies. Analyze changes by using the policy simulator, send alerts upon risky modifications, and store event details.​

D.

Deploy the OS Config Management agent to your VMs. Use OS Config Management to create patch management jobs and monitor system modifications.​

Question 26

You are creating an internal App Engine application that needs to access a user’s Google Drive on the user’s behalf. Your company does not want to rely on the current user’s credentials. It also wants to follow Google- recommended practices.

What should you do?

Options:

A.

Create a new Service account, and give all application users the role of Service Account User.

B.

Create a new Service account, and add all application users to a Google Group. Give this group the role of Service Account User.

C.

Use a dedicated G Suite Admin account, and authenticate the application’s operations with these G Suite credentials.

D.

Create a new service account, and grant it G Suite domain-wide delegation. Have the application use it to impersonate the user.
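
Option D relies on G Suite domain-wide delegation so the service account can impersonate a user when calling the Drive API. A minimal sketch of that pattern; the key file path, scope, and impersonated user are assumptions.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]

# Service account that has been granted domain-wide delegation for the scope above.
base_credentials = service_account.Credentials.from_service_account_file(
    "sa-key.json", scopes=SCOPES  # hypothetical key file
)

# Impersonate the end user whose Drive the application needs to access.
delegated_credentials = base_credentials.with_subject("alice@example.com")

drive = build("drive", "v3", credentials=delegated_credentials)
files = drive.files().list(pageSize=10, fields="files(id, name)").execute()
print(files.get("files", []))
```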

Question 27

You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides.

What should you do?

Options:

A.

Enable Access Transparency Logging.

B.

Deploy resources only to regions permitted by data residency requirements.

C.

Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.

D.

Deploy Assured Workloads.

Question 28

The security operations team needs access to the security-related logs for all projects in their organization. They have the following requirements:

Follow the least privilege model by having only view access to logs.

Have access to Admin Activity logs.

Have access to Data Access logs.

Have access to Access Transparency logs.

Which Identity and Access Management (IAM) role should the security operations team be granted?

Options:

A.

roles/logging.privateLogViewer

B.

roles/logging.admin

C.

roles/viewer

D.

roles/logging.viewer

Question 29

Users are reporting an outage on your public-facing application that is hosted on Compute Engine. You suspect that a recent change to your firewall rules is responsible. You need to test whether your firewall rules are working properly. What should you do?

Options:

A.

Enable Firewall Rules Logging on the latest rules that were changed. Use Logs Explorer to analyze whether the rules are working correctly.

B.

Connect to a bastion host in your VPC. Use a network traffic analyzer to determine at which point your requests are being blocked.

C.

In a pre-production environment, disable all firewall rules individually to determine which one is blocking user traffic.

D.

Enable VPC Flow Logs in your VPC. Use Logs Explorer to analyze whether the rules are working correctly.

Question 30

You are working with a client that is concerned about control of their encryption keys for sensitive data. The client does not want to store encryption keys at rest in the same cloud service provider (CSP) as the data that the keys are encrypting. Which Google Cloud encryption solutions should you recommend to this client? (Choose two.)

Options:

A.

Customer-supplied encryption keys.

B.

Google default encryption

C.

Secret Manager

D.

Cloud External Key Manager

E.

Customer-managed encryption keys

Question 31

Your team wants to centrally manage GCP IAM permissions from their on-premises Active Directory Service. Your team wants to manage permissions by AD group membership.

What should your team do to meet these requirements?

Options:

A.

Set up Cloud Directory Sync to sync groups, and set IAM permissions on the groups.

B.

Set up SAML 2.0 Single Sign-On (SSO), and assign IAM permissions to the groups.

C.

Use the Cloud Identity and Access Management API to create groups and IAM permissions from Active Directory.

D.

Use the Admin SDK to create groups and assign IAM permissions from Active Directory.

Question 32

Your organization wants to be continuously evaluated against CIS Google Cloud Computing Foundations Benchmark v1.3.0 (CIS Google Cloud Foundation 1.3). Some of the controls are irrelevant to your organization and must be disregarded in evaluation. You need to create an automated system or process to ensure that only the relevant controls are evaluated.

What should you do?

Options:

A.

Mark all security findings that are irrelevant with a tag and a value that indicates a security exception. Select all marked findings and mute them on the console every time they appear. Activate Security Command Center (SCC) Premium.

B.

Activate Security Command Center (SCC) Premium. Create a rule to mute the security findings in SCC so they are not evaluated.

C.

Download all findings from Security Command Center (SCC) to a CSV file. Mark the findings that are part of CIS Google Cloud Foundation 1.3 in the file. Ignore the entries that are irrelevant and out of scope for the company.

D.

Ask an external audit company to provide independent reports including needed CIS benchmarks. In the scope of the audit, clarify that some of the controls are not needed and must be disregarded.

Question 33

You want to limit the images that can be used as the source for boot disks. These images will be stored in a dedicated project.

What should you do?

Options:

A.

Use the Organization Policy Service to create a compute.trustedImageProjects constraint on the organization level. List the trusted project as the whitelist in an allow operation.

B.

Use the Organization Policy Service to create a compute.trustedImageProjects constraint on the organization level. List the trusted projects as the exceptions in a deny operation.

C.

In Resource Manager, edit the project permissions for the trusted project. Add the organization as member with the role: Compute Image User.

D.

In Resource Manager, edit the organization permissions. Add the project ID as member with the role: Compute Image User.
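
Options A and B both revolve around the constraints/compute.trustedImageProjects list constraint. The sketch below shows one way such a policy could be set with the Resource Manager v1 API; the organization ID and image project are hypothetical, and the same policy can also be managed with gcloud or the newer Org Policy v2 API.

```python
from googleapiclient.discovery import build

crm = build("cloudresourcemanager", "v1")

policy = {
    "policy": {
        "constraint": "constraints/compute.trustedImageProjects",
        "listPolicy": {
            # Only boot disk images from this project may be used.
            "allowedValues": ["projects/trusted-images-project"]  # hypothetical
        },
    }
}

crm.organizations().setOrgPolicy(
    resource="organizations/123456789012",  # hypothetical organization ID
    body=policy,
).execute()
```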

Question 34

You perform a security assessment on a customer architecture and discover that multiple VMs have public IP addresses. After providing a recommendation to remove the public IP addresses, you are told those VMs need to communicate to external sites as part of the customer's typical operations. What should you recommend to reduce the need for public IP addresses in your customer's VMs?

Options:

A.

Google Cloud Armor

B.

Cloud NAT

C.

Cloud Router

D.

Cloud VPN

Question 35

You are onboarding new users into Cloud Identity and discover that some users have created consumer user accounts using the corporate domain name. How should you manage these consumer user accounts with Cloud Identity?

Options:

A.

Use Google Cloud Directory Sync to convert the unmanaged user accounts.

B.

Create a new managed user account for each consumer user account.

C.

Use the transfer tool for unmanaged user accounts.

D.

Configure single sign-on using a customer's third-party provider.

Question 36

Your organization develops software involved in many open source projects and is concerned about software supply chain threats. You need to deliver provenance for the build to demonstrate the software is untampered.

What should you do?

Options:

A.

• 1. Generate Supply Chain Levels for Software Artifacts (SLSA) level 3 assurance by using Cloud Build.

• 2. View the build provenance in the Security insights side panel within the Google Cloud console.

B.

• 1. Review the software process.

• 2. Generate private and public key pairs and use Pretty Good Privacy (PGP) protocols to sign the output software artifacts together with a file containing the address of your enterprise and point of contact.

• 3. Publish the PGP signed attestation to your public web page.

C.

• 1. Publish the software code on GitHub as open source.

• 2. Establish a bug bounty program, and encourage the open source community to review, report, and fix the vulnerabilities.

D.

• 1. Hire an external auditor to review and provide provenance.

• 2. Define the scope and conditions.

• 3. Get support from the Security department or representative.

• 4. Publish the attestation to your public web page.

Question 37

A database administrator notices malicious activities within their Cloud SQL instance. The database administrator wants to monitor the API calls that read the configuration or metadata of resources. Which logs should the database administrator review?

Options:

A.

Admin Activity

B.

System Event

C.

Access Transparency

D.

Data Access

Question 38

Your organization’s record data exists in Cloud Storage. You must retain all record data for at least seven years. This policy must be permanent.

What should you do?

Options:

A.

• 1. Identify buckets with record data.

• 2. Apply a retention policy and set it to retain for seven years.

• 3. Monitor the bucket by using log-based alerts to ensure that no modifications to the retention policy occur.

B.

• 1. Identify buckets with record data.

• 2. Apply a retention policy and set it to retain for seven years.

• 3. Remove any Identity and Access Management (IAM) roles that contain the storage.buckets.update permission.

C.

• 1. Identify buckets with record data.

• 2. Enable the bucket policy only setting to ensure that data is retained.

• 3. Enable bucket lock.

D.

• 1. Identify buckets with record data.

• 2. Apply a retention policy and set it to retain for seven years.

• 3. Enable bucket lock.

Question 39

A customer terminates an engineer and needs to make sure the engineer's Google account is automatically deprovisioned.

What should the customer do?

Options:

A.

Use the Cloud SDK with their directory service to remove their IAM permissions in Cloud Identity.

B.

Use the Cloud SDK with their directory service to provision and deprovision users from Cloud Identity.

C.

Configure Cloud Directory Sync with their directory service to provision and deprovision users from Cloud Identity.

D.

Configure Cloud Directory Sync with their directory service to remove their IAM permissions in Cloud Identity.

Question 40

An organization’s typical network and security review consists of analyzing application transit routes, request handling, and firewall rules. They want to enable their developer teams to deploy new applications without the overhead of this full review.

How should you advise this organization?

Options:

A.

Use Forseti with Firewall filters to catch any unwanted configurations in production.

B.

Mandate use of infrastructure as code and provide static analysis in the CI/CD pipelines to enforce policies.

C.

Route all VPC traffic through customer-managed routers to detect malicious patterns in production.

D.

All production applications will run on-premises. Allow developers free rein in GCP as their dev and QA platforms.

Question 41

You plan to use a Google Cloud Armor policy to prevent common attacks such as cross-site scripting (XSS) and SQL injection (SQLi) from reaching your web application's backend. What are two requirements for using Google Cloud Armor security policies? (Choose two.)

Options:

A.

The load balancer must be an external SSL proxy load balancer.

B.

Google Cloud Armor Policy rules can only match on Layer 7 (L7) attributes.

C.

The load balancer must use the Premium Network Service Tier.

D.

The backend service's load balancing scheme must be EXTERNAL.

E.

The load balancer must be an external HTTP(S) load balancer.

Question 42

Which international compliance standard provides guidelines for information security controls applicable to the provision and use of cloud services?

Options:

A.

ISO 27001

B.

ISO 27002

C.

ISO 27017

D.

ISO 27018

Question 43

You need to enable VPC Service Controls and allow changes to perimeters in existing environments without preventing access to resources. Which VPC Service Controls mode should you use?

Options:

A.

Cloud Run

B.

Native

C.

Enforced

D.

Dry run

Question 44

In a shared security responsibility model for IaaS, which two layers of the stack does the customer share responsibility for? (Choose two.)

Options:

A.

Hardware

B.

Network Security

C.

Storage Encryption

D.

Access Policies

E.

Boot

Question 45

You have the following resource hierarchy. There is an organization policy at each node in the hierarchy as shown. Which load balancer types are denied in VPC A?

Options:

A.

All load balancer types are denied in accordance with the global node’s policy.

B.

INTERNAL_TCP_UDP and INTERNAL_HTTP_HTTPS are denied in accordance with the folder’s policy.

C.

EXTERNAL_TCP_PROXY and EXTERNAL_SSL_PROXY are denied in accordance with the project’s policy.

D.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY, INTERNAL_TCP_UDP, and INTERNAL_HTTP_HTTPS are denied in accordance with the folder and project’s policies.

Question 46

Your organization wants full control of the keys used to encrypt data at rest in their Google Cloud environments. Keys must be generated and stored outside of Google and integrate with many Google Services including BigQuery.

What should you do?

Options:

A.

Create a Cloud Key Management Service (KMS) key with imported key material. Wrap the key for protection during import. Import the key generated on a trusted system into Cloud KMS.

B.

Create a KMS key that is stored on a Google-managed FIPS 140-2 Level 3 Hardware Security Module (HSM). Manage the Identity and Access Management (IAM) permissions settings, and set up the key rotation period.

C.

Use Cloud External Key Management (EKM) that integrates with an external Hardware Security Module (HSM) system from supported vendors.

D.

Use customer-supplied encryption keys (CSEK) with keys generated on trusted external systems. Provide the raw CSEK as part of the API call.

Question 47

You are auditing all your Google Cloud resources in the production project. You want to identify all principals who can change firewall rules.

What should you do?

Options:

A.

Use Policy Analyzer to query the permissions compute.firewalls.create or compute.firewalls.update or compute.firewalls.delete.

B.

Reference the Security Health Analytics - Firewall Vulnerability Findings in the Security Command Center.

C.

Use Policy Analyzer to query the permissions compute.firewalls.get or compute.firewalls.list.

D.

Use Firewall Insights to understand your firewall rules usage patterns.

Question 48

You are tasked with exporting and auditing security logs for login activity events for Google Cloud console and API calls that modify configurations to Google Cloud resources. Your export must meet the following requirements:

Export related logs for all projects in the Google Cloud organization.

Export logs in near real-time to an external SIEM.

What should you do? (Choose two.)

Options:

A.

Create a Log Sink at the organization level with a Pub/Sub destination.

B.

Create a Log Sink at the organization level with the includeChildren parameter, and set the destination to a Pub/Sub topic.

C.

Enable Data Access audit logs at the organization level to apply to all projects.

D.

Enable Google Workspace audit logs to be shared with Google Cloud in the Admin Console.

E.

Ensure that the SIEM processes the AuthenticationInfo field in the audit log entry to gather identity information.
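
Options A and B describe an organization-level log sink with a Pub/Sub destination; B adds the includeChildren parameter so logs from every child project are exported. A hedged sketch using the Logging v2 API follows; the organization ID, topic, and filter are assumptions.

```python
from googleapiclient.discovery import build

logging_api = build("logging", "v2")

sink_body = {
    "name": "siem-export",
    "destination": "pubsub.googleapis.com/projects/my-project/topics/siem-logs",
    # Cloud Audit Logs (Admin Activity, Data Access, etc.) across the organization.
    "filter": 'logName:"cloudaudit.googleapis.com"',
    "includeChildren": True,  # export logs from all projects under the organization
}

sink = (
    logging_api.organizations()
    .sinks()
    .create(
        parent="organizations/123456789012",  # hypothetical organization ID
        body=sink_body,
        uniqueWriterIdentity=True,
    )
    .execute()
)
# Grant sink["writerIdentity"] the Pub/Sub Publisher role on the destination topic.
print(sink["writerIdentity"])
```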

Question 49

You must ensure that the keys used for at-rest encryption of your data are compliant with your organization's security controls. One security control mandates that keys get rotated every 90 days. You must implement an effective detection strategy to validate if keys are rotated as required. What should you do?​

Options:

A.

Analyze the crypto key versions of the keys by using data from Cloud Asset Inventory. If an active key is older than 90 days, send an alert message through your incident notification channel.​

B.

Identify keys that have not been rotated by using Security Health Analytics. If a key is not rotated after 90 days, a finding in Security Command Center is raised.​

C.

Assess the keys in the Cloud Key Management Service by implementing code in Cloud Run. If a key is not rotated after 90 days, raise a finding in Security Command Center.​

D.

Define a metric that checks for timely key updates by using Cloud Logging. If a key is not rotated after 90 days, send an alert message through your incident notification channel.​
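
Whichever detection option is chosen, the underlying check is whether the newest key version is older than 90 days. The sketch below walks Cloud KMS directly to illustrate that check; the project and location are hypothetical, and a production detector would more likely consume Cloud Asset Inventory data or Security Health Analytics findings as the options describe.

```python
from datetime import datetime, timedelta, timezone
from google.cloud import kms

client = kms.KeyManagementServiceClient()
parent = "projects/my-project/locations/us-central1"  # hypothetical

max_age = timedelta(days=90)
now = datetime.now(timezone.utc)

for key_ring in client.list_key_rings(request={"parent": parent}):
    for key in client.list_crypto_keys(request={"parent": key_ring.name}):
        if not key.primary.name:
            continue  # keys without a primary version (e.g., asymmetric) are skipped
        age = now - key.primary.create_time
        if age > max_age:
            # Hook this into your incident notification channel.
            print(f"STALE KEY: {key.name} primary version is {age.days} days old")
```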

Question 50

How should a customer reliably deliver Stackdriver logs from GCP to their on-premises SIEM system?

Options:

A.

Send all logs to the SIEM system via an existing protocol such as syslog.

B.

Configure every project to export all their logs to a common BigQuery DataSet, which will be queried by the SIEM system.

C.

Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub Topic, which will be sent to the SIEM via Dataflow.

D.

Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.

Question 51

Your Google Cloud organization allows for administrative capabilities to be distributed to each team through provision of a Google Cloud project with the Owner role (roles/owner). The organization contains thousands of Google Cloud projects. Security Command Center Premium has surfaced multiple open_mysql_port findings. You are enforcing the guardrails and need to prevent these types of common misconfigurations.

What should you do?

Options:

A.

Create a firewall rule for each virtual private cloud (VPC) to deny traffic from 0.0.0.0/0 with priority 0.

B.

Create a hierarchical firewall policy configured at the organization level to deny all connections from 0.0.0.0/0.

C.

Create a Google Cloud Armor security policy to deny traffic from 0.0.0.0/0.

D.

Create a hierarchical firewall policy configured at the organization level to allow connections only from internal IP ranges.

Question 52

A customer is running an analytics workload on Google Cloud Platform (GCP) where Compute Engine instances are accessing data stored on Cloud Storage. Your team wants to make sure that this workload will not be able to access, or be accessed from, the internet.

Which two strategies should your team use to meet these requirements? (Choose two.)

Options:

A.

Configure Private Google Access on the Compute Engine subnet.

B.

Avoid assigning public IP addresses to the Compute Engine cluster.

C.

Make sure that the Compute Engine cluster is running on a separate subnet.

D.

Turn off IP forwarding on the Compute Engine instances in the cluster.

E.

Configure a Cloud NAT gateway.

Question 53

You work for a large organization where each business unit has thousands of users. You need to delegate management of access control permissions to each business unit. You have the following requirements:

Each business unit manages access controls for their own projects.

Each business unit manages access control permissions at scale.

Business units cannot access other business units' projects.

Users lose their access if they move to a different business unit or leave the company.

Users and access control permissions are managed by the on-premises directory service.

What should you do? (Choose two.)

Options:

A.

Use VPC Service Controls to create perimeters around each business unit's project.

B.

Organize projects in folders, and assign permissions to Google groups at the folder level.

C.

Group business units based on Organization Units (OUs) and manage permissions based on OUs.

D.

Create a project naming convention, and use Google's IAM Conditions to manage access based on the prefix of project names.

E.

Use Google Cloud Directory Sync to synchronize users and group memberships in Cloud Identity.

Question 54

A company is backing up application logs to a Cloud Storage bucket shared with both analysts and the administrator. Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator.

What should you do?

Options:

A.

Use Cloud Pub/Sub and Cloud Functions to trigger a Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move the file into a Cloud Storage bucket that is only accessible by the administrator.

B.

Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.

C.

On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.

D.

On the bucket shared with both the analysts and the administrator, configure a Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Cloud Functions to capture the trigger and delete such files.
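
Option A describes a Cloud Function, triggered on upload, that scans the new object with the Cloud Data Loss Prevention API and moves it to the administrator-only bucket when PII is found. A compact sketch of that flow; the bucket name, info types, and the assumption that logs are plain text are hypothetical.

```python
from google.cloud import dlp_v2, storage

dlp = dlp_v2.DlpServiceClient()
gcs = storage.Client()

ADMIN_BUCKET = "logs-with-pii"  # hypothetical admin-only bucket

def scan_uploaded_log(event, context):
    """Cloud Storage-triggered function: move log files containing PII."""
    shared_bucket = gcs.bucket(event["bucket"])
    blob = shared_bucket.blob(event["name"])
    text = blob.download_as_bytes().decode("utf-8", errors="replace")

    response = dlp.inspect_content(
        request={
            "parent": f"projects/{gcs.project}",
            "inspect_config": {
                "info_types": [
                    {"name": "EMAIL_ADDRESS"},
                    {"name": "PHONE_NUMBER"},
                    {"name": "US_SOCIAL_SECURITY_NUMBER"},
                ]
            },
            "item": {"value": text},
        }
    )

    if response.result.findings:
        # PII found: copy to the restricted bucket, then remove from the shared one.
        shared_bucket.copy_blob(blob, gcs.bucket(ADMIN_BUCKET))
        blob.delete()
```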

Question 55

A large e-retailer is moving to Google Cloud Platform with its ecommerce website. The company wants to ensure payment information is encrypted between the customer’s browser and GCP when customers check out online.

What should they do?

Options:

A.

Configure an SSL Certificate on an L7 Load Balancer and require encryption.

B.

Configure an SSL Certificate on a Network TCP Load Balancer and require encryption.

C.

Configure the firewall to allow inbound traffic on port 443, and block all other inbound traffic.

D.

Configure the firewall to allow outbound traffic on port 443, and block all other outbound traffic.

Question 56

You are responsible for the operation of your company's application that runs on Google Cloud. The database for the application will be maintained by an external partner. You need to give the partner team access to the database. This access must be restricted solely to the database and cannot extend to any other resources within your company's network. Your solution should follow Google-recommended practices. What should you do?

Options:

A.

Add a public IP address to the application's database. Create database users for each of the partner's employees. Securely distribute the credentials for these users to the partner team.

B.

Create accounts for the partner team in your corporate identity provider. Synchronize these accounts with Google Cloud Identity. Grant the accounts access to the database.

C.

Ask the partner team to set up Cloud Identity accounts within their own corporate environment and identity provider. Grant the partner’s Cloud Identity accounts access to the database.

D.

Configure Workforce Identity Federation for the partner. Connect the identity pool provider to the partner's identity provider. Grant the workforce pool resources access to the database.

Question 57

Your company has been creating users manually in Cloud Identity to provide access to Google Cloud resources. Due to continued growth of the environment, you want to authorize the Google Cloud Directory Sync (GCDS) instance and integrate it with your on-premises LDAP server to onboard hundreds of users. You are required to:

Replicate user and group lifecycle changes from the on-premises LDAP server in Cloud Identity.

Disable any manually created users in Cloud Identity.

You have already configured the LDAP search attributes to include the users and security groups in scope for Google Cloud. What should you do next to complete this solution?

Options:

A.

1. Configure the option to suspend domain users not found in LDAP.

2. Set up a recurring GCDS task.

B.

1. Configure the option to delete domain users not found in LDAP.

2. Run GCDS after user and group lifecycle changes.

C.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP.

2. Set up a recurring GCDS task.

D.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP.

2. Run GCDS after user and group lifecycle changes.

Question 58

You are working with protected health information (PHI) for an electronic health record system. The privacy officer is concerned that sensitive data is stored in the analytics system. You are tasked with anonymizing the sensitive data in a way that is not reversible. Also, the anonymized data should not preserve the character set and length. Which Google Cloud solution should you use?

Options:

A.

Cloud Data Loss Prevention with deterministic encryption using AES-SIV

B.

Cloud Data Loss Prevention with format-preserving encryption

C.

Cloud Data Loss Prevention with cryptographic hashing

D.

Cloud Data Loss Prevention with Cloud Key Management Service wrapped cryptographic keys
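
Option C, cryptographic hashing, is the transformation that is one-way and does not preserve the input’s character set or length. A hedged sketch of what that de-identification call could look like; the project, transient key name, and sample value are assumptions.

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()

response = dlp.deidentify_content(
    request={
        "parent": "projects/my-project",  # hypothetical project
        "inspect_config": {"info_types": [{"name": "US_SOCIAL_SECURITY_NUMBER"}]},
        "deidentify_config": {
            "info_type_transformations": {
                "transformations": [
                    {
                        "primitive_transformation": {
                            "crypto_hash_config": {
                                "crypto_key": {"transient": {"name": "phi-hash-key"}}
                            }
                        }
                    }
                ]
            }
        },
        "item": {"value": "Patient SSN: 123-45-6789"},
    }
)
# The SSN is replaced with an irreversible hash of different length and character set.
print(response.item.value)
```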

Question 59

You work for a healthcare provider that is expanding into the cloud to store and process sensitive patient data. You must ensure the chosen Google Cloud configuration meets these strict regulatory requirements:​

    Data must reside within specific geographic regions.​

    Certain administrative actions on patient data require explicit approval from designated compliance officers.​

    Access to patient data must be auditable.​

What should you do?

Options:

A.

Select multiple standard Google Cloud regions for high availability. Implement Access Control Lists (ACLs) on individual storage objects containing patient data. Enable Cloud Audit Logs.​

B.

Deploy an Assured Workloads environment in multiple regions for redundancy. Utilize custom IAM roles with granular permissions. Isolate network-level data by using VPC Service Controls.​

C.

Deploy an Assured Workloads environment in an approved region. Configure Access Approval for sensitive operations on patient data. Enable both Cloud Audit Logs and Access Transparency.​

D.

Select a standard Google Cloud region. Restrict access to patient data based on user location and job function by using Access Context Manager. Enable both Cloud Audit Logging and Access Transparency.​

Question 60

A company has redundant mail servers in different Google Cloud Platform regions and wants to route customers to the nearest mail server based on location.

How should the company accomplish this?

Options:

A.

Configure TCP Proxy Load Balancing as a global load balancing service listening on port 995.

B.

Create a Network Load Balancer to listen on TCP port 995 with a forwarding rule to forward traffic based on location.

C.

Use Cross-Region Load Balancing with an HTTP(S) load balancer to route traffic to the nearest region.

D.

Use Cloud CDN to route the mail traffic to the closest origin mail server based on client IP address.

Question 61

Your security team wants to reduce the risk of user-managed keys being mismanaged and compromised. To achieve this, you need to prevent developers from creating user-managed service account keys for projects in their organization. How should you enforce this?

Options:

A.

Configure Secret Manager to manage service account keys.

B.

Enable an organization policy to disable service accounts from being created.

C.

Enable an organization policy to prevent service account keys from being created.

D.

Remove the iam.serviceAccounts.getAccessToken permission from users.

Question 62

You are setting up a CI/CD pipeline to deploy containerized applications to your production clusters on Google Kubernetes Engine (GKE). You need to prevent containers with known vulnerabilities from being deployed. You have the following requirements for your solution:

Must be cloud-native

Must be cost-efficient

Minimize operational overhead

How should you accomplish this? (Choose two.)

Options:

A.

Create a Cloud Build pipeline that will monitor changes to your container templates in a Cloud Source Repositories repository. Add a step to analyze Container Analysis results before allowing the build to continue.

B.

Use a Cloud Function triggered by log events in Google Cloud's operations suite to automatically scan your container images in Container Registry.

C.

Use a cron job on a Compute Engine instance to scan your existing repositories for known vulnerabilities and raise an alert if a non-compliant container image is found.

D.

Deploy Jenkins on GKE and configure a CI/CD pipeline to deploy your containers to Container Registry. Add a step to validate your container images before deploying your container to the cluster.

E.

In your CI/CD pipeline, add an attestation on your container image when no vulnerabilities have been found. Use a Binary Authorization policy to block deployments of containers with no attestation in your cluster.

Question 63

In order to meet PCI DSS requirements, a customer wants to ensure that all outbound traffic is authorized.

Which two cloud offerings meet this requirement without additional compensating controls? (Choose two.)

Options:

A.

App Engine

B.

Cloud Functions

C.

Compute Engine

D.

Google Kubernetes Engine

E.

Cloud Storage

Question 64

Your organization hosts a financial services application running on Compute Engine instances for a third-party company. The third-party company’s servers that will consume the application also run on Compute Engine in a separate Google Cloud organization. You need to configure a secure network connection between the Compute Engine instances. You have the following requirements:

    The network connection must be encrypted.

    The communication between servers must be over private IP addresses.

What should you do?

Options:

A.

Configure a Cloud VPN connection between your organization's VPC network and the third party's that is controlled by VPC firewall rules.

B.

Configure a VPC peering connection between your organization's VPC network and the third party's that is controlled by VPC firewall rules.

C.

Configure a VPC Service Controls perimeter around your Compute Engine instances, and provide access to the third party via an access level.

D.

Configure an Apigee proxy that exposes your Compute Engine-hosted application as an API, and is encrypted with TLS which allows access only to the third party.

Question 65

An administrative application is running on a virtual machine (VM) in a managed instance group at port 5601 inside a Virtual Private Cloud (VPC) network that currently has no access to the internet. You want to expose the web interface at port 5601 to users and enforce authentication and authorization with Google credentials.

What should you do?

Options:

A.

Modify the VPC routing so that the default route points to the default internet gateway. Modify the VPC firewall rule to allow access from the internet (0.0.0.0/0) to port 5601 on the application instance.

B.

Configure a bastion host with OS Login enabled and allow connections to port 5601 in the VPC firewall. Log in to the bastion host from the Google Cloud console by using SSH-in-browser, and then connect to the web application.

C.

Configure an HTTP Load Balancing instance that points to the managed instance group, with Identity-Aware Proxy (IAP) protection using Google credentials. Modify the VPC firewall to allow access from the IAP network range.

D.

Configure a Secure Shell (SSH) bastion host in a public network, and allow only the bastion host to connect to the application on port 5601. Use the bastion host as a jump host to connect to the application.

Question 66

Your organization uses a microservices architecture based on Google Kubernetes Engine (GKE). Security reviews recommend tighter controls around deployed container images to reduce potential vulnerabilities and maintain compliance. You need to implement an automated system by using managed services to ensure that only approved container images are deployed to the GKE clusters. What should you do?

Options:

A.

Enforce Binary Authorization in your GKE clusters. Integrate container image vulnerability scanning into the CI/CD pipeline and require vulnerability scan results to be used for Binary Authorization policy decisions.​

B.

Develop custom organization policies that restrict GKE cluster deployments to container images hosted within a specific Artifact Registry project where your approved images reside.​

C.

Build a system using third-party vulnerability databases and custom scripts to identify potential Common Vulnerabilities and Exposures (CVEs) in your container images. Prevent image deployment if the CVE impact score is beyond a specified threshold.​

D.

Automatically deploy new container images upon successful CI/CD builds by using Cloud Build triggers. Set up firewall rules to limit and control access to instances to mitigate malware injection.​

Question 67

Your organization wants to protect all workloads that run on Compute Engine VMs to ensure that the instances weren’t compromised by boot-level or kernel-level malware. Also, you need to ensure that data in use on the VM cannot be read by the underlying host system by using a hardware-based solution.

What should you do?

Options:

A.

• 1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring.

• 2. Create a Cloud Run function to check the VM settings, generate metrics, and run the function regularly.

B.

• 1. Activate Virtual Machine Threat Detection in Security Command Center (SCC) Premium.

• 2. Monitor the findings in SCC.

C.

• 1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring.

• 2. Activate Confidential Computing.

• 3. Enforce these actions by using organization policies.

D.

• 1. Use secure hardened images from the Google Cloud Marketplace.

• 2. When deploying the images, activate the Confidential Computing option.

• 3. Enforce the use of the correct images and Confidential Computing by using organization policies.

Question 68

You manage your organization's Security Operations Center (SOC). You currently monitor and detect network traffic anomalies in your Google Cloud VPCs based on packet header information. However, you want the capability to explore network flows and their payload to aid investigations. Which Google Cloud product should you use?

Options:

A.

Marketplace IDS

B.

VPC Flow Logs

C.

VPC Service Controls logs

D.

Packet Mirroring

E.

Google Cloud Armor Deep Packet Inspection

Question 69

Your organization is using Active Directory and wants to configure Security Assertion Markup Language (SAML). You must set up and enforce single sign-on (SSO) for all users.

What should you do?

Options:

A.

• 1. Manage SAML profile assignments.

• 2. Enable OpenID Connect (OIDC) in your Active Directory (AD) tenant.

• 3. Verify the domain.

B.

• 1. Create a new SAML profile.

• 2. Upload the X.509 certificate.

• 3. Enable the change password URL.

• 4. Configure Entity ID and ACS URL in your IdP.

C.

• 1. Create a new SAML profile.

• 2. Populate the sign-in and sign-out page URLs.

• 3. Upload the X.509 certificate.

• 4. Configure Entity ID and ACS URL in your IdP.

D.

• 1. Configure prerequisites for OpenID Connect (OIDC) in your Active Directory (AD) tenant.

• 2. Verify the AD domain.

• 3. Decide which users should use SAML.

• 4. Assign the pre-configured profile to the select organizational units (OUs) and groups.

Question 70

A customer needs to launch a 3-tier internal web application on Google Cloud Platform (GCP). The customer’s internal compliance requirements dictate that end-user access may only be allowed if the traffic seems to originate from a specific known good CIDR. The customer accepts the risk that their application will only have SYN flood DDoS protection. They want to use GCP’s native SYN flood protection.

Which product should be used to meet these requirements?

Options:

A.

Cloud Armor

B.

VPC Firewall Rules

C.

Cloud Identity and Access Management

D.

Cloud CDN

Question 71

Your organization's application is being integrated with a partner application that requires read access to customer data to process customer orders. The customer data is stored in one of your Cloud Storage buckets. You have evaluated different options and determined that this activity requires the use of service account keys. You must advise the partner on how to minimize the risk of a compromised service account key causing a loss of data. What should you advise the partner to do?

Options:

A.

Define a VPC Service Controls perimeter, and restrict the Cloud Storage API. Add an ingress rule to the perimeter to allow access to the Cloud Storage API for the service account from outside of the perimeter.​

B.

Scan the Cloud Storage bucket with Sensitive Data Protection when new data is added, and automatically mask all customer data.​

C.

Ensure that all data for the application that is accessed through the relevant service accounts is encrypted at rest by using customer-managed encryption keys (CMEK).​

D.

Implement a secret management service. Configure the service to frequently rotate the service account key. Configure proper access control to the key, and restrict who can create service account keys.​

Question 72

Applications often require access to “secrets” - small pieces of sensitive data at build or run time. The administrator managing these secrets on GCP wants to keep a track of “who did what, where, and when?” within their GCP projects.

Which two log streams would provide the information that the administrator is looking for? (Choose two.)

Options:

A.

Admin Activity logs

B.

System Event logs

C.

Data Access logs

D.

VPC Flow logs

E.

Agent logs

Question 73

A centralized security service has been implemented by your company. All applications running in Google Cloud are required to send data to this service. You need to ensure that developers have high autonomy to configure firewall rules within their projects, while preventing accidental blockage of access to the central security service. What should you do?

Options:

A.

Deploy a central Secure Web Proxy and connect it to all VPC networks. Create a Secure Web Proxy policy to allow traffic to the central security service.

B.

Implement a hierarchical firewall policy that prioritizes the central security service by allowing its connections and directing all other traffic to the subsequent firewall level.

C.

Create a central project to manage Shared VPC networks which will be accessible to all other projects. Administer all firewall rules centrally within this project.

D.

Use Terraform to automate the creation of the required firewall rule in all projects. Restrict rule change permissions solely to the Terraform service account.

Question 74

Last week, a company deployed a new App Engine application that writes logs to BigQuery. No other workloads are running in the project. You need to validate that all data written to BigQuery was done using the App Engine Default Service Account.

What should you do?

Options:

A.

1. Use Stackdriver Logging and filter on BigQuery Insert Jobs.

2. Click on the email address in line with the App Engine Default Service Account in the authentication field.

3. Click Hide Matching Entries.

4. Make sure the resulting list is empty.

B.

1. Use Stackdriver Logging and filter on BigQuery Insert Jobs.

2. Click on the email address in line with the App Engine Default Service Account in the authentication field.

3. Click Show Matching Entries.

4. Make sure the resulting list is empty.

C.

1. In BigQuery, select the related dataset.

2. Make sure the App Engine Default Service Account is the only account that can write to the dataset.

D.

1. Go to the IAM section on the project.

2. Validate that the App Engine Default Service Account is the only account that has a role that can write to BigQuery.
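
Options A and B differ only in whether matching entries are hidden or shown; the underlying idea is a Cloud Logging query for BigQuery insert jobs that were not authenticated as the App Engine default service account. A sketch of the same check with the Logging client library (the project ID is hypothetical):

```python
from google.cloud import logging

PROJECT_ID = "my-project"  # hypothetical
client = logging.Client(project=PROJECT_ID)

# BigQuery audit log entries for insert jobs NOT made by the App Engine default SA.
log_filter = (
    'resource.type="bigquery_resource" '
    'AND protoPayload.methodName="jobservice.insert" '
    f'AND NOT protoPayload.authenticationInfo.principalEmail="{PROJECT_ID}@appspot.gserviceaccount.com"'
)

unexpected = list(client.list_entries(filter_=log_filter))
# An empty list means every insert job was run by the App Engine default service account.
print(f"{len(unexpected)} entries written by other identities")
```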

Question 75

You manage one of your organization's Google Cloud projects (Project A). A VPC Service Controls (VPC SC) perimeter is blocking API access requests to this project, including Pub/Sub. A resource running under a service account in another project (Project B) needs to collect messages from a Pub/Sub topic in your project. Project B is not included in a VPC SC perimeter. You need to provide access from Project B to the Pub/Sub topic in Project A using the principle of least privilege.

What should you do?

Options:

A.

Configure an ingress policy for the perimeter in Project A and allow access for the service account in Project B to collect messages.

B.

Create an access level that allows a developer in Project B to subscribe to the Pub/Sub topic that is located in Project A.

C.

Create a perimeter bridge between Project A and Project B to allow the required communication between both projects.

D.

Remove the Pub/Sub API from the list of restricted services in the perimeter configuration for Project A.
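
Option A references an ingress policy on Project A's perimeter. The structure below mirrors the YAML you would pass to gcloud, written out as a Python dict purely for illustration; the service account, project number, and method selector are assumptions.

```python
# Hypothetical ingress rule: allow only Project B's service account to call the
# Pub/Sub API on resources inside Project A's perimeter.
ingress_policy = {
    "ingressFrom": {
        "identities": [
            "serviceAccount:subscriber@project-b.iam.gserviceaccount.com"
        ],
        "sources": [{"accessLevel": "*"}],  # any source outside the perimeter
    },
    "ingressTo": {
        "resources": ["projects/111111111111"],  # Project A's project number
        "operations": [
            {
                "serviceName": "pubsub.googleapis.com",
                "methodSelectors": [{"method": "*"}],
            }
        ],
    },
}
```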

Question 76

You want to use the gcloud command-line tool to authenticate using a third-party single sign-on (SSO) SAML identity provider. Which options are necessary to ensure that authentication is supported by the third-party identity provider (IdP)? (Choose two.)

Options:

A.

SSO SAML as a third-party IdP

B.

Identity Platform

C.

OpenID Connect

D.

Identity-Aware Proxy

E.

Cloud Identity

Question 77

Your team sets up a Shared VPC Network where project co-vpc-prod is the host project. Your team has configured the firewall rules, subnets, and VPN gateway on the host project. They need to enable Engineering Group A to attach a Compute Engine instance to only the 10.1.1.0/24 subnet.

What should your team grant to Engineering Group A to meet this requirement?

Options:

A.

Compute Network User Role at the host project level.

B.

Compute Network User Role at the subnet level.

C.

Compute Shared VPC Admin Role at the host project level.

D.

Compute Shared VPC Admin Role at the service project level.

Question 78

You want to update your existing VPC Service Controls perimeter with a new access level. You need to avoid breaking the existing perimeter with this change, and ensure the least disruptions to users while minimizing overhead. What should you do?

Options:

A.

Create an exact replica of your existing perimeter. Add your new access level to the replica. Update the original perimeter after the access level has been vetted.

B.

Update your perimeter with a new access level that never matches. Update the new access level to match your desired state one condition at a time to avoid being overly permissive.

C.

Enable the dry run mode on your perimeter. Add your new access level to the perimeter configuration. Update the perimeter configuration after the access level has been vetted.

D.

Enable the dry run mode on your perimeter. Add your new access level to the perimeter dry run configuration. Update the perimeter configuration after the access level has been vetted.

Question 79

A customer’s data science group wants to use Google Cloud Platform (GCP) for their analytics workloads. Company policy dictates that all data must be company-owned and all user authentications must go through their own Security Assertion Markup Language (SAML) 2.0 Identity Provider (IdP). The Infrastructure Operations Systems Engineer was trying to set up Cloud Identity for the customer and realized that their domain was already being used by G Suite.

How should you best advise the Systems Engineer to proceed with the least disruption?

Options:

A.

Contact Google Support and initiate the Domain Contestation Process to use the domain name in your new Cloud Identity domain.

B.

Register a new domain name, and use that for the new Cloud Identity domain.

C.

Ask Google to provision the data science manager’s account as a Super Administrator in the existing domain.

D.

Ask customer’s management to discover any other uses of Google managed services, and work with the existing Super Administrator.