/r/googlecloud
The go-to subreddit for Google Cloud Platform developers and enthusiasts.
We do not allow advertising of your job posting, product, or software without active discussion or an attempt to solve a problem. We are fine with soliciting feedback on something you're working on or something you've written, but drive-by advertising will get your post removed. :)
I have to submit a project tomorrow. I tried creating an account, but I've been stuck at step 2 for the last 5 days. Please help me🥲
Unable to create the $300 free GCP account. I'm stuck at step 2, where payment method verification is done. I entered all the correct details, and after verification my amount was deducted. I'm getting an error that says this action couldn't be completed [OR_BACRR_44]. Please help me.
When someone requests a quota increase, such as for Pub/Sub, is that increase granted for a finite time or permanently? As far as I know from AI, quota increases are mainly managed by an automated system...
I want to create a Google Cloud Platform free trial account. This error message pops up after payment method verification. My amount got debited, but the error still occurs. Please help.
Is there documentation that outlines the limits or restrictions on the number of API keys that can be created in GCP API Gateway?
Edit: Found it, apparently it's 300 per Project https://cloud.google.com/docs/authentication/api-keys#limits
Got stuck using Secret Manager despite trial and error following the doc here. Maybe I wasn't smart enough to make it work, but I also can't help wondering: does making this work need to be so complex? Apparently one has to:
register the secret - fair
give the corresponding service accounts access to these secrets - why the fuck is this necessary? If anyone who can log in can grant the access, what's the fucking point?
use special syntax - apparently, in the build steps you add an availableSecrets field to specify the secret version and the environment variable to use for the GitHub token. Didn't I already list which secrets I need? Then in the args field, you specify the secret using the environment variable prefixed with $$ - why on earth the special treatment here, what is wrong with the good old ${}? And in the args field, you add a -c flag as the first argument; any string you pass after -c is treated as a command ("for more information on running bash commands with -c, see the bash documentation" - another pointless doc pointer).
All this could be avoided if the engineers cared to write some working sample code... I don't understand, what the fuck is the team using to measure documentation success?
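For anyone landing here with the same problem, here's roughly what those doc fragments add up to as a single cloudbuild.yaml. This is a sketch only: the project ID, secret name, and repo URL are placeholders I made up, not anything from the docs.

```yaml
steps:
- name: 'gcr.io/cloud-builders/git'
  entrypoint: 'bash'
  args:
  # -c must be the first argument; the string after it is run as a command.
  # The secret is referenced as $$GITHUB_TOKEN: Cloud Build reserves ${}
  # for its own build substitutions, so $$ escapes the env-var reference.
  - '-c'
  - 'git clone https://user:$$GITHUB_TOKEN@github.com/example-org/example-repo.git'
  secretEnv: ['GITHUB_TOKEN']
availableSecrets:
  secretManager:
  - versionName: projects/my-example-project/secrets/github-token/versions/latest
    env: 'GITHUB_TOKEN'
```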
Hi
I have a Pub/Sub subscription with the following settings:
Delivery type = Pull
Message retention duration = 10 minutes
Expiration period= 14 days
Acknowledgment deadline = 60 seconds
I have also ticked Enable exactly once delivery and Message ordering.
Retry policy is Retry immediately.
I am using Airflow (Composer) to read from the subscription (and write to BigQuery table).
I want to read only the latest messages every time.
Obviously I can make sure in the Airflow script not to write the same message twice, but is there anything I can configure on the subscription so I won't read the same message twice?
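For context on why there's no pure subscription-side setting for this: Pub/Sub is at-least-once by design, and even with exactly-once delivery enabled, a message can be redelivered if the ack deadline expires before the ack lands. The usual answer is consumer-side dedup keyed on the message ID. A minimal sketch (the `Message` class and in-memory set are stand-ins for illustration; in Airflow you'd persist seen IDs somewhere durable, e.g. in the BigQuery table itself):

```python
from dataclasses import dataclass

@dataclass
class Message:          # stand-in for a pulled Pub/Sub message
    message_id: str
    data: bytes

def dedupe(messages, seen_ids):
    """Yield only messages not processed before; record their IDs."""
    for msg in messages:
        if msg.message_id in seen_ids:
            continue            # duplicate redelivery: still ack it, but skip the write
        seen_ids.add(msg.message_id)
        yield msg

seen = set()
batch1 = [Message("1", b"a"), Message("2", b"b")]
batch2 = [Message("2", b"b"), Message("3", b"c")]   # "2" redelivered
first = [m.message_id for m in dedupe(batch1, seen)]
second = [m.message_id for m in dedupe(batch2, seen)]
print(first, second)   # ['1', '2'] ['3']
```

If you need dedup across restarts, the seen-ID store has to outlive the worker, which is why keying the BigQuery table (or a MERGE on insert) tends to be the practical route.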
File Source: Google Drive (file name: varies, but let's say Data.csv)
File Destination: Google Cloud Storage bucket (bucket name: data)
Automation tool: Power Automate
File Overwrite: Yes if file name already exists
Automation Cadence: Weekly
Is this possible? I don't see a GCP or GCS connector in Power Automate. But can I use a HTTP PUT/POST action? I'm not clear on my next steps.
I found these 2 pages about uploading objects using the REST API and the JSON API: https://cloud.google.com/storage/docs/uploading-objects#rest-upload-objects and https://cloud.google.com/storage/docs/json_api/v1/objects/insert but I'm not sure how to translate the information into Power Automate HTTP action inputs. The REST API requires an access token, which makes sense to me, but the JSON API page doesn't mention it at all, so how would the POST action tell my "data" bucket apart from potentially another organisation's "data" bucket if there's no identifier?
My Power Automate flow is currently sitting like this:
Any help would be appreciated.
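On the identifier question: Cloud Storage bucket names are globally unique, so "data" can belong to only one project worldwide, and the bucket name is part of the upload URL itself. And as far as I know the JSON API needs an OAuth access token in the Authorization header just like the other path; it's simply easy to miss on that reference page. A sketch of the request shape a Power Automate HTTP action would need to reproduce (the token value here is a placeholder):

```python
from urllib.parse import quote, urlencode

def build_upload_request(bucket, object_name, token):
    """Build the method, URL, and headers for a JSON API simple upload."""
    base = "https://storage.googleapis.com/upload/storage/v1"
    url = f"{base}/b/{quote(bucket, safe='')}/o?" + urlencode(
        {"uploadType": "media", "name": object_name})
    headers = {
        "Authorization": f"Bearer {token}",   # OAuth 2.0 access token
        "Content-Type": "text/csv",
    }
    # The request body is the raw file bytes.
    return "POST", url, headers

method, url, headers = build_upload_request("data", "Data.csv", "ya29.PLACEHOLDER")
print(url)
```

In Power Automate that maps onto the HTTP action's Method, URI, and Headers fields, with the file content from the Drive connector as the Body. Getting the Bearer token (e.g. via a service account) is the harder part and worth a separate search.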
Hello everyone,
I’m a Certified Databricks Data Engineer and have recently started working with Google BigQuery at my new employer. I’m planning to take the Google Professional Cloud Data Engineer certification exam and would really appreciate guidance, suggestions, and preparation tips from this group.
Specifically:
1. What resources (courses, books, or labs) did you find most helpful?
2. Are there any key topics or areas I should focus on more?
3. Any particular hands-on labs, practice tests, or dumps that helped?
Guys, I'm new to AI, so I would like to know which techniques to use to build a model that can scan data and identify whether it is HIPAA compliant or not?
Any guidance would be appreciated
I’ve completed the Google Cloud Skills Boost path and am eager to begin practicing exams and revising to assess my retention. Reading lengthy questions from practice exams can be extremely challenging for me due to my ADHD. I would greatly appreciate any advice and tips on how to enhance my retention for such a technical exam. Additionally, I’m also preparing for the AZ-500 exam, which adds to my apprehension.
Lately I have been wanting a personalized central place to track an array of my information (banking history, TODO lists, Fitbit history, etc.). I have been tracking most of it in a Sheets file. Is Cloud SQL overkill for personal use?
I have a service account in a GCP project that has domain-wide delegation configured in my Google Workspace (I can see that the OAuth 2 Client ID for the service account in the project matches the Client ID listed in the Google Workspace's Security > API Controls > Domain-wide Delegation list). I want to use this service account within a Cloud Function to create new Workspace users and set their organizational unit paths (by impersonating my actual admin account through that domain-wide enabled service account from w/in the function). Note that the Cloud Function itself runs under a different service account (the default service account for the GCP project that had all the needed permissions to build the cloud function). Could anyone with more experience lmk what is the correct way to authenticate using the delegated service account within the function for this purpose? Not sure where to start.
IDK if this the general process described is the best way to do this, but, again, ultimately all I'm trying to do is use the cloud function to create new users in GSuite/Workspace and set their OU paths.
Thanks.
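In case it helps, the usual pattern (not necessarily the only correct one) is to build credentials for the delegation-enabled service account and call `with_subject()` with the admin's email; that call is what activates domain-wide delegation. A sketch assuming a service-account key file; the key path, admin address, and scope choices are my assumptions, not from your setup:

```python
ADMIN_SCOPES = [
    "https://www.googleapis.com/auth/admin.directory.user",
    "https://www.googleapis.com/auth/admin.directory.orgunit",
]

def build_directory_service(key_path, admin_email):
    """Return an Admin SDK Directory client acting as the Workspace admin."""
    # Imported inside the function so the sketch loads even where the
    # google-auth / google-api-python-client packages aren't installed.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        key_path, scopes=ADMIN_SCOPES)
    # with_subject() is the delegation step: API calls are made as the
    # named admin user, not as the service account itself.
    delegated = creds.with_subject(admin_email)
    return build("admin", "directory_v1", credentials=delegated)
```

Since your function runs as a different service account, you'd either load a key for the delegation-enabled account (e.g. from Secret Manager) or look into keyless impersonation via the IAM Credentials API; I'm not certain which of those the docs currently recommend for Cloud Functions specifically.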
Let me apologize in advance if this question prompts eye rolls or GTFO and RTFM responses. I realize that I'm very new at this, but I have very much tried to solve my question on my own but I'm spinning my wheels by now.
My challenge: I can't seem to associate the Cloud Domain that I've registered with the Cloud Storage buckets and the instance that I've created to get started hosting a static website to showcase my resume and some portfolio examples. Should be pretty easy stuff, right?
I've added every DNS record that I can think of, and added the ownership HTML file to prove ownership, and yet nothing.
I'm sure that I'm missing something small that I've overlooked.
But-- point me in the right direction?
Huge thanks in advance!
My registered domain doesn't show up here, and I can't figure out how to get it to appear.
Just as the title suggests, has anyone set this up? I am attempting to now, but I'm running into SO many issues and errors; the GitHub directions are awful and there are zero resources elsewhere.
How can I preserve the client IP when I am using a Global Application Load balancer?
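With the global external Application Load Balancer, the client IP isn't in the TCP source address (the LB proxies the connection), but it is preserved in the X-Forwarded-For header: the LB appends the client IP followed by its own IP. A sketch of extracting it on the backend (plain Python, no framework assumed); as far as I know only the last two entries are appended by Google, so anything earlier can be spoofed by the client:

```python
def client_ip_from_xff(xff_header):
    """Return the ALB-verified client IP from an X-Forwarded-For value."""
    parts = [p.strip() for p in xff_header.split(",") if p.strip()]
    if len(parts) < 2:
        return None          # not behind the ALB (or header stripped)
    # Last entry is the load balancer's address; the one before it is
    # the client IP the LB itself observed.
    return parts[-2]

# A spoofed entry supplied by the client is ignored:
print(client_ip_from_xff("1.2.3.4, 203.0.113.7, 130.211.3.3"))  # 203.0.113.7
```

(For the TCP proxy load balancers the equivalent mechanism is the PROXY protocol rather than a header, if that's your setup.)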
I took the ACE on Nov 30th and passed, I took the ADP on Dec 1st and I'm still waiting on the results, and I took the PCA yesterday and passed, but I have no professional experience. I have a few projects hosted on GCP, like an AI with a memory system hosted on App Engine and accessible through a GUI on the web, plus a local client with speech and a wake word so I can just leave it on all the time and ask it questions whenever I want.
Where's that put me though? Are employers going to look at it and think "snatch that guy up" or is it just gonna be like all the other certifications I see people posting on LinkedIn where you get some congrats and maybe a person or two to add as connections?
I don't have a degree, but I am currently in a Computer Science program with a focus on Artificial Intelligence. I don't know how that's gonna look either with it not being completed. I'm wanting to apply to some jobs, but not if I'm just gonna be wasting my time. If I did start applying, what jobs should I be focused on? I want to work in AI/ML, but I'd be happy with anything even adjacent to that until I get my degree. Almost anything in tech would look good on a resume, right?
I don't know anyone in tech let alone cloud computing or AI and I actually can't think of ever even meeting someone who is IRL. If I ask ChatGPT it'll say go for it but it wouldn't be the first time that it gave me advice that didn't line up with the reality of the situation.
I'd appreciate any advice you can give me as long as it's constructive.
I only have 6 potential poll options so have missed out Professional Security Engineer, Professional Network Engineer, Professional Machine Learning Engineer and Foundational Cloud Digital Leader, so upvote my comments of these certs if they're your choice.
If you're not planning to take a cert or thinking of taking something more platform-independent, tell me more below!
Maybe someone is using this system: https://github.com/albertcht/python-gcs-image. I lost the records where I wrote down the names of the images in Cloud Storage and the codes that this system returns. Is there a way to find the original names of the images?
Hi, so I need to use both for my bachelor's thesis. However, I am confused about the configuration.
Is there anybody I can ask?
How do I make a custom monitoring dashboard, specifically to observe a VPS's network, disk I/O, and other metrics and logs, when the VPS isn't hosted on GCP?
I keep getting this error while trying to register a device. I'm trying to get an authorization file so I can use Google Assistant on Windows. I tried to create a new project, I tried a new browser. What do I do?
Probably a stupid question, but how does VPC SC know the Project and the VPC network an API call is originating from? What are the safeguards against a caller spoofing the Project & VPC information?
So, for example. Let's say I have Storage API in project A protected by a service perimeter. And outside of the perimeter, there's a VM located in a project B in VPC network B. And there's an Ingress policy that allows access to Storage API from Project B and VPC network B. Private Google Access is enabled, so the call goes through the Google's network.
So, when an API call to the Storage API comes in, how are the originating projects and VPC networks determined? Are they somehow encoded in the request? Or in an access token? Or is it taken care of at the Andromeda / SDN level?
Cloud Architect with GCP Expertise: How Can I Become a Google Partner or Join Their Startup Program?
I’ve been blogging for a while, and my posts have garnered over 600,000 views so far, with about 2,000 monthly visitors. Recently, a publication with over 2 million monthly views reached out to me to publish articles with them.
Feedback from platforms like Medium, Substack, and others consistently highlights my ability to articulate complex technical topics in an engaging way.
Now, I’m exploring ways to monetize my blog and leverage my skills for a side hustle. I’m thinking of expanding beyond just writing—maybe creating e-books, workshops, or a technical community.
I’d love advice from anyone who’s turned content creation
I’m a cloud architect and SRE with deep expertise in Google Cloud Platform (GCP). Over the years, I’ve worked extensively on building secure foundations, automating infrastructure with Terraform, and optimizing cloud operations for scalability and reliability.
I’m now exploring ways to either become a Google Premium Partner or get into the Google Startup Program to expand my opportunities and network.
For those who’ve been through this journey:
I'd love to hear your insights or experiences!
Any recommendations for good PCA practice tests?
Would like to hear experience from those who have taken the exam.
Hi, I'm building cost dashboards for GCP (the product already works for AWS/Azure). My GCP account is fresh and doesn't have much spend or many resources, hence the need to collaborate with a good business/company/user with $5k-10k monthly spend. Needless to say, my solution will be free for them. If they already have AWS/Azure, they will see more value. Let me know if anyone is interested in being a beta tester.
Now there is a bucket of 5 terabytes, and we are thinking about how to reduce the cost. These are all pictures from ads, and not all of them are frequently accessed. I'd like an opinion from anyone who has already used or optimized such a system.
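One common approach, sketched here with illustrative thresholds (the 30/90-day ages are made up; tune them to your access patterns): a bucket lifecycle policy that moves rarely-read images to colder storage classes.

```json
{
  "lifecycle": {
    "rule": [
      {"action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
       "condition": {"age": 30}},
      {"action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
       "condition": {"age": 90}}
    ]
  }
}
```

Applied with e.g. `gsutil lifecycle set lifecycle.json gs://BUCKET`. Autoclass is also worth a look if you'd rather have GCS pick storage classes per object automatically; colder classes carry retrieval fees, so check how often old images actually get read first.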
Google Cloud Skills Arcade: how do you complete labs fast? Is there any way to complete them faster to earn points?
For example, for me it takes a couple of days.
The project is pretty "typical" (whatever that is): GKE, a database, networking, Redis, Cloud Tasks.
I am looking for ways to bring this time way down.
Are there any good ways to do it?