/r/googlecloud
The go-to subreddit for Google Cloud Platform developers and enthusiasts.
We do not allow advertising of your job posting, product, or software without active discussion and/or an attempt to solve a problem. We're fine with soliciting feedback on something you're working on or something you've written, but drive-by advertising will get your post removed. :)
I failed the PCA exam, and I wanted to pass it before December so I wouldn't have to take the beta. What can I do? Is there any way to get the free voucher?
Hi everyone!
Yesterday I signed up for GCP as a learning project (my company uses it, so I'm trying to learn some stuff) and to host a small Node application that some friends and I use (FoundryVTT, for anyone interested). As such, I'd like to keep using GCP after the free trial is over, but only if I can remain under the Free Tier limits.
I am currently seeing some charges under network and I can't figure out where they are coming from, so I look to you for a little guidance.
Yesterday (over 24 hours ago) I did enter the topology tool on the GCP console; however, after learning it was not free, I avoided it. Today, though, I see some charges for it, so I'm a little confused.
The only other thing i have set up (remotely related to network) is a monitoring dashboard using this PromQL query:
compute_googleapis_com:instance_network_sent_bytes_count
And some alerts using this MQL query:
fetch gce_instance
| metric 'networking.googleapis.com/vm_flow/egress_bytes_count'
| align rate(6h)
| every 6h
| group_by [resource.project_id, resource.instance_id], [cumulative_egress: sum(value.egress_bytes_count * 21600)]
| window 30d
| group_by [resource.project_id], [total_30d_egress: sum(cumulative_egress)]
| condition cast_units(total_30d_egress, "By") > cast_units(524288000, "By")
(Side question: is that one okay?)
Any insights you can provide would be appreciated!
Edit: I also added a firewall rule to allow connections to a specific port.
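On the side question, I can't vouch for the full query semantics, but the unit math checks out: `align rate(6h)` yields bytes/second, multiplying by 21600 (seconds in a 6-hour window) converts back to bytes per window, and 30 days spans 120 such windows. A minimal sketch of that arithmetic (made-up rates, purely illustrative):

```python
# Sanity check of the unit math behind the egress alert (illustrative only;
# the rates below are invented, not real monitoring data).
WINDOW_SECONDS = 6 * 60 * 60        # 21600, the "* 21600" in the MQL query
WINDOWS_PER_30_DAYS = 30 * 24 // 6  # 120 six-hour windows in 30 days

def window_bytes(rate_bytes_per_sec: float) -> float:
    """rate(6h) is in bytes/second; scale back up to bytes per 6h window."""
    return rate_bytes_per_sec * WINDOW_SECONDS

def thirty_day_egress(rates: list[float]) -> float:
    """Sum the per-window byte counts, as the outer group_by/sum does."""
    return sum(window_bytes(r) for r in rates)

# A steady 200 bytes/sec totals ~518 MB over 30 days, just under the
# 524288000-byte (500 MiB) threshold in the alert condition:
steady = [200.0] * WINDOWS_PER_30_DAYS
print(thirty_day_egress(steady))  # 200 * 21600 * 120 = 518400000.0
```

So the threshold comparison is at least dimensionally consistent; whether the 500 MiB figure matches your intended Free Tier budget is a separate question.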
Hey guys,
I am building an app on the Play Store and App Store which uses Google Cloud for its backend.
Some time ago I submitted an application for the Google Cloud Startup program, for the €2,000 credits tier, as I don't qualify for the higher tier. I got rejected because my website is not online (I have the domain, but there's no site on it), while a website is really of no use to me at this point.
I just got called by someone from support about this rejection who told me to send an email to support and just put up a quick and dirty site or even just a small explanation with a link to the app stores, and retry.
Does anyone have experience with a similar situation? Are the requirements for this €2,000 tier really this low? Or do I really need a proper landing page?
Thanks in advance!
Hello all,
Let's say you have a global application load balancer (GLB) with multiple NEGs (paired with Cloud Run) from different regions as its backends.
How do I know if the client IP will be routed to the correct/nearest region?
I am using Connectivity Tests to check if it's routed correctly, but that only tells me whether all backends are reachable.
As the title says, I want to migrate from a regional to a zonal cluster without losing my load balancer IP, as it is set elsewhere and I cannot change it.
Hello! I'm looking for materials to prep for the PSE exam. Any suggestions, advice on how to prepare and tackle the questions? Which sections should I put focus on? How are the questions on the exam worded? I want to take it by end of December. Ideally 5-6 week prep and exam.
#gcp #googleexam #cybersecurity #question
Hello,
I'm looking to find a small, maybe 100 page book that covers the specifics found on the GCP ACE topics list. I'm already going through 2 courses but I'd like a small reference guide I can quickly study on my down time in addition to my notes. Any suggestions would be appreciated!
Hi there,
I have a question regarding GCP Global Load Balancing across multiple projects and regions.
From my understanding of GCP’s Cross-Project Load Balancing documentation, this setup seems to require Shared VPCs. For security reasons, I'd prefer to have isolated VPCs between regions to limit the blast radius in case of security breaches etc.
An alternative approach I’m considering is to set up separate regional external HTTPS load balancers for each region or project and use a Global HTTP(S) Load Balancer to route traffic to each of these regional load balancers. However, I haven't found any documentation confirming that this approach aligns with GCP’s best practices or is supported. How would limiting access from the Global ALB work here too?
Is Shared VPC the recommended solution for this type of cross-region, cross-project setup? And, is there a way to achieve this level of traffic distribution and isolation without Shared VPCs? Coming from an AWS background, I generally avoid VPC peering or sharing unless absolutely necessary, so I’d appreciate any guidance on whether Shared VPCs in GCP might offer security or operational advantages that I’m overlooking.
Thanks!
I’m maintaining a backend service with the following tech stack and operational details:
I aim to enhance service scalability and manage costs effectively. A potential solution is transitioning to a NoSQL database within GCP.
I appreciate any advice, best practices, or experiences with similar projects.
I need to fetch Google Reviews from multiple locations using the Google My Business API in a Google Cloud Function written in Node.js. The function will run daily via Google Cloud Scheduler. The main challenge I’m facing is handling OAuth authentication when the function is executed by a service account.
I have already submitted the access request form and can activate the Google My Business API in my Google Cloud project. However, I’m unclear about how to properly configure OAuth for a service account to access the API. I’ve used service accounts for other Google APIs before, but I’m unsure whether I need to use delegated access or follow a different OAuth flow specifically for the My Business API.
I was expecting more guidance in the documentation about this scenario, but I couldn’t find a clear explanation for using a service account with the Google My Business API. Any help or examples for setting this up would be appreciated.
Hi,
Can someone help me with the below? Our data source system, SAP ECC, was migrated to AWS, but we learned that there are two URLs to connect to SAP over JDBC through Google Dataproc. How can we connect to SAP ECC via JDBC?
Thanks in advance.
I've been wanting to explore Google Cloud and get the certification for my CV for a while, but as I’m still considered a minor in the UK, I’m not sure if I'm allowed to take the exam or even use Google Cloud itself. I wanted to ask if there really is a minimum age requirement to take the exam or use Google Cloud, or if I should consider using Azure or AWS instead. Is the certification even valuable, or should I focus on learning cloud computing at all? I'm pretty good at Python and understand machine learning concepts. Any help is appreciated. I've searched the internet, but I haven't found a clear answer.
I get that it's probably different for GCP exams and Workspace Exams
Hi,
I would like to know if the GCP docs are available as a PDF, or at least if there's a fast way to get them in HTML?
The one below was good, but it's not working. Thanks in advance.
Hey everyone, I am a bit of a beginner when it comes to large data pipelines, and I am looking for any advice.
We get updated data every month from a third-party provider on an FTP site. There are two zip files: one for record deletes and one for upserts. The deletes file is typically small and runs locally with no problem. The new/upserted records file is large, typically between 20-40 GB. Unzipped, it consists of delimited text files, each around 12 GB.
I have a Python script that unzips the archives, then iterates through the files within to do the deletes and the upserts (first removing all indexes from the database table, and recreating them at the end).
I already use GCP for a bunch of different things, which is why I am asking here.
Our current code works well, but we have to run it locally, and it takes close to 15 hours depending on how much data we get from them. Because of this, we often lose progress to dropped internet connections or other complications. I am looking for a more robust, permanent solution.
The only other thing I have tried is running it within Google Colab. However, I hit memory errors on some of the larger upsert files.
Any suggestions would be much appreciated.
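Not a full answer, but on the Colab memory errors specifically: they usually come from materializing a whole 12 GB file (or the whole zip) at once. A hedged sketch of streaming the files straight out of the zip in fixed-size row chunks, so peak memory stays flat regardless of file size; the delimiter, chunk size, and `handle_chunk` callback are placeholders, not details from the original script:

```python
# Stream delimited text files out of a zip in bounded-memory chunks.
import csv
import io
import zipfile
from itertools import islice

CHUNK_ROWS = 50_000  # tune to available memory / your upsert batch size

def iter_chunks(rows, size=CHUNK_ROWS):
    """Yield lists of up to `size` rows from any row iterator."""
    rows = iter(rows)
    while True:
        chunk = list(islice(rows, size))
        if not chunk:
            return
        yield chunk

def process_zip(zip_path, handle_chunk, delimiter="|"):
    """Walk every file inside the zip without extracting it to disk."""
    with zipfile.ZipFile(zip_path) as zf:
        for name in zf.namelist():
            with zf.open(name) as raw:
                text = io.TextIOWrapper(raw, encoding="utf-8")
                reader = csv.reader(text, delimiter=delimiter)
                for chunk in iter_chunks(reader):
                    handle_chunk(chunk)  # e.g. one batched upsert per chunk
```

The same shape ports to a small always-on VM or a long-running batch job reading the zips from Cloud Storage, which would also remove the dependency on your local internet connection.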
I have installed a Helm chart separately on GKE, and I get "failed to pull image" for all pods related to this chart. Other charts, like Grafana, work properly. Has anyone ever faced this issue? Thank you.
I want to drop a database in Cloud SQL, but the problem is the database was created by another user and I do not have credentials for that user. When I try to remove the database from the Cloud SQL interface, I get the error below.
Invalid request: failed to delete database "databaseName". Detail: pq: must be owner of database databaseName. (Please use psql client to delete database that is not owned by "cloudsqlsuperuser").
How can I obtain a user with full superadmin credentials so I can do whatever I want? (It's for a test deployment that I'm playing around with.)
Hi guys.
Recently started using GCP. Need some project ideas to learn more.
We have a fairly large account with Google, billing about 200k per month, so we have a dedicated account manager. When we have complex or urgent issues, we are introduced to an SME. Over the last few years, I have always felt the SMEs are not much help in finding the actual solution to the various issues at hand. E.g.
Every time we have discussed a problem with an SME, I generally find them lacking in good solutions that are cost-effective and fast to perform. Suggestions from the internet or brainstorming produce much better ideas. For all the above issues except 4, we found good solutions that eventually fixed them.
Hello, I completed a course on Coursera and got no-cost access to Google Cloud swag that I can order using a voucher code. However, the cost was not specified when the order process finished, and as a student outside the United States, I can't risk an order that may be expensive: it's stated that I may receive a customs or duties fee notice upon delivery of the product, which I may not be able to pay.
I'm in the Philippines.
Has anyone received this before in the Philippines, and how much was it?
If it's expensive for a non-working student, how do I cancel it? XD
I work for a charity and we had a contractor build a small web app for us, which they did using Firebase under their Google Cloud Organization. We now need to transfer this Firebase Project from their control to ours. Just changing the Project Owner leaves it in their Organization, so I've been trying to create our own Organization (we don't currently use GC, but I have some past experience with it and Workspace). I've created a GC account under our email domain, then tried to set up the Organization, where it clearly states:
"To use Google Cloud, you must use a Google identity service (either Cloud Identity or Workspace ) to administer credentials for users of your Google Cloud resources."
We don't need Workspace, and Cloud Identity has a free tier which is sufficient for us, so I choose "Sign Up For Cloud Identity" and fill out our details, including our Domain Name, at which point it warns:
"Someone at your organization is already using your domain for a Google service. To sign up for a new Google service, you’ll need to verify ownership of this domain."
This stops the process dead, so I follow the link to the help which says I have to "1) Sign up for a Google service with email verification, 2) Verify ownership of your domain, 3) Upgrade to or add the Google service you want to use", where 3) explicitly includes the Cloud Identity free tier using an Essentials account.
So I sign up for a free Google Workspace Essentials Starter account, set up the DNS TXT to verify the domain, but then I hit this part of Step 2:
"If you signed up for Essentials Starter edition in step 1: You'll be asked to upgrade to Enterprise Essentials to finish the domain-verification process."
Wait, whut? Here I was thinking this would be free, but now I have to pay at least £10 p/m? No, wait, there's 4 people who've created Starter accounts with our domain emails, so that's £50 p/m until I can kill the accounts.
What are my options here? Can I upgrade to Enterprise for just 1 month, then downgrade again to Starter, or am I trapped to always be paying Workspace Enterprise which we don't need? (Yes, we qualify for Nonprofit discount, but the paperwork at both ends to do that will take ages.) Would finding and killing the Workspace Starter accounts remove the requirement for Enterprise? We could just create a new Firebase Project without an Organization, but I'd really rather not.
TL;DR: Is there any way through this process where we can avoid paying for Google Workspace just to use the "free" Google Cloud / Cloud Identity features?
I have a static site built with NextJS hosted on Google Cloud Storage, and I'm running into an issue with page refreshes. When I navigate from https://example.com/auth to https://example.com/dashboard?platform=ABC, everything works as expected. But if I refresh the page at https://example.com/dashboard?platform=ABC, I get an error:
<Error>
<Code>NoSuchKey</Code>
<Message>The specified key does not exist.</Message>
</Error>
It seems like Google Cloud Storage is looking for an exact file match with the query string, but can’t find it. Is there a way to prevent this error on page refreshes or handle query parameters correctly?
Hello Community,
I'm starting some Labs on Google Skills Boost but there are only 5 Labs that I can find for free.
Is there some significantly technical Lab that I could do with 0 credits, apart from the Labs below?
#1 - A Tour of Google Cloud Hands-on Labs (Introductory, 45 minutes) - Identify key features of Google Cloud and learn about the details of the lab environment.
#2 - A Tour of Google Cloud Sustainability (Introductory, 60 minutes) - Find out why Google Cloud is the cleanest cloud in the industry by exploring and utilizing sustainability tools.
#3 - Google Cloud Pub/Sub: Qwik Start - Console (Introductory, 30 minutes) - Learn about this messaging service for exchanging event data among applications and services.
#4 - BigQuery: Qwik Start - Console - (Introductory, 30 minutes) - Query public tables and load sample data into BigQuery.
#5 - Predict Visitor Purchases with a Classification Model in BigQuery ML (Intermediate, 75 minutes) - Use data to run some typical queries that businesses would want to know about their customers' purchasing habits.
Additionally, the Learning Paths I've seen for certification preparation only seem to cover the certification structure, not the content itself. That is, none of the learning paths I looked at seem to include any sort of technical Lab.
Am I wrong?
I have a coupon for a Google Cloud exam, but it expires in December, and I need more time to study. Can I schedule the exam now for a date after the expiration? E.g., can I schedule the exam for January now, even though the coupon expires in December?
I'm very new to both Laravel and Google Cloud, but I need to host a Laravel project on Google Cloud as part of a university assignment. I have no idea how or where to start. Can you guys guide me on how to get this done?
I have an app that has FE and BE. In FE you create a link and then you can track clicks for it.
The issue I am having is that some users made some very popular links and now they are accessed sometimes thousands of times per second, which crushes my server. These users are not paying users anymore, but there is no way to remove the links from the internet. So even if I restricted them to make more tracking links, I cannot delete the ones created.
I use Cloud Run and I am wondering if there is a way to filter out traffic coming via a specific link without that being processed via the CPU - maybe it's a stupid question, but I would appreciate your insights. Many thanks!
Learn how to simplify Prometheus metrics scraping in Google Cloud! Explore various methods for collecting metrics from VMs, GKE clusters, and Cloud Run using best practices. Reduce maintenance costs and get rid of toil, such as managing a Prometheus service, by leveraging Google Cloud managed solutions. Read more: https://leoy.blog/posts/scrape-prometheus-metrics-in-gcp/
This post kicks off a series about Google Cloud Managed Service for Prometheus, use of PromQL and Grafana on Google Cloud and more.
Hey there, fellow tech enthusiasts and accidental big spenders! Grab your popcorn, because I've got a tale that'll make your wallet weep and your funny bone ache.
Picture this: Your humble narrator, armed with a shiny new Google account and a trusty debit card, decided to dip his toes into the magical world of Gemini 1.5 Pro. "What could possibly go wrong?" I thought, blissfully unaware of the financial rollercoaster I was about to board.
Fast forward four days - FOUR. WHOLE. DAYS. I log into my GCP console, expecting to see a modest bill for my AI shenanigans. Instead, I'm greeted by a number that made my eyes bulge and my soul leave my body momentarily: $1,310!
That's right, folks. In less time than it takes milk to expire, I've managed to rack up a bill that could buy me a decent used car or a lifetime supply of ramen noodles (which, at this rate, might become my new diet plan).
Now, I'm lost in the labyrinth that is the GCP console, desperately trying to figure out where my money went. It's like a really expensive game of "Where's Waldo?", except Waldo is my savings, and he's nowhere to be found.
So, my dear Redditors, I come to you with two burning questions:
If anyone has navigated these treacherous waters before and lived to tell the tale, your wisdom would be much appreciated. Bonus points if you can recommend a good cardboard box - I might need to start house hunting soon.
Remember, folks: When they say "Go big or go home," sometimes you end up doing both at the same time!
Tried to be a Gemini Pro, ended up a financial zero. Send help (and maybe some spare change).
Hello,
I'm making a website that I duplicate on several subdomains, foo.example.com and bar.example.com. Both websites are hosted on the same server behind a reverse proxy (Traefik, which is similar to nginx). I use OAuth login with Google credentials, but sometimes during the login process the wrong URI is used. If I try to log in on foo.example.com, after the login phase I'm redirected to bar.example.com/auth, and obviously there's an error. But it's random: sometimes it's the right URI, and sometimes not.
However, both subdomains have their own OAuth 2.0 client, and thus their own client ID and client secret. And the callback URIs and origin URIs are correct for both websites.
I'm not sure why I have this problem. Because the correct URI is sometimes used, the problem shouldn't be on the reverse-proxy side. And because they have different OAuth 2.0 clients, the problem shouldn't be in the redirection.