/r/googlecloud


The go-to subreddit for Google Cloud Platform developers and enthusiasts.


We do not allow advertising of your job posting, product, or software without active discussion and/or an attempt to solve a problem. We are fine with you soliciting feedback on something you're working on or something you've written, but drive-by advertising will get your post removed. :)

More Google Cloud sub-reddits

Other cloud sub-reddits


47,661 Subscribers

1

ETL using GCP Services

I want to create an ETL process for accounts in Facebook Ads, Google Ads, TikTok Ads, etc. Say I have around 100 accounts in total; I want to get campaigns, ads, etc. and their insights data into BigQuery, but I am not sure which services to use.
I have worked with Cloud Composer, but that is a bit expensive. Cloud Functions is an option, but we might hit the timeout limit as the number of accounts grows. There is also Dataflow, but I am not sure it should be used for this case.
Any help would be appreciated.

Also, the ETL would be scheduled, but should also be triggerable from the frontend.
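One pattern that sidesteps the Cloud Functions timeout is fan-out: a scheduler-triggered function publishes one Pub/Sub message per small batch of accounts, and a worker function processes each batch independently. A minimal sketch of the batching side (names and message shape here are illustrative, not a specific API):

```python
import json


def chunk_accounts(account_ids, per_message=10):
    """Split the full account list into small batches so each
    worker invocation stays well under the function timeout."""
    return [account_ids[i:i + per_message]
            for i in range(0, len(account_ids), per_message)]


def build_messages(account_ids, per_message=10):
    """One Pub/Sub message payload per batch; a worker function
    subscribed to the topic extracts insights only for its batch
    and loads rows into BigQuery."""
    return [json.dumps({"accounts": batch})
            for batch in chunk_accounts(account_ids, per_message)]
```

Adding accounts then adds messages rather than runtime per invocation, so 100 accounts or 1,000 makes no difference to any single function's wall clock.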

1 Comment
2024/05/12
18:35 UTC

1

How to resolve private DNS for Apigee X with Cloud VPN

We have a setup where the client's DC is on-prem and Apigee X is in the cloud. We have configured a VPN tunnel between the cloud and their local network. Everything works fine when calling IPs, but we are not able to resolve DNS names. Note: the customer doesn't want to pay for Cloud DNS.

Have any of you faced a similar scenario, and how did you solve it?

Also, BTW, since Apigee X is fully managed by Google, we can't SSH into the machines.

0 Comments
2024/05/12
18:06 UTC

1

Any good fully managed API management software for SMBs?

I'm looking for a centralized API management platform for things like documentation, monitoring, rate-limiting, etc. (like Tyk, Kong, Apigee). I just want it to be simple, with no deployment hassle, but still flexible. Team size ~20. The only solutions I see are too expensive or complicated.

Has anyone else had this problem? What are y'all using?

25 Comments
2024/05/12
16:57 UTC

0

GCP Cloud Run Weird Behavior NextJS

I recently created a Next.js app and deployed it to Cloud Run. It is a simple app that plays audio of someone saying a word and checks whether you can spell it. Instead of randomly selecting words, however, it serves the same word every time; when I test on my local machine, this does not occur. Next.js is not caching the responses, as I have that disabled. For some reason, the API route outputs the same value every time it is called. How do I fix this?
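A common cause of "same value on every request" is that the random choice runs once at build or cold-start time instead of per request (Next.js route handlers can also be statically cached unless explicitly marked dynamic). The pitfall itself is language-agnostic; a Python sketch of the difference:

```python
import random

WORDS = ["cat", "banana", "rhythm"]

# Pitfall: evaluated once when the module loads (or when a framework
# pre-renders the route at build time), then reused for every request.
FROZEN_WORD = random.choice(WORDS)


def frozen_handler():
    """Always returns the same word chosen at import time."""
    return FROZEN_WORD


def per_request_handler():
    """Correct: pick the word inside the handler so every request
    gets a fresh draw."""
    return random.choice(WORDS)
```

Locally the process restarts constantly during development, which hides the bug; on Cloud Run one container instance serves many requests, so the import-time value sticks.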

8 Comments
2024/05/12
16:01 UTC

10

Vertex AI best practices: my model trains for days

Hi all, I am new to Vertex AI. I have a custom NN model that I package in a Docker image and run on Vertex AI. My training goes on for 3-4 days on average, and I am not sure if this is OK or not. Run time on my compute instances was also measured in days.

Is it because my data is too large? Should I break it into batches? Is my model script not optimised?

This is for in-house experimenting; no deployment needed.

Thanks
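Multi-day runs are often a data-pipeline problem rather than a hardware problem: full-dataset passes blow out memory and stall on I/O. Whatever the framework, the usual fix is streaming mini-batches through the model; a framework-neutral sketch of the idea:

```python
def minibatches(dataset, batch_size):
    """Yield successive batches instead of feeding the whole dataset
    through the model at once; epoch time then scales with data size
    while peak memory stays bounded by the batch size."""
    for start in range(0, len(dataset), batch_size):
        yield dataset[start:start + batch_size]
```

In practice you'd also profile one epoch first (data loading vs. forward/backward time) before buying bigger accelerators, since an input pipeline that can't keep the GPU fed looks exactly like "training is slow".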

8 Comments
2024/05/11
22:03 UTC

0

I'm on the free trial with the $300 credit and I activated my full account. If I create a VM with a GPU, will I be charged outside the $300 credit, or will charges be drawn from the $300, with my card billed directly only once it runs out? I'm totally new to this.

2 Comments
2024/05/11
18:05 UTC

0

Price too high

Hey, hi everyone. I'm building an app, and since I'm not an expert in Google Cloud services I hired someone from a freelance website to set up the cloud infrastructure. We are using two environments, one for test and one for production, using Docker and Cloudflare. The app is still only in test and the monthly cost is around 350 to 400. Is this normal? We are using only a few services, like 4 or 5. Since I'm not an expert, could the person I hired be using my services for something other than my project? Also, in the Cloudflare login logs I can see too many locations being used: China, Russia, India, America, etc. Thank you.

17 Comments
2024/05/11
17:56 UTC

8

Best Practices for Performance of Hosted Website on GCP

Hi peeps !

I'm looking to gather some insights & share experiences on how to best optimize a website running on Google Cloud Platform. I've implemented several strategies already, but I’m keen to learn from the community about any additional precautions, procedures, or steps that might enhance performance further.

Here’s what I’ve done so far:

  1. Optimize images/videos/media
  2. Minify CSS & JavaScript
  3. Enable GZIP/Brotli compression
  4. Caching strategies
    • I've extensively configured caching on my Load Balancer and set up Cloud CDN to improve content delivery speed.
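For step 3, it's worth sanity-checking what compression actually buys on your own payloads before tuning further. A quick stdlib experiment (gzip here; Brotli typically does somewhat better on text):

```python
import gzip

# Repetitive markup, like most HTML, compresses extremely well.
html = b"<html><body>" + b"<p>hello cloud</p>" * 500 + b"</body></html>"

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)  # bytes on the wire vs. original
```

Running the same measurement against a real page from your site tells you whether the compression layer is configured end-to-end (check the `Content-Encoding` response header from the CDN, not just the backend).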

I'm curious to know what other precautions or steps you might be taking to ensure top-notch performance of your GCP-hosted websites. Any specific GCP tools or configurations you find indispensable? Would love to hear your tips and experiences!

Thank you for taking time out of your day and reading my post !

Anything helps 😀

4 Comments
2024/05/11
16:06 UTC

5

Authenticating enterprise customers to Artifact Registry?

What is the best way to allow enterprise customers to authenticate to our artefact repository to pull Docker images? Assume it'll be done via some sort of automation on their end, not a human-driven process.

Reading this: https://cloud.google.com/artifact-registry/docs/docker/pushing-and-pulling#key

Would it be providing a service account they can load into their system?

I can see there is a standalone credential helper, but I don't understand how it works. Where are the credentials created, and where are they retrieved from?

Thanks!

Edit:

OK, it looks like the credential helper is for when you don't want to use gcloud specifically. But I'd still need to provide a credential such as a service account. So would I be better off providing them a service account directly and letting them use the service-account-key method to auth to Docker?
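If you do hand customers a service account key, the Docker side uses Artifact Registry's documented key-based usernames (`_json_key` for the raw key file, `_json_key_base64` for the encoded variant); the credential helper is essentially automation for producing such a password. A small sketch of preparing the base64 variant (the key content stands in for whatever key you exported):

```python
import base64
import json


def docker_auth_for_key(key_json: str) -> dict:
    """Docker login credentials for Artifact Registry's documented
    `_json_key_base64` method: the username is a fixed literal and the
    password is the base64-encoded service account key file content."""
    encoded = base64.b64encode(key_json.encode()).decode()
    return {"username": "_json_key_base64", "password": encoded}
```

The customer's automation then runs `docker login` against the registry host with those values; no gcloud install needed on their side. Keys are long-lived, though, so short-lived `oauth2accesstoken` logins are the safer trade where their tooling can mint tokens.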

3 Comments
2024/05/11
08:13 UTC

1

Push Gmail notifications to a Java backend

My backend is written in Java, and I want an endpoint in my backend application to be activated when I receive an email. I researched and discovered Pub/Sub, but I didn't quite understand how to connect it to my application with Java and Spring Boot.
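Whatever the framework, a Pub/Sub push subscription is just an HTTPS POST to your endpoint carrying a JSON envelope whose `message.data` field is base64-encoded. A Spring `@PostMapping` controller does the same decode this Python sketch shows (for Gmail `watch` notifications, the decoded data is itself JSON containing `emailAddress` and `historyId`):

```python
import base64
import json


def decode_push_envelope(body: bytes) -> dict:
    """Unwrap a Pub/Sub push delivery: parse the JSON envelope,
    base64-decode `message.data`, and parse the inner JSON payload
    (for Gmail notifications: emailAddress + historyId)."""
    envelope = json.loads(body)
    data = base64.b64decode(envelope["message"]["data"])
    return json.loads(data)
```

The flow is: call Gmail's `users.watch` pointing at a Pub/Sub topic, create a push subscription on that topic whose endpoint is your controller's URL, then use the `historyId` from each notification to fetch what changed via the Gmail history API.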

2 Comments
2024/05/10
23:05 UTC

1

Cloud Vision AI QR Code Decoding

Just have a question. I'm trying to use Vision AI for a Python project that extracts text and QR codes from an image. I know Vision AI can detect 2D barcodes (QR codes), but I can't seem to figure out whether it can decode them. Does anyone know if this is the case, or am I doing something wrong?

2 Comments
2024/05/10
21:36 UTC


13

Why do companies use billing exports to BigQuery?

I am wondering why users/companies use the billing export feature when most (if not all) of the data is provided without charge in the Billing Dashboard and Reports tab of Cloud Billing.

I understand that data storage in BigQuery incurs minimal fees… but querying the data still incurs charges.

Wondering if anyone out there could tell me why they use the exports feature as opposed to (or in addition to) the built-in Billing features?

23 Comments
2024/05/10
20:38 UTC

6

Confused on Tier 2 CASA verification?

I'm trying to get a Laravel web application verified with Google, and they are requiring Tier 2 CASA verification.

You are required to complete a CASA Tier 2 security assessment for your application (project number: redacted) by the following date: redacted. This assessment is required annually; to learn more, please visit the CASA website.

CASA assessment is done on a "first-come-first-served" basis. This can take up to 6 weeks depending on how engaged and responsive you are throughout the process. Hence we strongly suggest you get started with the assessment as soon as possible. To learn how, please read the instructions below.

You have the following options to complete your assessment:

1 - Tier 2 Authorized Lab Scan

For your Tier 2 CASA assessment you may contact our CASA authorized preferred partner TAC Security, with whom we have negotiated a discounted rate for Tier 2 CASA assessments. Alternatively, you may also contact any other CASA authorized lab to conduct your Tier 2 CASA Assessment.

2 - Tier 3 CASA Assessment

You can also opt-in to complete a Tier 3 assessment, by contacting CASA authorized TAC Security, or any other CASA authorized lab.

CASA Tier 3 is a comprehensive assessment that tests the application, the application deployment infrastructure and any user data storage location.

Tier 3 assessments have the following benefits:

  • Conducted and validated by the authorized labs, giving your application high assurance of compliance with the CASA standard
  • If your application is listed on the Google Workspace Marketplace, you will receive an independent security verification badge

For any questions on the Tier 2 or Tier 3 Authorized Lab Scan/Assessment, or if you need a due date extension, please reach out to your CASA authorized lab.

Useful resources

Refer to the following documentation for more information:

CASA Website, CASA Tiering, Other Tiers Process

Important! Once you have addressed the issues above, reply directly to this email to confirm. You must reply to this email after fixing the highlighted issues to continue with the app verification process.

Based on posts from users here, it's not necessary to pay $540 yearly for a lab to do the assessment; we can just do it ourselves, but I have no idea where. I have successfully completed the static and dynamic scans, so I'm ready to go, I just don't know where to post the results.

The App Defense Alliance website says to "Follow the emailed instructions to create an account (if this is your first CASA) and login." But I never got any emailed instructions to access a portal. https://appdefensealliance.dev/casa/tier-2/complete-submit

This whole process has been Kafkaesque so any help is greatly appreciated!

1 Comment
2024/05/10
20:30 UTC

2

Google Cloud Storage Image Loading Issue 403 Error with v3 Signer API Authentication

I'm new to Google Cloud Storage (GCS). I've been trying to set up my personal blog website, which uses images. For hosting the images, I use a GCS bucket behind a load balancer with CDN caching.

When I try to load any blog post with images, the images from GCS return a 403 Forbidden error when the v3/signer API fails to authenticate. I want to make sure that a visitor without any Google login can view the images in my blog posts.

Recently I did following with my GCS bucket:

  • Added CORS policy.

[
    {
        "origin": ["https://link-to-my-blogpost.com"],
        "responseHeader": ["Content-Type"],
        "method": ["GET"],
        "maxAgeSeconds": 3600
    }
]
  • Updated bucket permissions (access control) to fine-grained object-level ACLs. Earlier it was set to uniform.
  • After this I ran a command to update the ACL of the bucket:

gsutil -m acl -r set public-read gs://my-bucket-name

  • Public access is subject to object ACLs.

I'm still facing 403 forbidden error due to which images are not getting loaded on my website. It would be a great help if anyone can help me figure out what I'm missing. Thanks!

Originally posted on StackOverflow - https://stackoverflow.com/questions/78461929/google-cloud-storage-image-loading-issue-403-error-with-v3-signer-api-authentica

0 Comments
2024/05/10
19:55 UTC

6

Python apps very slow on Google Cloud Run

I have deployed a Python service on Google Cloud Run and it is taking way more time than it does in my local Docker container. I'm literally just installing pandas, pydantic and fastapi. Running the same API in a local Docker container takes around 600-900 ms, but on Cloud Run the response time is around 3.5 seconds, which is very slow. I have already tried:

  1. Choosing the region closest to my location.
  2. Setting min instances = 1.
  3. Removing unnecessary libraries from requirements.txt (it's just 3 packages: pandas, pydantic and fastapi).
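With min instances already at 1, the usual remaining suspects are heavy module-level imports re-running on each cold start, and request-time CPU throttling (Cloud Run only allocates full CPU while a request is in flight unless CPU is set to "always allocated"). A sketch of measuring import cost and deferring it, with stdlib `json` standing in for a heavy package like pandas:

```python
import importlib
import time


def timed_import(module_name: str) -> float:
    """Measure how long a module takes to import. Heavy top-level
    imports run on every cold start of the container."""
    start = time.perf_counter()
    importlib.import_module(module_name)
    return time.perf_counter() - start


def handler(payload: dict) -> str:
    # Deferring a heavy import into the handler keeps cold starts fast;
    # warm requests pay nothing because Python caches imported modules.
    import json  # stand-in for a heavy dependency such as pandas
    return json.dumps(payload)
```

If warm requests are also slow (not just the first one), look at the CPU allocation setting and the instance's CPU/memory size before anything else.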
19 Comments
2024/05/10
17:12 UTC

3

Vertex AI "Failed to create pipeline job"

I have been trying to train a simple classification model using Vertex AI, following this video: https://www.youtube.com/watch?v=aNWCzyCK4Us&t=311s

I started off with a newly created Google Cloud account, so no changes to anything. But I was not able to create the pipeline job, getting this error (the account running the job has Owner IAM permissions):

Failed to create pipeline job. Error: Permission 'aiplatform.metadataStores.get' denied on resource '//aiplatform.googleapis.com/projects/390064272340/locations/europe-west2/metadataStores/default' (or it may not exist).

I am not sure that IAM perms actually has anything to do with this, but after playing around with them I have an account with these perms:

"AI Platform Admin", "Compute Admin", "Organization Administrator", "Owner", "Service Account Admin", "Storage Object Admin", and "Vertex AI Administrator"

But I still get the same error.

I am very new to Google Cloud, so I am not sure I am even looking in the right place.

Does anyone know what I have to do?

1 Comment
2024/05/10
15:12 UTC

3

Books or trainings for Google Cloud Certified Professional Cloud Network Engineer

Hi guys

Any recommendations for good book references and on-demand courses/trainings for the Google Cloud Certified Professional Cloud Network Engineer? I can't seem to find any good exam guides or books for this cert; what I do find for GCP is mostly for Cloud Engineer, Cloud Architect or Data Engineer.

0 Comments
2024/05/10
11:52 UTC

17

What is it like to work at SADA, honestly?

I know SADA is great, but what are the work-life balance and culture like? How's the pay compared to Google Cloud PS?

23 Comments
2024/05/10
11:51 UTC

2

Is IPv6 billed under free-tier for GCP compute VMs?

3 Comments
2024/05/10
10:47 UTC

0

Struggling to create service account key

Hi, very new to Google Cloud and struggling to create a service account key. The policy that prevents keys from being created is enforced; I've tried to turn it off, but I get this message.

You need permissions for this action.

Required permission(s): orgpolicy.policies.create, orgpolicy.policies.delete, orgpolicy.policies.update and orgpolicy.policy.get

I have no idea how I give the only account in this entire google environment the permissions to disable this policy.

There is one user who is a super admin and owner. No other users.

Any help for a google cloud newbie would be highly appreciated!

11 Comments
2024/05/10
10:46 UTC

3

Tracking version-control changes to Python BigQuery notebooks on GitHub

I have been doing some Python development in BigQuery notebooks in BigQuery Studio. I would like to store these notebooks in GitHub and have their changes tracked. Unfortunately, I can't seem to find a solution that isn't manually downloading the notebooks and uploading them to GitHub.

Does anyone have any experience doing something similar? What options do I have? I'm hoping to automate this.
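BigQuery Studio notebooks are ordinary Jupyter `.ipynb` files (which are JSON), so once a scheduled job can fetch them, a small script can also flatten each one to a `.py` file for readable GitHub diffs and commit both. A sketch of the flattening step:

```python
import json


def notebook_to_script(ipynb_json: str) -> str:
    """Flatten a .ipynb file (JSON) to a plain Python script by
    concatenating the code cells; markdown cells are skipped.
    Committing this alongside the notebook keeps diffs reviewable."""
    nb = json.loads(ipynb_json)
    chunks = []
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            chunks.append("".join(cell.get("source", [])))
    return "\n\n".join(chunks)
```

The remaining piece is the fetch-and-push loop, e.g. a scheduled job that downloads the notebooks and commits via the GitHub API; the exact fetch mechanism depends on where BigQuery Studio stores your notebooks, so treat that part as something to verify for your setup.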

4 Comments
2024/05/09
18:51 UTC

0

Presales as fresher

I recently joined a company as a presales engineer, right after graduation. I want advice on whether I should go into presales as a fresher or get experience as a developer first. I am not a person who likes coding; mostly I just scratch my head whenever I code, but most people tell me you can learn coding once you're on the job. Thinking about the later stages of my career, if I never grasp coding I might not get far ahead. At the same time, I feel like getting hands-on experience is necessary if I want to continue in presales. I am really confused, please help me out!

12 Comments
2024/05/09
18:23 UTC

2

Function costs ramping up from April, tips would be greatly appreciated

Hi guys,

So I just checked my bills, and from mid-April I noticed an increased daily cost from Cloud Functions, specifically "Idle Min-Instance Memory Allocation Time" and "Idle Min-Instance CPU Allocation Time". Is there a way I can get an idea of which specific service is causing the spike? There is a fixed amount of cost every day, without variation.

6 Comments
2024/05/09
18:05 UTC

10

Why don't the big cloud providers allow pulling from external docker registries?

It seems that most of the bigger cloud providers don't allow pulling images from an external Docker registry for some reason. It would make things so much easier than having to push into their internal registries. Is there a reason for this? Other providers such as DigitalOcean allow connecting directly to external Docker registries.

17 Comments
2024/05/09
17:53 UTC

1

GCP Newbie - How to create an Organization Policy to require Tags/Labels?

Hello there, my background is in Azure, and I'm finding things a bit strange in GCP land. I'm trying to do something that I considered quite simple: I want to require that certain folders in my organization have specific Tags or Labels. I'm not sure of the difference between those, but in Azure I used Tags.

I've gone into Organization Policies, but where I would expect to see "Create Policy" or similar, all I have is "+Custom Constraint".

Have I missed something glaringly obvious?

0 Comments
2024/05/09
15:47 UTC

1

gcloud storage copy - VM User vs. Service Account?

I'm learning Google Cloud Platform (GCP).

My question is: When I run this command on a virtual machine (VM):

gcloud storage cp gs://<MY_BUCKET_NAME_1>/cat.jpg gs://<MY_BUCKET_NAME_2>/cat.jpg

Does the gcloud command use my VM's user account or a service account to copy the file? How can I find out which account is used?
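On a VM where you never ran `gcloud auth login`, gcloud falls back to the instance's attached service account, obtained from the metadata server; `gcloud auth list` shows which account is active. You can also ask the metadata server directly for the attached account's identity — the endpoint and required header below are the documented ones, and the sketch only builds the request without sending it:

```python
import urllib.request

# Documented metadata-server endpoint for the default service account's email.
METADATA_URL = ("http://metadata.google.internal/computeMetadata/v1/"
                "instance/service-accounts/default/email")


def build_metadata_request() -> urllib.request.Request:
    """Prepare the metadata-server request; the Metadata-Flavor header
    is mandatory or the server refuses to answer. Only works from
    inside a VM, so the request is constructed but not sent here."""
    return urllib.request.Request(
        METADATA_URL, headers={"Metadata-Flavor": "Google"})
```

On the VM itself, `urllib.request.urlopen(build_metadata_request()).read()` would return the service account email that storage operations are performed as (subject to that account's bucket permissions and the VM's access scopes).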

7 Comments
2024/05/09
13:16 UTC

4

Spark Compass migrates to Google Cloud with Aliz, boosting performance & reliability.

0 Comments
2024/05/09
12:31 UTC

4

How to automate shutting off my GCP VM when it's idle?

I'm having billing issues because I'm being charged for VMs that are idle or not connected, so I'm looking for a way to automatically check a VM and shut it off if it's idle. Can I use Cloud Scheduler to check whether my VM is idle and shut it off if it is?
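Yes — a common shape is Cloud Scheduler triggering a Cloud Function that reads CPU utilization from Cloud Monitoring and calls the Compute Engine `instances.stop()` method when the VM looks idle. The decision logic might look like this (thresholds and window are illustrative, and the Monitoring/stop calls are left out of the sketch):

```python
def should_shut_down(cpu_samples, threshold=0.05, window=6):
    """Decide a VM is idle when the most recent `window` utilization
    samples (fractions 0.0-1.0, e.g. 5-minute averages pulled from
    Cloud Monitoring) all sit below `threshold`. Requiring a full
    window avoids stopping a VM during a brief lull."""
    recent = cpu_samples[-window:]
    return len(recent) == window and all(s < threshold for s in recent)
```

CPU alone can misfire for workloads that are network- or disk-bound while the CPU sleeps, so consider adding those metrics to the check for anything beyond dev boxes.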

4 Comments
2024/05/09
11:58 UTC

2

Automate BigQuery backup/export to GCS and vice versa

Has anyone got documentation, or done automated backups/exports of BigQuery to GCS? I'm looking at using Composer and the BigQuery operator. I can already export to GCS, but I can't seem to go from GCS back to BQ using Composer.

Any tips and tricks?
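For the GCS→BQ direction in Composer, the usual tool is `GCSToBigQueryOperator` (from the Google provider's transfers module) rather than the plain BigQuery operator. The sketch below only builds the configuration values such an operator (or a raw BigQuery load job) needs — bucket, prefix, and table names are illustrative:

```python
def gcs_to_bq_load_config(bucket: str, prefix: str,
                          dataset: str, table: str) -> dict:
    """Assemble the pieces of a BigQuery load job restoring an export
    from GCS: wildcard source URIs matching the exported shards, the
    destination table, and a format matching what the export produced."""
    return {
        "source_uris": [f"gs://{bucket}/{prefix}*.avro"],
        "destination_table": f"{dataset}.{table}",
        "source_format": "AVRO",          # must match the export format
        "write_disposition": "WRITE_TRUNCATE",
    }
```

The common gotcha is a format mismatch: the load job's `source_format` has to match whatever format the export task wrote (Avro round-trips types most faithfully; CSV loses schema information).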

1 Comment
2024/05/09
09:49 UTC
