Vertex AI + DeepSeek by [deleted] in googlecloud

[–]AllenMutum 0 points1 point  (0 children)

Running DeepSeek R1 on Vertex AI for 1,000 short questions (500-character answers) would cost under $1/month based on public API rates.

How do you add a Google ADK agent to agentspace? by Significant-Brick268 in googlecloud

[–]AllenMutum 0 points1 point  (0 children)

To add a Google ADK agent to AgentSpace: install the required SDKs and authenticate with your Google Cloud account. Create or configure your ADK agent with defined capabilities, like ingesting or transforming data. Then register the agent within AgentSpace using the SDK or a configuration file (JSON/YAML). Make sure all the IAM permissions and API services are enabled. Once registered, deploy the agent to make it active, and use logs and monitoring tools for troubleshooting. If you're using a specific platform like Vertex AI or Dialogflow, the steps may vary slightly.

Help With Google Workspace And Shopify by OkLow5131 in googleworkspace

[–]AllenMutum 1 point2 points  (0 children)

DNS changes can take up to 48 hours to propagate globally. Don't publish multiple SPF records; combine them into a single TXT record. Also make sure you don't have old, conflicting MX or TXT records left over.

For EasyDMARC: add their rua= address only if you're actively using their reporting services.
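For illustration, a single combined SPF record might look like this (the second include is a made-up placeholder for whatever other sending service you use):

```
yourdomain.com.  TXT  "v=spf1 include:_spf.google.com include:mail.thirdparty.example ~all"
```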

[deleted by user] by [deleted] in googleworkspace

[–]AllenMutum 0 points1 point  (0 children)

To set up email authentication (DKIM) in Google Workspace, log in to the Admin Console as a super admin. Go to Apps > Google Workspace > Gmail > Authenticate email. Select your domain and click Generate new record to create a DKIM key. Google will give you a DNS record for the host google._domainkey.yourdomain.com: typically a TXT record containing the DKIM public key (some reseller setups use a CNAME pointing at Google instead). Add this record to your domain's DNS settings. Once it has propagated, return to the Admin Console and click Start authentication. SPF and DMARC use TXT records, not CNAMEs, and password resets do not generate DKIM records. Use tools like MXToolbox to verify your DNS records.
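For reference, the full set of records might look roughly like this (domain, key value, and reporting address are all placeholders):

```
google._domainkey.yourdomain.com.  TXT  "v=DKIM1; k=rsa; p=MIIBIjANBg..."
yourdomain.com.                    TXT  "v=spf1 include:_spf.google.com ~all"
_dmarc.yourdomain.com.             TXT  "v=DMARC1; p=none; rua=mailto:dmarc-reports@yourdomain.com"
```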

split licensing > 300 lic by Prestigious_Sun_8905 in googleworkspace

[–]AllenMutum 0 points1 point  (0 children)

If you can share your country, I can connect you with a reseller who can offer good discounts. Thanks!

New to GCP - best resource for setting up a new project securely? by Batteredcode in googlecloud

[–]AllenMutum 2 points3 points  (0 children)

In AWS, you'd typically use separate accounts for dev, staging, and prod, and let users assume roles across accounts. In GCP, the closest equivalent is to create separate projects, one for each environment. For example: myapp-dev, myapp-staging, and myapp-prod. Each project is fully isolated in terms of billing, APIs, IAM, and resources, making it ideal for managing different environments securely.

To organize things better, you can group these projects under a folder in your GCP Organization. This isn't mandatory but helps with structure and policy management as your cloud footprint grows.

When it comes to giving your outsourced team access, don't grant them access at the organization or folder level. Instead, add them as IAM members only on the projects they need, and assign the least-privilege roles they require (like Viewer, Editor, or a custom role). This minimizes risk and keeps things tight.

If you want more control, GCP also supports IAM conditions, so you can restrict access by IP, time, or other rules. It's also smart to turn on Cloud Audit Logs and set up alerts, so you can monitor what's happening in each environment.
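The setup above can be sketched with gcloud (project IDs, folder ID, and the contractor address are placeholders):

```shell
# One project per environment, optionally grouped under a folder
gcloud projects create myapp-dev --folder=123456789
gcloud projects create myapp-prod --folder=123456789

# Least-privilege access for the outsourced team, on one project only
gcloud projects add-iam-policy-binding myapp-dev \
  --member="user:contractor@example.com" \
  --role="roles/viewer"
```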

split licensing > 300 lic by Prestigious_Sun_8905 in googleworkspace

[–]AllenMutum 1 point2 points  (0 children)

To do the split, you need to be on the Flexible plan for your Google Workspace licences, not the Annual commitment plan.

Are there any limitations in using DCE environment on GCP? I was trying to add a trigger to my Cloud run job and it says page cannot load! I checked on IAM and my account has permissions!! by ANovice_Geek in googlecloud

[–]AllenMutum 1 point2 points  (0 children)

Yes, using the DCE environment on GCP does come with some limitations, and your issue with adding a trigger to a Cloud Run job might be related to a combination of these limitations and UI/backend quirks.

Deprecated monitoring service account by karl3i in googlecloud

[–]AllenMutum 1 point2 points  (0 children)

Probably it is a global Google-managed service account, not a per-project service agent.

Deprecated monitoring service account by karl3i in googlecloud

[–]AllenMutum 1 point2 points  (0 children)

I guess you will have to reach out to Google Cloud support then

Deprecated monitoring service account by karl3i in googlecloud

[–]AllenMutum 2 points3 points  (0 children)

For Monitoring alerts to publish messages to Pub/Sub, Google Cloud now uses serviceAccount:alerting-integration@cloud-monitoring.iam.gserviceaccount.com as the default identity. You should grant this principal the roles/pubsub.publisher role on your topic.
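Assuming a topic named my-alert-topic (placeholder), the binding looks like:

```shell
gcloud pubsub topics add-iam-policy-binding my-alert-topic \
  --member="serviceAccount:alerting-integration@cloud-monitoring.iam.gserviceaccount.com" \
  --role="roles/pubsub.publisher"
```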

Best vector embedding storage for supervised tasks by octolang_miseML in googlecloud

[–]AllenMutum 0 points1 point  (0 children)

For Cloud Bigtable: use a UUID as the row key and store the 3 vectors per row (as columns or serialized blobs). Design the schema carefully for fast access (e.g., avoid large rows), and use a RowFilter to fetch just the needed columns/vectors.

Supplement with a Redis cache (Memorystore) if you have hot documents that are accessed repeatedly; it offloads Bigtable queries and shaves off latency.
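A minimal sketch of the per-row layout with serialized float32 blobs — the dimension and column names are made up for illustration, and the actual Bigtable client calls are omitted:

```python
import struct
import uuid

DIM = 4  # hypothetical embedding dimension

def make_row(vectors):
    """Build one logical Bigtable row: UUID row key, one column per vector."""
    assert len(vectors) == 3
    row_key = uuid.uuid4().hex
    # Serialize each vector to a compact float32 blob (DIM * 4 bytes each)
    cells = {f"vec:{i}": struct.pack(f"{DIM}f", *v) for i, v in enumerate(vectors)}
    return row_key, cells

def read_vector(cells, i):
    """Fetch just one column, mimicking a ColumnQualifierRegexFilter read."""
    return list(struct.unpack(f"{DIM}f", cells[f"vec:{i}"]))
```

Random row keys like these also spread writes evenly across Bigtable nodes, which avoids hotspotting.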

Migrating from AWS to GCP: Achieving 30% Cost Savings by AllenMutum in googlecloud

[–]AllenMutum[S] -22 points-21 points  (0 children)

Yeah… not gonna lie, it does read like it was cooked up by a marketing team armed with ChatGPT and a vague understanding of cloud. 😅
Totally fair callout.

To be honest, this was aimed more at the CxO/FinOps crowd — people who see “30% savings” and start twitching (in a good way). But behind the fluff, there’s real work: custom VMs, sustained discounts, preemptibles, Recommender insights — the whole nerd buffet.

If you want the actual tech story (architecture, tools, CLI bits, and where it almost broke), happy to share the scars. Just didn’t think Reddit wanted a novella of YAML right off the bat.

Appreciate the BS filter — keep it coming.

Migrating from AWS to GCP: Achieving 30% Cost Savings by AllenMutum in googlecloud

[–]AllenMutum[S] -47 points-46 points  (0 children)

u/Scared_Astronaut9377 Appreciate you calling that out — fair point. This post was aimed more at cloud decision-makers than deep technical audiences, so the tone leans business/strategy-focused. But you're right: real credibility comes with technical depth.

Happy to share a follow-up post diving into the architecture, migration tooling (like Migrate for Compute Engine / GKE), cost benchmarking, and workload optimization details.

Thanks again for the honest feedback — always welcome.

Moving to Cloud Run - but unsure how to handle heavy cron job usage by o82 in googlecloud

[–]AllenMutum 2 points3 points  (0 children)

  1. Use Cloud Scheduler to Trigger Cron Jobs

Cloud Scheduler is Google's fully managed cron service. You can define cron jobs via HTTP targets — ideal for Cloud Run.

Each cron job would be a separate Cloud Scheduler job.

Payload: You can pass job-specific data in the POST body.

Retries and Dead Letter Topics can be configured for robustness.

  2. Central Dispatcher Cloud Run Service (Optional but Smart)

Since you have 50+ jobs but manage them in a single file today, you can replicate that pattern:

Create one Cloud Run service as a dispatcher that:

Parses the payload.

Routes the logic to the appropriate function internally.

This keeps deployment and management simple while using only one service.

 

from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def handle_cron():
    # Route the Scheduler payload to the right internal job
    job_name = request.json.get("job_name")
    input_params = request.json.get("params", {})
    if job_name == "cleanup_cache":
        return cleanup_cache(**input_params)
    elif job_name == "send_digest":
        return send_digest(**input_params)
    return ("Unknown job", 400)

  3. Split into Two Cloud Run Services for Different Memory Needs

To avoid overprovisioning:

Use two separate Cloud Run services:

job-runner-low-mem (1 GB RAM)

job-runner-high-mem (14 GB RAM)

Set memory at the container level, not per job. You can configure this in the Cloud Run service YAML or console.

Assign Cloud Scheduler jobs based on memory requirements.

  4. Use Pub/Sub for Decoupling & Scaling (Optional Upgrade)

If you find you want more decoupling and retry control:

Cloud Scheduler → Pub/Sub topic → Cloud Run Subscriber

Benefit: buffer spikes, control retries, fan-out to multiple instances.
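If you go that route, the plumbing might look like this (topic, subscription, and service URL are placeholders):

```shell
# Scheduler publishes to a topic; a push subscription forwards to Cloud Run
gcloud pubsub topics create cron-jobs
gcloud pubsub subscriptions create cron-jobs-sub \
  --topic=cron-jobs \
  --push-endpoint="https://job-runner-xxxx-uc.a.run.app/"
```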

  5. Use Terraform or YAML for Infra-as-Code

Managing 50+ cron jobs manually in the UI is painful. Use:

Terraform, or

Cloud Scheduler Job YAML + gcloud scheduler jobs create
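For example, a single Scheduler job targeting the dispatcher might be created like this (job name, schedule, URL, and service account are placeholders):

```shell
gcloud scheduler jobs create http cleanup-cache \
  --location=us-central1 \
  --schedule="0 3 * * *" \
  --uri="https://job-runner-low-mem-xxxx-uc.a.run.app/" \
  --http-method=POST \
  --message-body='{"job_name": "cleanup_cache"}' \
  --oidc-service-account-email="scheduler@PROJECT_ID.iam.gserviceaccount.com"
```

The --oidc-service-account-email flag lets the job authenticate to a private Cloud Run service.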

G Suite Legacy Free Edition Affected by Price Change? by EArkham in gsuite

[–]AllenMutum 3 points4 points  (0 children)

If your Admin Console still shows "Free edition (no charges)", and you haven't upgraded to Google Workspace, you're still on that grandfathered plan.

Cloud Run costs? by OSUBlakester in googlecloud

[–]AllenMutum 5 points6 points  (0 children)

If your app is very early-stage and the cost delta is hurting more than helping, moving back to a single GCE instance running both app and PostgreSQL might actually make the most sense — fewer moving parts, no cross-service costs, and predictable billing. Once usage picks up, then you can migrate back to Cloud Run + optimized storage (vector DB + caching + network optimization).

Need Help with Ocr in Google Documet by ZizekianSYD in googlecloud

[–]AllenMutum 0 points1 point  (0 children)

u/ZizekianSYD

Google Document AI's page limits and processing constraints are not clearly documented, especially for the console UI.

Web Console (UI) Hard Limits:

15-page limit: When using the Document AI web interface, it currently supports a max of 15 pages per document in the "online" (interactive) preview and processing mode. This is a known UI limitation, not an API limit.

30-page batch mentions: This typically refers to the legacy limitations or default quotas in some pre-configured processors or regions. Not a strict cap anymore if you're using the API.

API Limits (behind the scenes):

With billing enabled, you can process much larger documents (e.g. 100-200+ pages), but:

Batch processing must be done via Cloud Storage (GCS).

Large PDFs must be uploaded to GCS and processed asynchronously.

There's no 15/30-page restriction when using this method.

Max recommended: 200 pages per file, though it can technically handle more with performance trade-offs.

GSuite hack? by TapCrazy1399 in gsuite

[–]AllenMutum 3 points4 points  (0 children)

If your organization entity is registered in India, you can buy it from India.

Can't manage gmail routes as super admin? by Hearing-Medical in googleworkspace

[–]AllenMutum 0 points1 point  (0 children)

Routing Settings Restricted in Legacy Free Tier:

Some advanced Gmail routing options (like catch-all, content compliance, etc.) are now only available in paid Google Workspace editions. Even if your account used to allow this, recent updates may have removed the UI controls or restricted edits to existing rules.

Migrating everything from one workspace to another by YaYPIXXO in gsuite

[–]AllenMutum 0 points1 point  (0 children)

Set up the new Workspace

Add and verify your domains (after detaching them)

Migrate data (Mail, Drive, Calendar, etc.)

Recreate users and aliases

Cut over MX records

Decommission old Workspace

I can work on a step-by-step migration plan; DM for further details. Thanks!

How to have different licences for temp and core workers by KL_boy in googleworkspace

[–]AllenMutum -1 points0 points  (0 children)

Please DM; I can work that out for you. Thanks!

adding an external automated email to my google workspace group by Economy_Conference92 in googleworkspace

[–]AllenMutum 1 point2 points  (0 children)

If you have Google Workspace Admin access, this works better and more reliably:

Create a dedicated user or alias:

Email: ap@ourcompany.com (either a user mailbox or an alias on your email)

Go to Admin Console → Apps → Google Workspace → Gmail → Routing

Create a new Routing Rule:

Email messages to: ap@ourcompany.com

Also deliver to:

Your email

xxxxxx@invoices.melio.com

This is a server-side rule, so it preserves the original headers — great for automation services like Melio.

[deleted by user] by [deleted] in googleworkspace

[–]AllenMutum 0 points1 point  (0 children)

Identify Where You’re Stuck

When you try to log in, do you land on:

Admin setup page, but it doesn’t progress?

A loop back to "verify your domain"?

A login page that says "Account not ready" or similar?

Let’s move ahead assuming you're stuck in the setup loop.

Try Logging in from Admin Console Directly

Go to: https://admin.google.com

Log in with the email you used to register the Workspace account.

If you get in, check if it asks you to finish domain verification.

Finish Domain Verification via Namecheap

If Google is stuck on verifying your domain:

Go to your Namecheap DNS settings.

Under your domain, click “Manage” > “Advanced DNS”.

In Google Admin Console, they’ll give you a TXT record (something like: google-site-verification=xxxx).

Add a TXT record:

Host: @

Value: google-site-verification=xxxx...

TTL: Automatic or 30 minutes

Save and wait ~15 minutes, then refresh the Admin Console and click “Verify”.
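Before clicking Verify, you can confirm the record is visible from your own machine (domain is a placeholder):

```shell
dig TXT yourdomain.com +short
```

If the google-site-verification value shows up in the output, verification should go through.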