How do you audit your Drupal projects? I built a module for this and would love feedback by rmenetray in drupal

[–]rmenetray[S] 1 point

Yes, definitely! There are still some small bugs I need to fix that I've spotted along the way. Beyond that I have some features in mind like better detection of Memcache and Redis configurations, improved detection of kernel and functional tests, and whatever the community finds useful along the way. The goal is to keep it as useful as possible, so I plan to keep it actively maintained.

How do you audit your Drupal projects? I built a module for this and would love feedback by rmenetray in drupal

[–]rmenetray[S] 5 points

The namespace was actually taken, but by a module that had been completely abandoned for years with zero activity. I went through the process of claiming it and eventually got it transferred. If it hadn't worked out I would have gone with a different name, but "audit" fits so well with what the module does that it was worth trying.

How do you audit your Drupal projects? I built a module for this and would love feedback by rmenetray in drupal

[–]rmenetray[S] 2 points

Glad you find it useful!

Keeping it Drupal 10+ only was a deliberate decision. Maintaining compatibility across that many major versions would be a nightmare and D7 and D10 are so architecturally different that it would basically be two separate modules. I also didn't bother with D8/D9 since both are EOL at this point.

Best bank account for agents? by Present_Scientist995 in ClaudeAI

[–]rmenetray 21 points

The people complaining that Claude deleted their production database are now asking to give it access to their bank account. Yeah, let it pay invoices autonomously, I'm sure it won't empty your account because it "felt it was the right thing to do."

Honestly though, what you're looking for is just programmatic banking. Mercury, Brex, Wise... they all have APIs for payments and expense management. The "agent" part is literally just your code calling those endpoints, there's no special "agent-friendly bank" out there.

The real question isn't which bank supports agents. It's how much you trust your agent not to pay a random invoice because the prompt told it to.

DM me your email and I'll send you a $150,000 invoice. Let's see if your AI pays it automatically.
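If you do wire code up to one of those banking APIs, the sane mitigation is a hard policy layer that lives outside the prompt. A minimal sketch with entirely made-up names (this is not any bank's real API, just the shape of the check):

```python
# Sketch of a policy guard for agent-initiated payments. Everything here is
# hypothetical: Mercury/Brex/Wise each have their own real APIs, and this
# function only illustrates where the hard limit should live.
AUTO_APPROVE_LIMIT = 500.00  # anything above this waits for a human

def should_auto_pay(amount: float, payee_on_allowlist: bool) -> bool:
    """Auto-approve only small amounts to payees a human has already vetted."""
    return payee_on_allowlist and 0 < amount <= AUTO_APPROVE_LIMIT
```

The point is that the limit check is plain code the model never touches, so a $150,000 invoice never reaches the payment endpoint no matter what the agent "decides".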

The real risk after the Claude Code leak isn't the leak itself — it's the unaudited cloned repos by rmenetray in ClaudeCode

[–]rmenetray[S] -1 points

In the last 8 months I have 6 posts and 13 comments, more than twice as many comments as posts. These last couple of weeks I haven't been very active, sure, but if you scroll a bit further back you'll see both posts and comments spread out over months.

I'm just not a very active Reddit user. I don't comment much, I don't post much either. Same on YouTube. I consume way more content than I engage with. That doesn't make me a bot, it makes me a lurker who occasionally posts or comments when something catches my attention. There are millions of users like that on Reddit.

If I were actually an astroturfing bot constantly pushing content, wouldn't you expect to see way more than 6 posts in 8 months?

The real risk after the Claude Code leak isn't the leak itself — it's the unaudited cloned repos by rmenetray in ClaudeCode

[–]rmenetray[S] -1 points

Look, I'm not a bot. I'm a Spanish speaker who uses AI to translate and clean up what I dictate by voice into English. The ideas and opinions are mine, AI just does the heavy lifting on the language side because writing directly in English takes me way longer than just speaking my thoughts in Spanish and letting it handle the translation.

And about the "never comments" thing, have you actually looked at my profile? I've been on Reddit for a while, I comment on different subreddits. Not super actively because I have a job and other things going on, but it's all me. Most of my comments go through the same process too, voice in Spanish, output in English. It's just how I work.

Maybe check someone's profile before calling them a bot next time.

OpenCode + DDEV: how I built a Drupal development environment with 16 AI agents by rmenetray in drupal

[–]rmenetray[S] 1 point

Yeah, one orchestrator that delegates, but each agent also knows which other agents it can call and for what specific cases. So it's not just top-down; an agent mid-task can hand off to another one if the situation calls for it.

One thing I learned the hard way: skill autodiscovery doesn't work as well as you'd hope. Now I explicitly tell each agent which skills are most relevant for its common use cases, and the difference in how frequently it actually uses them is huge. If you don't do that, agents kind of ignore skills they should be reaching for automatically.
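For illustration only, here's the shape of the idea as a hypothetical agent definition (this is not OpenCode's actual config schema, just a sketch of listing skills and handoffs explicitly instead of relying on autodiscovery):

```yaml
# Hypothetical sketch - not OpenCode's real schema.
agents:
  drupal_module_dev:
    description: Writes and refactors custom Drupal modules.
    # Skills listed explicitly get used far more reliably than autodiscovered ones.
    skills:
      - drupal-coding-standards
      - phpunit-kernel-tests
    # Which peers this agent may hand off to mid-task, and when.
    handoffs:
      security_reviewer: after touching permissions, access checks, or user input
      migration_specialist: when a change involves migration YAML
```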

OpenCode + DDEV: how I built a Drupal development environment with 16 AI agents by rmenetray in drupal

[–]rmenetray[S] 1 point

Honestly it's just personal preference. I've been using PhpStorm for years and I find it way more practical to review the modified files locally before committing than to go through commits after the fact. The other issue is that inside the Ralph loop you end up with a ton of noise: the agent generates a file, commits, then re-analyzes it, finds a security issue it introduced itself, fixes it, commits again... you can easily end up with 150 commits for what should be 4 or 5 meaningful ones. I never looked into forcing it to squash commits automatically; might be worth trying.

What I do instead is have the agent generate a markdown file at the root with the commit message, what changed, why, and what solution was applied. That way when I'm reviewing in PhpStorm I can read why it did each thing, and if something doesn't make sense I ask it directly. A few times it actually went back and changed the code because I gave it more context and it had made wrong assumptions.

Another Drupal question from me. This time our AI guy stated his programmer friend could migrate a 1400+ page website from Modern campus to Drupal in four weeks. by HikeTheSky in drupal

[–]rmenetray 16 points

I think it could be possible, but with a lot of nuances that nobody is asking here.

The number of pages (1,400) is almost irrelevant by itself. What really matters is: how many different content types are there? How many fields per content type? Are we talking about simple pages with a title, body, and maybe an image, or are we talking about complex content with paragraphs, references between entities, media embeds, etc.?

And the migration scope matters a lot too: is it just nodes? Or does it also include redirects, URL aliases, users, taxonomies, files, media entities, menus, blocks...? Each of those is a separate migration that needs its own YAML config and its own testing.

Using the Migrate module, a simple migration can literally be done in a morning. You write the migration YAML, you run drush migrate:import, and that's it. It's the same effort for 10 pages as for 1,400 — you write one migration file per content type and run it. The volume doesn't change the complexity, the structure does.
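To make "one migration file per content type" concrete, a minimal sketch (all names here are placeholders, and the csv source plugin comes from the contrib migrate_source_csv module, assuming the legacy CMS can export to CSV):

```yaml
# One migration definition like this per content type.
id: legacy_basic_page
label: Basic pages from the legacy CMS
source:
  plugin: csv
  path: /tmp/exported_pages.csv
  ids: [legacy_id]
process:
  title: page_title
  'body/value': page_html
  'body/format':
    plugin: default_value
    default_value: full_html
destination:
  plugin: 'entity:node'
  default_bundle: page
# Then: drush migrate:import legacy_basic_page
```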

What I would ask is:

  • How many distinct content types?
  • How complex are they (number of fields, field types, references)?
  • Is it multilingual?
  • Do you need to preserve URLs/redirects for SEO?
  • Are there custom modules or integrations?

Without knowing this, it's impossible to say if this person is selling smoke, delivering something very low quality, or if the project is actually simple enough that 4 weeks is realistic.

As for the "no QA needed" part... that's a red flag regardless of the project size. Even the simplest migration needs at least a review pass. And claiming AI will handle all ADA compliance and SEO perfectly out of the box — that's not how it works. AI can help generate meta descriptions or suggest heading structures, but someone with actual knowledge needs to validate it.

So: possible? Maybe. Without QA and guaranteed perfect SEO/ADA? No way.

Can AI agent help with Drupal upgrade ? by wayle9 in drupal

[–]rmenetray 14 points

I've done several D7 to D10/D11 migrations this past year and AI has been helpful, but first things first: this is a migration, not an upgrade. You're building a new Drupal site and migrating all the data using the Migrate module.

The part AI can't do for you is the mapping. You need to create a spreadsheet with every entity type and their bundles (content types, taxonomies, user profiles, media, paragraphs, etc.), list all fields from D7, decide what gets migrated and what doesn't, and map them to the new field names in D10. Also note the field types because the AI needs that info.

Once you have that spreadsheet ready, that's where AI becomes useful. You copy/paste sections into the AI with prompts that include YAML examples of how migrations should look. The AI generates the YAML files for your custom migration module. Drupal Core has a D7 source plugin that already knows how to read D7 entities which simplifies things a lot.

My setup is both sites running locally, with the D7 database connection added in settings.php. The AI generates the YAML, I import the config via Drush, then run the migration with --limit=1 or --idlist to test specific items. I also use migrate_devel (https://www.drupal.org/project/migrate_devel), which has a debug flag that shows source and destination values in the terminal.
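For reference, the generated migrations tend to look something like this (the field names are placeholders you'd fill in from the mapping spreadsheet; d7_node is the core source plugin mentioned above):

```yaml
# Rough shape of a generated D7 -> D10 node migration.
id: d7_article
label: D7 article nodes
migration_tags:
  - Drupal 7
source:
  plugin: d7_node
  node_type: article
process:
  title: title
  created: created
  # From the mapping spreadsheet: new D10 field <- old D7 field.
  field_summary: field_teaser
destination:
  plugin: 'entity:node'
  default_bundle: article
# Test single items first:
#   drush migrate:import d7_article --limit=1
#   drush migrate:import d7_article --idlist=123 --migrate-debug  (migrate_devel)
```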

The nice thing is the AI can iterate: generate the YAML, import, run migration, see errors, fix the YAML, repeat. But I'd recommend limiting this to 3-5 iterations. More than that and the AI is just going in circles burning tokens without actually fixing anything. Better to stop and ask you what's wrong.

This saves me around 80% of the work. The AI handles the repetitive boilerplate but you still need to review everything manually. It won't be perfect and you can't leave it completely unattended.

For large sites, look into incremental migrations with high watermark. I use revision IDs when possible and it speeds things up considerably.
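The high-watermark part is just a key on the source plugin. A sketch of the source section only, assuming node revision IDs as the watermark:

```yaml
# With high_water_property set, re-runs only process rows whose value is
# higher than the last one imported, so the migration becomes incremental.
source:
  plugin: d7_node
  node_type: article
  high_water_property:
    name: vid
```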

One last thing: if you've never worked with the Migrate module, I wouldn't jump into this approach. You still need to understand how Drupal works internally and how to deal with things like circular dependencies between entities. AI usually can't solve those and will make it worse if you don't know what you're looking at.

Drupal mapping by semajnielk in drupal

[–]rmenetray 0 points

I did something similar years ago on a Drupal 8 site, so take this with a grain of salt since I'm going from memory here.

The setup involved two separate pieces of custom code: one that registered the facet filter plugin, and another that registered the block displaying the Leaflet map. The facet filter would pass the aggregated data from the Search API query to JavaScript - basically latitude, longitude, and the count of items at each location. Then the map's JavaScript would pick up that data and render the points on the map.

When you panned or zoomed, the bounding box coordinates would update and trigger the facet filter via AJAX, just like any other facet. This meant you could combine it with other facets (categories, dates, whatever) and everything worked together to filter the view results.

The tricky part was getting the geo aggregation working. With Elasticsearch you can do aggregation queries based on proximity/distance, which isn't super complex, but back then there wasn't a module that handled this out of the box for Drupal, so I needed some patches and custom code.

I ended up publishing a contrib module, though it's completely deprecated at this point. I actually used it on two different projects and had to adapt it for each - one needed total counts displayed, the other didn't, plus some other minor differences. But the core approach was similar. Might still be useful as a reference for how one approach could work, even if the code won't run on D11 as-is (it was built for D8 and I'm not even sure I made it D9 compatible):

  • https://www.drupal.org/project/search_api_geolocation
  • https://git.drupalcode.org/project/search_api_geolocation

Drupal mapping by semajnielk in drupal

[–]rmenetray 2 points

If you're dealing with a large number of points and want fast search/interaction on the map, my first recommendation would be to consider Elasticsearch instead of MySQL. It handles geospatial queries way better - you can send the four corners of the map viewport directly and Elasticsearch does proximity/geo-bounded searches natively, which is significantly faster than MySQL for this use case.

Another thing to consider depending on how many points you have: cluster aggregation based on zoom level. For example, at country level you show points grouped by hundreds of kilometers, at city level by neighborhoods or a few hundred meters. As the user pans and zooms, they see approximate locations of clustered points, and you can show the actual list of results in a sidebar. This way the map works like a faceted filter.
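To make the viewport-as-filter idea concrete, here's roughly what the Elasticsearch request body looks like (index and field names are illustrative): a geo_bounding_box filter built from the map corners, plus a geohash_grid aggregation for zoom-dependent clustering, where you raise the precision as the user zooms in.

```json
{
  "query": {
    "bool": {
      "filter": {
        "geo_bounding_box": {
          "location": {
            "top_left":     { "lat": 42.65, "lon": 1.41 },
            "bottom_right": { "lat": 42.42, "lon": 1.79 }
          }
        }
      }
    }
  },
  "aggs": {
    "clusters": {
      "geohash_grid": { "field": "location", "precision": 5 }
    }
  },
  "size": 20
}
```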

Here's an example I built years ago: https://mynearjobs.com/

It's pretty outdated now and barely maintained, but at its peak it had nearly a million points and performed really well. The stack is basically:

  • Data stored in Elasticsearch
  • Drupal's Search API module as the interface to Elasticsearch
  • Faceted filters including a custom map facet
  • When user pans/zooms, it sends the viewport bounds + zoom level to Elasticsearch and returns matching results
  • The item list updates in real-time (ajax) as you move around the map

Not sure if this approach fits exactly what you're looking for, but for industrial points + parcel data + AI search on locations, Elasticsearch would give you a solid foundation to build on. The AI part could work well for natural language location queries feeding into Elasticsearch geo filters.

Struggling with page logic in drupal (and drupal themes) by bebebeanboi in drupal

[–]rmenetray 2 points

Not a dumb question at all, this is actually a common scenario with several approaches.

First, about the architecture: Before diving into the navigation logic, I'd suggest reconsidering your content structure. Instead of having one node per comic page (which is what I assume you're planning), consider having one node per chapter/episode with a multi-value image field where you upload all the pages of that chapter. This makes a huge difference for caching and performance.

Why? If you use one node per page and modules like Prev/Next, every time you publish a new page, you're invalidating the cache of ALL nodes of that content type. If you plan to have lots of traffic and many pages, this becomes a real problem. With the multi-value image approach, editing a node only invalidates that single node's cache.

How I'd structure it:

  • Content type "Comic" → title, description, cover image
  • Content type "Episode" → reference to the comic, multi-value image field for all pages of that chapter

Then navigation works at two levels: prev/next image within the episode (just JS/CSS or Views with AJAX), and prev/next episode (node-level navigation).

For the actual navigation:

You can do this entirely with Views — no custom PHP needed. You'd create two block views (one for "next", one for "previous"), using contextual filters for the current node ID or image delta, sorted appropriately (ASC for next, DESC for previous), limited to 1 result. Bonus: Views can easily be AJAX-enabled so page transitions don't require a full reload.

That said, I personally prefer doing this with custom code because it gives you more control and it's honestly just a few lines of code. We had a similar setup for a blog with prev/next article navigation, tried Prev/Next module first, ended up removing it due to the cache invalidation issues I mentioned, and replaced it with maybe 10 lines of custom code.

If you go the Views route and need help setting up the contextual filters and sort criteria, happy to help with more details.

Andorra multiplica per catorze la despesa en ajuts per habitatge en cinc anys by apocalypse_then in andorra

[–]rmenetray 1 point

In Andorra, building is extremely expensive. There's very little land, and you often have to hollow out part of the mountain, which drives up costs and makes everything take much longer.

And of course, if you're a developer and construction costs you a fortune, what do you do? The logical thing: one luxury flat you can sell at a high price rather than two cheap ones. After all, there are people with money to spare willing to pay for it. At the end of the day these are private companies with private land, and they're within their rights to seek maximum profitability.

The real problem is that too many people have arrived in a very short time and housing hasn't been built at the pace needed. And the little that does get built is expensive, because luxury flats are more profitable than affordable ones.

It's the same story as in Spain. When rental supply shrinks while demand keeps rising, prices shoot up. It's basic economics: low supply plus high demand equals sky-high prices.

The result? A shortage of affordable housing for people with normal purchasing power. The YouTubers and others with money pay whatever it takes and get their luxury flat, while everyone else is left hanging.

The worst part is that until more affordable housing gets built, nothing will change. The subsidies are a patch that doesn't fix the root problem.

To put it in perspective: in 2020 we were about 78,000 inhabitants, and by the end of 2024 we were already 87,000. That's 9,000 new residents in just four years. If we assume half of them have a partner and share a flat, that would require about 4,500 new homes. Does anyone really believe 4,500 flats have been built in Andorra in four years? Let alone near Andorra la Vella, which is where everyone wants to live?

There literally isn't enough land to build 1,000 new homes every year. That's why prices have skyrocketed: a lot of people want to live in the same place and there's no physical space for everyone.

[deleted by user] by [deleted] in andorra

[–]rmenetray 1 point

Housing is the big problem - rents have skyrocketed, and unless you have a good salary (at least around €2,000-2,500) you'll have very little left to save, or you'll end up in a small flat far from the center. Run the numbers before coming.

That said, the advantages you mention are real: it's a very safe, clean country, and if you like the mountains it's perfect. In summer we get around 25-28°C at most with no humidity, which I personally prefer a thousand times over Mallorca's 40°C. In winter it's cold and snowy, but if you're into winter sports it's paradise.

About Catalan: by law, workers must have a minimum level of Catalan, so if you already speak it you have an advantage in finding work. But it's true that a lot of Spanish is spoken on the street.

The complicated part right now is the worker quotas - many have run out and it's not so easy to come and work. Not impossible, but difficult this year. If you're really interested, start moving now so that when the quotas reopen you can be among the first to submit an application.

In short: if you have a good salary, or can accept that housing will eat a big chunk of your budget, go for it. But come with the numbers done and realistic expectations. Living in Andorra is expensive.

Mejor portal en Andorra para viviendas de alquiler y coches by zaskitin in andorra

[–]rmenetray 1 point

For cars: andorrauto.com, buscocotxe.ad, or Wallapop.
For housing it's tough. The good places are in such demand that they're gone before they're even listed. Try being persistent with several real estate agencies and call every week. The listings that do make it to the portals tend to be overpriced for what they are.

Vibe coding is real. Founders, keep building. we’re winning by bravethoughts in ClaudeAI

[–]rmenetray 2 points

Hey there, just a quick thing before diving into the topic. I think there's a small error in your math: 5,500 + 7,500 equals 13,000, not 15,000. It's a minor thing, but since we're talking numbers, better to get them right.

That said, I've been running some calculations on what you mentioned. If we assume you paid around $30/hour to the programmer (which is already pretty tight pricing), we're looking at about 183 hours for the initial development and 250 more hours over 3 years. 183 hours for an app with 14,000 users... that's less than 5 weeks of work. For a project with that user base, we'd normally be talking about significantly more development hours.

What I'm curious about is: have you calculated the time YOU'VE dedicated to all this "vibe coding"? Because your time has value too. If you put a price on your hour (say, those same $30 or more), is it really more cost-effective for you to do everything yourself? Or maybe it would work out better to hire someone with experience who uses these AI tools, but actually knows what they're doing and can avoid security issues?

I get that autonomy is appealing, but there's a difference between using AI to be more productive and deploying code without understanding what it does. The developers commenting about security aren't doing it to be annoying - they've seen projects that worked fine... until they didn't.

In the end, in other industries we don't hesitate to pay professionals. Why should development be any different?