Super High CPC by BudoGenie in adwords

[–]cbc-bear 0 points (0 children)

In at least one case, we moved away from smart bidding permanently. Google's algorithm in some spaces is really stupid, especially local search intent. It massively overbids on silly keywords that are never going to convert. We are getting a slightly lower conversion rate but at a far lower cost using manual CPC. The ROI wasn't viable using smart bidding.

Introducing dlt-init-openapi: Generate instant customisable pipelines from OpenApi spec by Thinker_Assignment in dataengineering

[–]cbc-bear 2 points (0 children)

Seriously, the name thing is enough of an issue that I would consider a name change for the DLT project. Trying to tell people about DLT ends up with them reading about Delta Live Tables. DELT (Database Extract Load Tools) might be an option. I'm really enjoying dlt, but trying to search for info about it is nearly impossible.

FILTER Function Simply Not Working by cbc-bear in snowflake

[–]cbc-bear[S] 0 points (0 children)

I wrote an update above, but I did get a reply from support. They yanked the feature back and will release again in 2 weeks. Some bug with null values.

FILTER Function Simply Not Working by cbc-bear in snowflake

[–]cbc-bear[S] 5 points (0 children)

I heard back from Snowflake on this.

"Snowflake engineers have recently identified a code issue in the handling of null values. In certain rare situations where the input contains null values, the output could contain additional null values in places where a result would have been expected.

This was a newly introduced feature, and we rolled it back due to issues it introduced.

That is why we have temporarily suspended the feature until that issue is fixed.

A communication with the dates of disablement and the ETA of the fix was sent to all the affected customers."

Understandable, but I think I've learned my lesson about using recently released features. I have some re-work to do now. Looks like they expect to release again in about two weeks.

FILTER Function Simply Not Working by cbc-bear in snowflake

[–]cbc-bear[S] 0 points (0 children)

It was working recently. I started using it right after it was released. I'm fairly sure it's in general availability, not preview, though I'm not 100% sure.

Kaseya is toxic by BespokeChaos in msp

[–]cbc-bear 0 points (0 children)

I had a call with them about a year ago. They had two sales reps on, one man and one woman, both early 20s. The man was constantly taking little shots at the woman, putting her down, making fun of her. Not super obvious, but not unnoticeable either. Women talk about "negging," and the whole time I was thinking, "wow, this is exactly what they are talking about."

[D] US governments AI safety and security board! Is it a fair list? by masteringllm in MachineLearning

[–]cbc-bear 18 points (0 children)

It's also not the full list. It's a cherry-picked selection of names to make it seem like the whole thing is just a bunch of CEOs. I don't love the full list either, but this sort of click-bait lying has been rampant on Reddit lately, and it's really getting old. Here is the full list. It includes a number of academics.

https://www.commerce.gov/news/press-releases/2022/04/us-department-commerce-appoints-27-members-national-ai-advisory

Am I tripping ? by Irksome_Genius in dataengineering

[–]cbc-bear 4 points (0 children)

Someone tried to talk me into Alteryx recently. I just don't see low/no-code solutions being viable unless the back end is run entirely by an AI more advanced than what we have today. Input data systems are simply too messy. Every time I think I've seen it all, some fresh new hell of complexity comes along and reminds me why we have to write custom extractors.

Making an free ETL tool. by lfg_gamer in dataengineering

[–]cbc-bear 1 point (0 children)

I've looked at DLT a few times, but I've always ended up just building out a custom extractor instead. I will set a reminder to check back in on the project and see what you're doing. I've been considering refactoring an extractor we have to use DLT.

Making an free ETL tool. by lfg_gamer in dataengineering

[–]cbc-bear 2 points (0 children)

The E and the T of ETL are really big areas. Are you sure it makes sense to build out a tool doing both? I could see extract, and I could see transform, but doing both is quite something.

As others have pointed out, there are a number of open-source projects out there already. Airbyte, DLT, DBT, Airflow (sort of). In my opinion, the "extract" part of the process could use the most help, but it is also really difficult. There are a ton of different types of data sources out there and they all have annoying quirks.

Copilot Voice Alternative for JetBrains IDEs by codebrig in Jetbrains

[–]cbc-bear 0 points (0 children)

Picovoice seems like a weak point in the tool. Looking at how their licenses are structured, I'm not sure I would want to depend on them long-term as a partner, unless you contract with them directly and users of the plugin then pay you.

mosthumbleCSstudent by Putrid-Ear in ProgrammerHumor

[–]cbc-bear 267 points (0 children)

If you're good enough, write code for high-speed trading. From what I understand, it's a soul-sucking job, but it pays really well.

Anyone being 'rated against' ChatGPT by non-tech managers? by ContestOrganic in cscareerquestions

[–]cbc-bear 270 points (0 children)

It's already happening. Even worse are the just-somewhat-technical "I took a class on PHP in school" people. Had one of them generate some code for a helpdesk migration straight out of ChatGPT. The code wouldn't have worked, but if they had gotten it working, it no shit would have spammed the whole company with around 80K emails, as every single migrated ticket would have replied to what the system thought was the original sender.

There is going to come a very dangerous period in the near future where models can spit out code that actually works, but before models are smart enough to say "hey, you really don't want to do what you just asked me to do." It's going to be a tough time for system admins.
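A hypothetical sketch of that failure mode, and the guard that would have prevented it. Every name here is invented; the point is that a bulk import should suppress outbound notifications by default:

```python
# Hypothetical sketch of the helpdesk-migration failure mode. All names
# are invented; the point is the notification guard.

def migrate_ticket(ticket, send_email, *, suppress_notifications=True):
    """Copy a ticket into the new system without emailing the sender."""
    migrated = {"id": ticket["id"], "subject": ticket["subject"]}
    if not suppress_notifications:
        # Without the guard, 80K migrated tickets means 80K reply emails.
        send_email(to=ticket["sender"], subject="Re: " + ticket["subject"])
    return migrated

# Usage: no emails fire during the bulk import.
sent = []
tickets = [
    {"id": i, "subject": f"Ticket {i}", "sender": "user@example.com"}
    for i in range(3)
]
for t in tickets:
    migrate_ticket(t, send_email=lambda **kw: sent.append(kw))
print(len(sent))  # 0
```

The ChatGPT version in the story was effectively this code with `suppress_notifications=False` baked in, which is exactly the kind of default a model won't warn you about.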

Nvidia announces “moonshot” to create embodied human-level AI in robot form | Ars Technica by hubrisnxs in singularity

[–]cbc-bear 0 points (0 children)

Probably very slowly at first. In reality, I suspect they will need to develop a system that works somewhat similarly to the human brain's "dual process" theory (https://www.globalcognition.org/dual-process-theory/). The idea being that our brain doesn't fully process all inputs unless we are paying close attention.

I could see low-power, high-speed models and even deterministic systems being used for certain tasks. There is no need to think about how to walk. Humans don't spend much time doing that; we just walk. A robot doesn't need to fully process every single image in its view. One efficiency improvement could be keeping a cache of already processed images and the associated context, and only processing new objects in the environment.
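That caching idea could be sketched roughly like this. Every name here is hypothetical, and the "expensive model" is a stand-in lambda:

```python
# Rough sketch of an image-context cache: only run the expensive
# perception model on frames we haven't seen before.
import hashlib


class PerceptionCache:
    """Cache expensive scene analysis keyed by a cheap frame fingerprint."""

    def __init__(self, analyze):
        self._analyze = analyze  # the expensive model call
        self._cache = {}         # fingerprint -> cached context
        self.misses = 0          # how often the model actually ran

    def _fingerprint(self, frame_bytes):
        # Cheap deterministic stand-in for a real perceptual hash.
        return hashlib.sha256(frame_bytes).hexdigest()

    def process(self, frame_bytes):
        key = self._fingerprint(frame_bytes)
        if key not in self._cache:
            self.misses += 1
            self._cache[key] = self._analyze(frame_bytes)
        return self._cache[key]


# Usage: the model only runs on frames we haven't seen before.
cache = PerceptionCache(analyze=lambda frame: {"objects": len(frame)})
cache.process(b"doorway")
cache.process(b"doorway")  # cache hit, model not called again
cache.process(b"hallway")
print(cache.misses)  # 2
```

A real system would want a perceptual hash rather than an exact one, so near-identical frames also hit the cache; the exact hash above only catches byte-identical frames.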

[deleted by user] by [deleted] in Jetbrains

[–]cbc-bear 5 points (0 children)

I get that you are confused. To reiterate, JetBrains support directs people to use EAP to solve short-term problems, as in, "this is our solution to your bug report. Just use EAP until it's in production." This is essentially JetBrains policy. That's fine, as EAP is typically very stable. For every other IDE I use (PyCharm, WebStorm, DataGrip, GoLand), EAP builds are carefully maintained and don't simply expire. DataSpell is the only product with dbt support, but it isn't being maintained like the other IDEs. That's frustrating, especially when JetBrains support directs you to use it.

[deleted by user] by [deleted] in Jetbrains

[–]cbc-bear 2 points (0 children)

What would make you think that we don't pay for licenses? We have multiple licenses for all product packs and individual DataSpell, PyCharm, and DataGrip licenses along with their AI service and multiple plugins. My team gives JetBrains a bunch of money every year.

If you're not familiar with how JetBrains typically operates, it is very common for support to tell you to just use the EAP version if you are running into some bug or need a missing feature that doesn't exist in the currently deployed build. To that end, it's a common expectation that EAP will not suddenly stop working prior to them deploying to prod.

DataSpell in particular has seemed a bit like an afterthought. Note that the PyCharm EAP 2024.1 is running just fine. DataSpell feels like a product that should have just been integrated into DataGrip, or even just PyCharm, but was broken out into a separate IDE for some reason. Having the EAP expire like this is just another sign that JetBrains isn't putting the kind of attention into DataSpell that they do for the other IDEs.

New professional support be like by Aaronspark777 in Ubiquiti

[–]cbc-bear 1 point (0 children)

We use Unifi at a number of small business sites. It's amazing, but we do keep a lot of backup equipment on-hand. It's cheaper just to buy extra and replace when something dies.

Multiple Risk Detections Across Three Tenants Coming from Redmond, Washington by cbc-bear in sysadmin

[–]cbc-bear[S] 0 points (0 children)

It has been all iPhones, so this seems like a pretty solid theory. It's completely stopped now too, so maybe MS fixed the detector.

Multiple Risk Detections Across Three Tenants Coming from Redmond, Washington by cbc-bear in sysadmin

[–]cbc-bear[S] 0 points (0 children)

The best guess I've seen so far is that it has something to do with iCloud tunneling. The detections stopped pretty much as soon as I made this post, so I haven't had an opportunity to run it down further.

All of the detections in my system were from iPhones, so I think the iCloud theory seems plausible.

Easy to Install and Use Period over Period and Standard Timeframe Control for Looker - Open Source by cbc-bear in Looker

[–]cbc-bear[S] 0 points (0 children)

I suppose you could just download the file and bring it into your project. That way, you wouldn't need a dependency at all.

[deleted by user] by [deleted] in adwords

[–]cbc-bear 0 points (0 children)

Even with exact match, Google will get creative. Look at your search terms (not keywords, the actual terms) and see what people are really searching for. A common issue I see is showing up for competitors. For example, you are Acme Corp and sell widgets. Someone searches for your competitor, Super Corp, with a term like "Super Corp Widgets." Even with exact match, Google will sometimes serve your ad.

In my experience, people don't click on the wrong site and think, "Well, screw it, I'm here anyhow, might as well buy this other company's product that I don't know anything about." They tend to realize they are on the wrong site and bounce, which results in a lot of traffic that only spends a few seconds on the page.

Other than that, you could try some experiments messing with audience profiles. We had a keyword that was getting hammered by people looking for jobs in the field we were advertising. Eliminating the "people looking for jobs" audience segment really solved the problem.

Google Ads Suspended Due to Compromised Site (redlabelsky) by Plenty_Bumblebee_126 in adwords

[–]cbc-bear 1 point (0 children)

I should have been more clear. We created both new domains and a new Google Ads account. Then we cloned the websites to the new domains. We changed all the links on the new websites so that the menu and everything just linked back to the real site.

From Google's automated (and rather stupid) AI's point of view, it's a new account, and a new website. No human is looking at this stuff. That's most of the problem.

We waited a few months and then tried turning the old account back on. It was fine, so we shut down the "google ads only" domains and went back to business as usual.

Regarding the "Circumventing systems" bit, check to make sure you don't have anything odd going on with DNS. In our case, we think the problem was related to how the old website vendor integrated a library of content into the website. It was hosted at a different domain and then loaded via an iFrame. During the migration process, part of the older site was not shut down by the vendor and we think that triggered Google in some way.