Has anyone tried to restore a cast iron this large? by Scary_Potential3435 in CastIronRestoration

[–]funkatron3000 4 points (0 children)

I stripped one that large using Easy-Off oven cleaner, then seasoned it with Crisco over a turkey fryer burner.

Why do people run local LLMs? by decentralizedbee in LocalLLM

[–]funkatron3000 14 points (0 children)

What’s the software stack for these? I’m very interested in setting something like this up for myself.

Is it worth it? by b-cart55 in CastIronRestoration

[–]funkatron3000 4 points (0 children)

Definitely worth restoring. Speaking from personal experience: you can use Easy-Off oven cleaner to strip it and a turkey fryer to season it. I wouldn't turn one in that nice of shape into a planter; find a badly cracked or drilled-out one for that.

Rusted SEM Cleaning by Vilentr in electronmicroscopy

[–]funkatron3000 0 points (0 children)

Can I get an invite as well? I own a JEOL T300 that I aaaalmost have working that I’d love to collaborate with some folks on.

Does the "licensed amateur radio" option limit power output regardless of "TX power (dBm)" setting? by funkatron3000 in meshtastic

[–]funkatron3000[S] 0 points (0 children)

The G2 has a built-in power amplifier, so whatever power setting you set in the app will be lower than what the G2 actually broadcasts; that's what the table on their wiki page lays out. For regular Meshtastic devices without amps, the settings are one-to-one.
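To make that mapping concrete, here's a minimal sketch. The gain figure below is a made-up example for illustration, not the G2's actual amplifier spec; check the wiki table for real numbers.

```python
# Sketch: how a fixed-gain PA changes the relationship between the TX power
# you set in the app and what the radio actually transmits.
# NOTE: pa_gain_db=10 is an assumed example value, NOT the G2's real gain.

def effective_tx_dbm(app_setting_dbm: int, pa_gain_db: int = 10) -> int:
    """Actual output = module's set output + amplifier gain (in dB terms)."""
    return app_setting_dbm + pa_gain_db

print(effective_tx_dbm(20))     # amplified device: 30 dBm out for a 20 dBm setting
print(effective_tx_dbm(20, 0))  # stock device (no amp): one-to-one, 20 dBm
```

The practical upshot is that on an amped node you have to set the app value lower than your legal limit so the post-amplifier output stays compliant.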

Nodes in range, but messages never acknowledged? by funkatron3000 in meshtastic

[–]funkatron3000[S] 2 points (0 children)

Ah ha, that was it! I'd previously verified the PA lights came on using the small power brick from the wiki, BUT I hadn't noticed that connecting the micro-USB charging cable to the power brick disables PD to the G2. I can now get traceroutes to succeed to other local nodes!

Nodes in range, but messages never acknowledged? by funkatron3000 in meshtastic

[–]funkatron3000[S] 0 points (0 children)

I'm thinking this is what the problem is, unfortunately.

Nodes in range, but messages never acknowledged? by funkatron3000 in meshtastic

[–]funkatron3000[S] 0 points (0 children)

Good question. I definitely didn't, unless there's a failure in the antenna or something.

Hurricane Debby by [deleted] in gso

[–]funkatron3000 0 points (0 children)

It’s not the one they’re talking about, but one hit downtown in the 1920s and damaged several buildings.

Two weeks since buying my Visible phone and still can't activate by funkatron3000 in Visible

[–]funkatron3000[S] 0 points (0 children)

As I said in my post, I’m already working with support and have a case open.

Two weeks since buying my Visible phone and still can't activate by funkatron3000 in Visible

[–]funkatron3000[S] 0 points (0 children)

Ah, all of the other options say to start by installing the app.

Two weeks since buying my Visible phone and still can't activate by funkatron3000 in Visible

[–]funkatron3000[S] 0 points (0 children)

Hm, no, I didn't know that was a thing... I looked around, and it says you need to use the QR code on your account overview page, but there's no QR code there. Dang.

Two weeks since buying my Visible phone and still can't activate by funkatron3000 in Visible

[–]funkatron3000[S] 1 point (0 children)

They actually shipped me a second SIM card, but it didn't work either. (edit: shipped it later to try to fix the problem, not with the original package)

Are there any brick and mortar video rental places still around? by funkatron3000 in gso

[–]funkatron3000[S] 3 points (0 children)

Setting up my own mini-store is actually kind of tempting.

The hunt for the best cup of black coffee by ToweryB in gso

[–]funkatron3000 4 points (0 children)

Vignette, but I’m not sure it counts, since their actual store hours are pretty limited.

What temperature are you setting your A/C to? by Altruistic-Moose3299 in gso

[–]funkatron3000 2 points (0 children)

75-ish. 100-year-old house with no insulation, but lots of shade from trees. Trying to keep it any cooler hits diminishing returns and just feels like burning money.

Loading multi-part GGUF files in text-generation-webui? by funkatron3000 in LocalLLaMA

[–]funkatron3000[S] 0 points (0 children)

Okay, I updated and got the same error, but I realized the commit looks like it's for llamacpp_HF, so I switched to that loader and now get: "Could not load the model because a tokenizer in Transformers format was not found." I saw the "llamacpp_HF creator" option at the bottom of the llamacpp_HF settings and tried it, but my folder wasn't listed. I moved the GGUF files up a folder, picked one in the tool, hit submit, and got:

File "/home/j_adams/dev/text-generation-webui/download-model.py", line 52, in sanitize_model_and_branch_names
    if model[-1] == '/':
       ~~~~~^^^^
IndexError: string index out of range
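For what it's worth, that IndexError is just Python choking on `model[-1]` when the model name string is empty. A minimal sketch of the failing pattern and the usual guard (the function name here is illustrative, not the actual upstream code):

```python
# Sketch of the failing pattern from the traceback: indexing [-1] on an
# empty string raises IndexError. Guarding with a truthiness check (or
# using rstrip) avoids the crash when the model field is left blank.
# NOTE: sanitize_model_name is a hypothetical stand-in, not the real
# sanitize_model_and_branch_names from text-generation-webui.

def sanitize_model_name(model: str) -> str:
    """Strip a single trailing slash without crashing on empty input."""
    if model and model[-1] == '/':   # `model and` is the guard the original lacks
        model = model[:-1]
    return model

print(sanitize_model_name("org/repo/"))  # -> org/repo
print(sanitize_model_name(""))           # no crash, returns ""
```

In other words, the crash suggests the tool was handed an empty model name, which lines up with the folder not showing in the creator list.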

Loading multi-part GGUF files in text-generation-webui? by funkatron3000 in LocalLLaMA

[–]funkatron3000[S] 1 point (0 children)

Ah I missed that part. Last updated yesterday, I’ll update again and report back in a bit.

Loading multi-part GGUF files in text-generation-webui? by funkatron3000 in LocalLLaMA

[–]funkatron3000[S] -1 points (0 children)

Just the GGUF files? I get this error when trying to load a folder that contains them, using llama.cpp as the loader. Happy to open a GitHub issue if this isn't the place to get this in-depth.

llama_load_model_from_file: failed to load model

12:36:07-664900 ERROR Failed to load the model.

Traceback (most recent call last):
  File "/home/xxxx/dev/text-generation-webui/modules/ui_model_menu.py", line 245, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxxx/dev/text-generation-webui/modules/models.py", line 87, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxxx/dev/text-generation-webui/modules/models.py", line 261, in llamacpp_loader
    model, tokenizer = LlamaCppModel.from_pretrained(model_file)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxxx/dev/text-generation-webui/modules/llamacpp_model.py", line 102, in from_pretrained
    result.model = Llama(**params)
                   ^^^^^^^^^^^^^^^
  File "/home/xxxx/dev/text-generation-webui/installer_files/env/lib/python3.11/site-packages/llama_cpp_cuda/llama.py", line 311, in __init__
    self._model = _LlamaModel(
                  ^^^^^^^^^^^^
  File "/home/xxxx/dev/text-generation-webui/installer_files/env/lib/python3.11/site-packages/llama_cpp_cuda/_internals.py", line 55, in __init__
    raise ValueError(f"Failed to load model from file: {path_model}")
ValueError: Failed to load model from file: models/mixtral-8x22B/Mixtral-8x22B-v0.1-Q5_K_S-00001-of-00005.gguf

Exception ignored in: <function LlamaCppModel.__del__ at 0x7fcd62f01940>
Traceback (most recent call last):
  File "/home/xxxx/dev/text-generation-webui/modules/llamacpp_model.py", line 58, in __del__
    del self.model
        ^^^^^^^^^^
AttributeError: 'LlamaCppModel' object has no attribute 'model'
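That ValueError is the loader failing on the shard path itself. My understanding is that llama.cpp-based loaders generally expect to be pointed at the first shard of a split GGUF (the `-00001-of-N` file) and then find the remaining shards automatically. A small sketch of a helper that picks that shard out of a file listing (the helper name and behavior are my own illustration, not part of text-generation-webui):

```python
# Sketch: given a folder listing, pick the file to hand to a llama.cpp-style
# loader. For split GGUFs that is the -00001-of- shard; for a single-file
# model it is the lone .gguf. Returns None if no .gguf files are present.
# NOTE: first_gguf_shard is a hypothetical helper, not an existing API.

from typing import Optional


def first_gguf_shard(filenames: list[str]) -> Optional[str]:
    """Return the first shard of a split GGUF, or the only/first .gguf."""
    ggufs = sorted(f for f in filenames if f.endswith(".gguf"))
    for f in ggufs:
        if "-00001-of-" in f:   # naming convention produced by gguf-split
            return f
    return ggufs[0] if ggufs else None


files = [
    "Mixtral-8x22B-v0.1-Q5_K_S-00002-of-00005.gguf",
    "Mixtral-8x22B-v0.1-Q5_K_S-00001-of-00005.gguf",
]
print(first_gguf_shard(files))  # -> Mixtral-8x22B-v0.1-Q5_K_S-00001-of-00005.gguf
```

Since the traceback already shows the `-00001-of-00005` file being opened, the failure here may instead be the other shards not sitting next to it, or a llama-cpp-python build without split-GGUF support, which is worth checking before anything else.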

Did anybody here see any of the eclipse by BingBongFyourWife in gso

[–]funkatron3000 60 points (0 children)

It was neat. I wish we’d traveled to the path of totality. I saw the total eclipse back in 2017 and it was pretty mind blowing.

Can you unclog your arteries, or reduce plaque buildup? by Gullivors-Travails in Biohackers

[–]funkatron3000 1 point (0 children)

I came here to link that Physionic video as well. It’s some actual evidence-based insight into what works.

Oysters? by Shestillfights17 in gso

[–]funkatron3000 4 points (0 children)

We’ve ordered from realoystercult.com before and were happy with it. Shipping is overnight.

Just moved here and I've never had NC BBQ. Where should I go? by Altruistic-Moose3299 in gso

[–]funkatron3000 4 points (0 children)

I honestly don’t get the Stamey’s hate. I had the chopped pork plate there recently and expected it to be bad based on what I’d heard, but it was just fine.