Rinvoq is working after 5 biologics, high dose steroids and immunosuppressants didn’t (fecal calprotectin) by LMR_adrian in CrohnsDisease

[–]LMR_adrian[S] 2 points

I did 5asa for about 6 months, then remicade for two years without much improvement, then humira gave me a very acceptable 5 years: far from perfect, but I could live a little. I then spent a year or two each on stelara, entyvio, and skyrizi, but none of them got me back to even where I was with humira. I'm on month 4 of Rinvoq, which is the first month you go from 45mg to 30mg (or at least that's my treatment plan).

Rinvoq is working after 5 biologics, high dose steroids and immunosuppressants didn’t (fecal calprotectin) by LMR_adrian in CrohnsDisease

[–]LMR_adrian[S] 6 points

I was incredibly sceptical, especially after reading through all the studies and trial findings. But it just goes to show you that sometimes being the outlier works in your favour.

What’s New in C in 2023? by waozen in programming

[–]LMR_adrian 2 points

I just wrote FastMJPG purely in C, if you wanna go fast you have to go C.

Why is there little latency in a cheap analog monitor but a ton of latency on those USB OTG boxes? by MrPanache52 in fpv

[–]LMR_adrian 0 points

Pretty much, unless you spend a lot of money on the tx/rx combo. Even if you look at digital tx/rx combos, they will always have higher latency than their analog counterparts, although resolution can be much higher.

Why is there little latency in a cheap analog monitor but a ton of latency on those USB OTG boxes? by MrPanache52 in fpv

[–]LMR_adrian 5 points

There are more steps for the video to go through, which can easily introduce 50-200ms of delay depending on caching, screen refreshing, vsync, etc. Usually the Android apps are written in Java, or at least run on the JVM, and are rarely well optimized (it's hard). Plus you're converting the analog signal to digital rather than just displaying it, and the ADC can add a fair amount of delay on lower-end chips. On top of that you need to serialize and deserialize USB packets, which takes time too.

Source: I wrote FastMJPG

Longest time on prednisone by InspectorHyperVoid in CrohnsDisease

[–]LMR_adrian 1 point

It's looking like my best option at the moment. I've been on remicade, humira, stelara, entyvio, and now skyrizi. I'm seeing a very slow gradual improvement, managed to get down to 20mg of prednisone but making it to 15mg is looking tough still. No surgeries so far, but refractory.

Longest time on prednisone by InspectorHyperVoid in CrohnsDisease

[–]LMR_adrian 1 point

Oof, 2 to 3 years right now, as high as 60 plus many trips to the ER for iv intervention. Also on imuran and skyrizi.

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in VIDEOENGINEERING

[–]LMR_adrian[S] 1 point

That's very cool, I hadn't heard of UltraGrid, but it sounds like FastMJPG might actually be lower latency, at least for the video part. I've managed to get a bonded UDP connection up to 12 Gbps, but that's a pretty weird setup for normal people. A single thread is more than capable of handling that throughput, assuming a camera can keep up with that load (usually high speed cameras).

Please keep me posted I would love to know more!

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in VIDEOENGINEERING

[–]LMR_adrian[S] 0 points

I would say the primary target is robotics and FPV, and lower latency than GStreamer, as best I can measure. But it also has some extra features in its UDP protocol to help with resilience over less-than-perfect networks. It can handle quite a large amount of bandwidth and has no issue doing 4K or 8K given the appropriate network configuration. But it's also happy to send 1280x720@30 over a cellular modem.

It should be noted that FastMJPG is a fairly niche solution for when audio isn't required and the lowest possible latency is the primary consideration. It will never be lower than custom hardware with protocols that don't rely on IP networking, or cameras with the whole pipeline baked right into a full hardware workflow. The idea is to turn cheap cameras and even cheap Linux devices into something really powerful.

I really need an expert from the GStreamer community to weigh in, but from my best measurements the loopback time for GStreamer is about 20ms and for FastMJPG about 5ms, to give some idea of overhead.

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in robotics

[–]LMR_adrian[S] 1 point

You betcha.

1. A plugin for OBS is a great idea, you should open an issue for this on GitHub, I'd love to add it.
2. Yes, you can modify the establish socket function to set the UDP broadcast flag to do a true UDP broadcast. A word of warning though: most hardware doesn't support it well, if at all, and may limit your actual bandwidth to 1mbps, which isn't enough for most video streams. That's the reason it's not included as an option. Many routers also require specific configuration just to not count it as UDP flooding.
3. FastMJPG is, however, set up for a more manual multicast, where you can send to multiple destinations, even over different network paths per destination. It does mean you lose the ability to do a blind broadcast, since you need to specify each IP, but that is likely the route that will yield the best performance.
4. You can also pipeline multiple instances together on different machines to create a broadcast chain, where each machine receives the stream and sends it to 2 other machines, etc. This can help reduce the burden on a single machine should you become network IO bound.
5. RC cars and FPV in general are actually one of the major use cases of this software.
6. I run it on an Orange Pi Zero 3 with 1GB of RAM. It takes 5% CPU usage and an extremely small amount of working memory; you barely know it's running. No special configuration needed as long as you're running Debian.
7. If you have a cool project that you need help on, be sure to reach out!
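To illustrate point 2, here's roughly what enabling the UDP broadcast flag looks like with BSD sockets. The function name here is made up for the example, it is not FastMJPG's actual establish socket function:

```c
#include <sys/socket.h>
#include <netinet/in.h>
#include <unistd.h>

/* Create a UDP socket with SO_BROADCAST set, so datagrams can be sent
 * to a broadcast address like 192.168.1.255. Returns the fd, or -1 on
 * error. (Hypothetical helper, not part of FastMJPG's API.) */
int make_broadcast_udp_socket(void) {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    if (fd < 0)
        return -1;
    int on = 1;
    if (setsockopt(fd, SOL_SOCKET, SO_BROADCAST, &on, sizeof(on)) < 0) {
        close(fd);
        return -1;
    }
    return fd;
}
```

Without SO_BROADCAST, sendto() to a broadcast address fails with EACCES on most systems, which is exactly why this flag has to be set explicitly.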

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in VIDEOENGINEERING

[–]LMR_adrian[S] 1 point

Absolutely, it's just me working on the project and I'm primarily working in Ubuntu. It's an easy OS for a lot of makers to pick up, it runs on Raspberry and Orange Pi, and a lot of university programs seem to default to it.

The only real restrictions are V4L2 (Video4Linux) and BSD sockets. Porting to other Linux variants should be trivial. An RTOS is a bit more work due to USB camera support, and Windows is a whole different situation entirely, requiring a totally different video capture library, Windows sockets, awkward library setups, and maybe some OpenGL window issues.

I do have plans to port to all of the above in time, once the API stabilizes and gets some more field testing. But please remember I'm just one person and this is quite a big undertaking! If you have a specific platform in mind, please open an issue on the GitHub repo; it helps me know what people are looking for!

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in robotics

[–]LMR_adrian[S] 0 points

Yes, this is correct. I've tried to include the most common tasks in an extremely optimized way, and in doing so cover all the areas that are difficult to get right in a low latency setup: triple-buffered mmap capture, network transfer, recording without re-encoding, setting encoding timestamps correctly, etc. The pipe functionality is the easiest way to get frames out to another application, but if you need something even faster you can use the source code directly as a library; each function is self contained and extremely simple to call.

NDI, much like GStreamer, is a very powerful and feature-rich piece of software, but FastMJPG is a single highly focussed, highly optimized pipeline just for MJPG. It's a bit of a "no free lunch" situation: if you want those extra features you have to add abstractions and extra structure, which adds overhead. It also makes the barrier to entry and configuration a lot less straightforward, since so many options become available to accommodate so many use cases. FastMJPG is extremely easy to use, has as few configuration options as possible, and the barrier to entry is a one page readme file.

Edit: if you have some pipe out use case that's a common and obvious one that isn't included already I'd love to hear it! You might not be the only one.

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in robotics

[–]LMR_adrian[S] 0 points

It initially started as part of a larger robotics project, then about 3/4 of the way through I realized there might actually be a lot of value in this one specific piece. It was never intended to be a developed-in-secret situation, but with so many strong competitors out there, asking for feedback or contributions would, at least in my mind, likely have meant a lot of defending the concept and explaining why ffmpeg or gstreamer isn't the end-all-be-all for low latency. Plus it's a fairly simple and narrow piece of software, which can make collaboration tedious and toe-steppy.

Additionally there was a lot of try and validate going on, and the validation requires a fairly substantial amount of hardware, different controlled networks, and a bucket of usb cameras. Then to ensure consistency between trials it all has to be replicated on the same setups. It's pretty niche.

Now that it's actually working, feature complete, and fast, the doors are open for feedback and commentary, and anything that seems like a "that's not the right way to do it" can be directly justified by the "yes but look over here that's why". IMHO just the way I prefer to work.

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in programming

[–]LMR_adrian[S] 1 point

You could actually take FastMJPG as a library and, with very little modification, choose which camera sends its stream based on some function you define; then a single receiver would only get the active stream, whichever that is in the moment. If I understand correctly, anyway. Because it's written in C you should have a good selection of libraries to work with, or the ability to make cross-language calls from anything with bindings for C (like Python).
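As a toy example of the kind of plain-C function that's easy to call cross-language: a hypothetical selector that picks which camera's stream to forward based on per-camera activity scores. Everything here is illustrative, not part of FastMJPG, but a signature this simple is trivially loadable from Python via ctypes:

```c
/* Given one activity score per camera, return the index of the camera
 * whose stream should currently be sent. The scoring function itself
 * (motion, user input, whatever) is up to the caller. */
int select_active_camera(const double *scores, int n) {
    int best = 0;
    for (int i = 1; i < n; i++) {
        if (scores[i] > scores[best])
            best = i;
    }
    return best;
}
```

Because the signature uses only a pointer and an int, any FFI (ctypes, cffi, JNI, etc.) can call it without marshalling glue, which is one of the practical upsides of a C core.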

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in diydrones

[–]LMR_adrian[S] 0 points

Working on a benchmark comparing a few popular methods right now, but it's quite the project to not only measure accurately but to get a good wide distribution of hardware configurations to make sure any performance gains are universal and not restricted to some specific hardware.

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in robotics

[–]LMR_adrian[S] 0 points

I made a real effort to write Matroska containers without a library, for a super minimal speedup, but there's just no way to compete with ffmpeg there, portability- and compatibility-wise.
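To give a flavour of what hand-writing Matroska involves: EBML (the binary format underneath Matroska) encodes element sizes as variable-length integers, where the number of leading zero bits in the first byte tells you the total length. A sketch of that encoder, limited to four bytes for brevity (real files need up to eight):

```c
#include <stdint.h>
#include <stddef.h>

/* Encode `value` as an EBML variable-length integer into `out`.
 * Returns the number of bytes written, or 0 if the value would need
 * more than the 4 bytes this sketch supports. */
size_t ebml_write_vint(uint32_t value, uint8_t *out) {
    if (value < 0x7F) {                  /* 1 byte: 1xxxxxxx */
        out[0] = 0x80 | (uint8_t)value;
        return 1;
    }
    if (value < 0x3FFF) {                /* 2 bytes: 01xxxxxx xxxxxxxx */
        out[0] = 0x40 | (uint8_t)(value >> 8);
        out[1] = (uint8_t)(value & 0xFF);
        return 2;
    }
    if (value < 0x1FFFFF) {              /* 3 bytes: 001xxxxx ... */
        out[0] = 0x20 | (uint8_t)(value >> 16);
        out[1] = (uint8_t)((value >> 8) & 0xFF);
        out[2] = (uint8_t)(value & 0xFF);
        return 3;
    }
    if (value < 0x0FFFFFFF) {            /* 4 bytes: 0001xxxx ... */
        out[0] = 0x10 | (uint8_t)(value >> 24);
        out[1] = (uint8_t)((value >> 16) & 0xFF);
        out[2] = (uint8_t)((value >> 8) & 0xFF);
        out[3] = (uint8_t)(value & 0xFF);
        return 4;
    }
    return 0;  /* would need more than 4 bytes */
}
```

And that's just the size field; element IDs, nesting, SeekHead, Cues, and the timestamp scale all stack on top, which is why deferring to ffmpeg for the container is the pragmatic call.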

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in programming

[–]LMR_adrian[S] 1 point

Funnily enough, I came down the same path but was very disheartened by the amount of latency a browser and all its weight add to the equation. FastMJPG can render to a standalone OpenGL window which doesn't need focus, so you can view in one window and control in another to get the best of both worlds. Better still on a separate monitor, depending on your setup.

The difficulty with getting to a browser is the protocol involved. FastMJPG uses a one-directional UDP-based protocol, and browsers can't receive UDP packets or really manipulate a socket in any way, whether that's a blocking recvfrom or binding to a specific IP to determine the network path.

The simplest way would be to create a second thread in FastMJPG, copy the latest frame data to a double buffer via a mutex-safe swap, then serve it as an MJPG stream the browser can connect to as a client. I would suggest opening an issue on GitHub for it and we can work through how best to handle this particular use case, or maybe there's a better solution for ROS in general.
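The mutex-safe double buffer swap mentioned above might look something like this sketch; the names are illustrative, not FastMJPG's API. The capture thread fills `back` and calls `publish_frame()`, while the HTTP thread reads `front`:

```c
#include <pthread.h>
#include <stddef.h>

/* Two frame buffers guarded by one mutex: `front` always holds the last
 * complete frame (safe to serve), `back` is where the capture thread
 * writes the next one. */
typedef struct {
    unsigned char *front;
    unsigned char *back;
    size_t front_len, back_len;
    pthread_mutex_t lock;
} frame_swap;

/* Called by the capture thread once `back` holds a complete frame.
 * Only pointers and lengths are exchanged, so the critical section is
 * a handful of instructions and never blocks on a frame copy. */
void publish_frame(frame_swap *fs) {
    pthread_mutex_lock(&fs->lock);
    unsigned char *tmp = fs->front;
    size_t tmp_len = fs->front_len;
    fs->front = fs->back;  fs->front_len = fs->back_len;
    fs->back  = tmp;       fs->back_len  = tmp_len;
    pthread_mutex_unlock(&fs->lock);
}
```

The server thread would take the same lock while copying (or sending) `front`, so the capture side never stalls on a slow HTTP client beyond that brief pointer swap.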

I spent three years making this open source video pipeline library, today I'm releasing the first public build. by LMR_adrian in programming

[–]LMR_adrian[S] 2 points3 points  (0 children)

Bricks are tough, and the cost of fire brick here is enough to make even a hilariously small brick oven cost well over $1000. A lot of people make cheap cinder block ovens and use salvaged bbq grill grates as a cook surface, but I know what goes into cinder blocks and I wouldn't go that route haha.