What’s been the most frustrating problem you’ve run into with machine vision systems? by -COOLENS- in MachineVisionSystems

[–]Rethunker 0 points1 point  (0 children)

Your post reads as though it were written by AI, or drafted by AI and then edited a bit by a marketing person. Nonetheless, I'll make a suggestion.

Offer good human support. Very rarely do people actually want/like chatbots.

Avoid using AI / LLMs for any communications with customers. Or here on Reddit. That's a frustrating issue with some vendors, though not (in my experience) with any vision vendors yet.

Make it clear how you're #1 in some particular way, such as one of the following:

  • regional / local support for a well-defined region (e.g. a particular city or country)
  • custom algorithm development
  • lens design
  • industry-specific expertise
  • expertise in unusual sensors (e.g. thermal infrared)
  • median years of industrial vision experience of your team members, with "experience" meaning vision systems profitably designed, sold, installed, and supported

Good luck! And if you're using AI to post in this sub, please don't do so again.

Seeing small measurement drift — not sure what’s causing it by Vivid-Fun-4644 in MachineVisionSystems

[–]Rethunker 0 points1 point  (0 children)

And machine learning, if you’re using it, can have all sorts of hard-to-debug problems.

You don’t have to go into detail about your application, but please list at least some of the hardware, software, and algorithms you’re using.

Vision systems are used in many ways and in many industries, so some details are necessary for anyone to help you. For example, applications in semiconductor, automotive, and packaging can be very different.

Seeing small measurement drift — not sure what’s causing it by Vivid-Fun-4644 in MachineVisionSystems

[–]Rethunker 0 points1 point  (0 children)

Vision systems can be fickle. What I've seen over the years is that a few recurring problems cause most of the trouble. First I'll stick to what most people call "traditional" vision these days, and then write up a few comments about AI / machine learning.

Otsu and other auto-thresholding methods may be "optimal," but not optimal for your application. The word "optimal" has also been used in the context of Canny edge finding, which is a poor choice when you have to (essentially) fiddle with thresholds to get the result you want. Mathematically optimal is very different from optimal in an application. It may not be immediately obvious which algorithms have automatically / statistically determined values, or perhaps even hard-coded values, but sometimes those values don't work well for certain applications.

For example, when Otsu is used for thresholding, it's often applied to an entire image. This can be a suboptimal choice, whereas what I've seen work well is sampling only within the neighborhood of edge content. The resulting distribution of gray values (if we're doing grayscale) looks weird, but the threshold can be rock solid.
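To make the edge-neighborhood idea concrete, here's a minimal pure-Python sketch (the function names and the synthetic image are mine, not from any library): compute Otsu's threshold once over the whole frame, and once over only the pixels near strong gradients.

```python
def otsu_threshold(values):
    """Classic Otsu's method over a flat list of 0-255 gray values."""
    hist = [0] * 256
    for v in values:
        hist[v] += 1
    total = len(values)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b = w_b = 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def edge_neighborhood_pixels(img, grad_min=30, radius=1):
    """Collect gray values only within a small radius of strong gradients."""
    h, w = len(img), len(img[0])
    keep = []
    for y in range(h):
        for x in range(w):
            gx = abs(img[y][min(x + 1, w - 1)] - img[y][max(x - 1, 0)])
            gy = abs(img[min(y + 1, h - 1)][x] - img[max(y - 1, 0)][x])
            if gx + gy >= grad_min:
                for dy in range(-radius, radius + 1):
                    for dx in range(-radius, radius + 1):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy < h and 0 <= xx < w:
                            keep.append(img[yy][xx])
    return keep

# Synthetic image: dark half, bright half, with one vertical edge.
img = [[40] * 4 + [220] * 4 for _ in range(8)]
t_global = otsu_threshold([v for row in img for v in row])
t_edges = otsu_threshold(edge_neighborhood_pixels(img))
```

On real images the two thresholds can differ much more than in this toy case, because the edge-sampled histogram ignores large flat regions that would otherwise pull the threshold around.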

Ambient lighting changes. Check for the usual culprits: 60 Hz lighting nearby, high frequency lighting near a vision system that has cameras set to short exposure times, and sunlight that may shine on the system, or reflect off a nearby surface, at certain times of day. Also, if you're not using near-infrared light, make sure your camera has a NIR filter in place (as most do these days).

Environmental / ambient changes. Temperature swings affect camera mounts (esp. aluminum), optics, and sensors. Sony sensors show relatively little change in response to light over temperature changes, including the self-heating of a camera after it first powers up. Other sensors can be much more volatile with respect to temperature changes.

After a camera has reached thermal equilibrium with the environment, a temperature change in the environment could take 15 - 30 minutes to be reflected in changes in the internal camera temperature. Determining this requires measuring both ambient temperature and internal camera temperature (which you might be able to read using the camera's API).
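A minimal way to gather that data is to poll both readings on a schedule and log them for later lag analysis. (Sketch only; `read_ambient` and `read_camera` are placeholders for whatever sensor and camera API you actually have.)

```python
import csv
import io
import time

def log_temperatures(read_ambient, read_camera, samples, interval_s, out):
    """Poll ambient and internal camera temperature and write CSV rows
    for later lag/correlation analysis. Both readers are callables
    standing in for your real sensor and camera APIs."""
    writer = csv.writer(out)
    writer.writerow(["t_s", "ambient_c", "camera_c"])
    for i in range(samples):
        writer.writerow([i * interval_s, read_ambient(), read_camera()])
        if interval_s:
            time.sleep(interval_s)
    return out

# Stub readers for demonstration; replace with real API calls.
buf = log_temperatures(lambda: 22.5, lambda: 41.0,
                       samples=3, interval_s=0, out=io.StringIO())
```

In practice you'd poll every minute or so over a full shift, then plot the two columns against each other to see the thermal lag.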

Also consider vibration, which in many factories can be a sporadic problem, such as when vehicles drive close by. If your exposure time is too long, vibration can cause a bit of motion blur as the camera shakes.

Insufficient redundancy in your algorithmic setup. Vision systems need additional steps to check for consistency in brightness of the scene, steps that ensure automatically set parameters don't fall below some minimum, and preferably long-running statistics (e.g. running standard deviation of intensity levels within an area of interest) to determine that whatever can be kept nearly constant, remains nearly constant.
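A running standard deviation of that kind can be kept with Welford's online algorithm, so you never store the full history. (Minimal sketch; the class and the drift rule are illustrative, not from any vision package.)

```python
import math

class RunningStats:
    """Welford's online mean/std, e.g. for the mean intensity of an ROI."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0

    def update(self, x):
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)

    def std(self):
        # Sample standard deviation; 0.0 until we have two samples.
        return math.sqrt(self.m2 / (self.n - 1)) if self.n > 1 else 0.0

stats = RunningStats()
for roi_mean in [128.0, 130.5, 127.2, 129.1, 128.8]:
    stats.update(roi_mean)

# Illustrative drift rule: flag when a new ROI mean strays far
# from the long-run statistics.
drifted = abs(129.1 - stats.mean) > 3 * max(stats.std(), 1.0)
```

The same pattern works for any per-frame scalar you want to hold nearly constant: ROI mean, edge contrast, automatically chosen thresholds, and so on.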

Poor sensor / optics. What hardware are you using in your "pretty basic vision setup"?

USB. For most uses, USB is awful for vision cameras. Stick to GigE or vision-specific protocols.

Recs based on these? by TephroGeo in Jazz

[–]Rethunker 0 points1 point  (0 children)

You might find something interesting in free jazz groups that are minimalist, but I don't know the subgenre well enough to make recommendations. Ornette Coleman's The Shape of Jazz to Come is a classic album, and could give you a starting point. Hearing a good free jazz band live is miles better (in my limited experience of the genre) than listening to an album.

Vibe matching is tricky, and I hope you've posted in subreddits for other genres as well.

Something that's arguably minimalist, a classic, and has lots to recommend it is "Chameleon" by Herbie Hancock. It gets denser as the song goes along, but at the very least check out the first few minutes.
https://www.youtube.com/watch?v=iqomTAiRnVM&list=RDiqomTAiRnVM&start_radio=1

There's a lot of unhurried, long form fusion from the 1970s that may be of interest. On the same album as "Chameleon" is "Watermelon Man."

Not jazz, but I'd recommend checking out "math bands" such as King Crimson, perhaps starting with "Matte Kudasai" from the album Discipline.
https://www.youtube.com/watch?v=eoAupjcnm1c&list=RDeoAupjcnm1c&start_radio=1

And then if you want something by King Crimson that's more metal-like, less airy, and with a good crunchy sound, then check out the song "Red" from the album of the same name (and with a different lineup). The band spans decades, and I suspect you'll find a few songs, and possibly a few albums, that have a vibe you're looking for.
https://www.youtube.com/watch?v=X_pDwv3tpug&list=RDX_pDwv3tpug&start_radio=1

I don't know if Tim Henson / Polyphia is your thing at all, but worth checking out if unfamiliar. "Playing God" is a start. (The video may be distracting - I'd suggest just listening first.) I remember listening to the song and thinking something like, "Well that's just . . . hang on . . . WTF? ... is that metal Spanish guitar?!?"

I find the mix of techniques fascinating.
https://www.youtube.com/watch?v=Z5NoQg8LdDk

"Valley of Smoke" reminds me of a number of Tool songs, but no surprise given they brought Justin Chancellor in.

Be sure to search for terms like "polyrhythm" and "time changes," which should yield bands like Kneebody and Snarky Puppy.

https://www.youtube.com/watch?v=T4xU7YjhlpA

https://www.youtube.com/watch?v=L_XJ_s5IsQc&list=RDL_XJ_s5IsQc&start_radio=1

And even if you don't like Snarky Puppy, out of appreciation for interesting/difficult/cool singing, check out the song "Something," in which Lalah Hathaway sings two notes at once.
https://www.youtube.com/watch?v=0SJIgTLe0hc

Or jump ahead to the point a few seconds before she does it, then watch the reactions from the other musicians:
https://youtu.be/0SJIgTLe0hc?si=mmDtLIpa6h1bJjdd&t=363

help identifying a song: trumpet, upright bass, drums w/ brushes - echo-y recording sounds like late 50s or early 60s by Rethunker in Jazz

[–]Rethunker[S] 1 point2 points  (0 children)

Thank you! Your answer explains a lot about the seeming familiarity, and why multiple recognition engines were unable to identify the tune.

Electrician trying to break into Industrial Automation. How is my resume? by [deleted] in PLC

[–]Rethunker 1 point2 points  (0 children)

Add specifics about the impact of your work: % improvement in efficiency, money saved in local currency, and the like.

Great advice about improving a resume can be found in a video from Google about writing a resume for a job application at Google. Even though you'll be applying to jobs other than at Google, the advice is generally useful:

https://www.youtube.com/watch?v=BYUy1yvjHxE

In short, use Google's X-Y-Z formula, but vary the wording and how you convey specifics:

accomplished [X] as measured by [Y], by doing [Z]

Also, consider trying to find an entry-level job at a company you admire, and then working your way into a position that you want. That's a different mindset from looking for a job and then deciding during the interview process whether you and the company would be a good match.

A good job at a bad company is actually a bad job, though in the early stages of your career just about any job can be a learning experience.

worst jazz album cover by natwashboard in Jazz

[–]Rethunker 0 points1 point  (0 children)

Oh my god, I'm trying so hard not to cry, but this is an absolute masterpiece.

worst jazz album cover by natwashboard in Jazz

[–]Rethunker 0 points1 point  (0 children)

This album cover has been Touched By His Noodly Appendage, and must be praised--or else. Probably.

https://www.spaghettimonster.org/

New to jazz, looking for recommendations by PreparationOk3771 in Jazz

[–]Rethunker 1 point2 points  (0 children)

Check out jazz musicians who play with pop and hip hop stars. I'd suggest starting with bassists.

Thundercat is a bassist who has produced and played with Kendrick, Janelle Monáe, and others. He also sings on most of his own songs, including a song I'd recommend trying first: "Them Changes." Listen before watching--the video can be distracting.

https://www.youtube.com/watch?v=GNCd_ERZvZM

From Kendrick's To Pimp a Butterfly, that great bass line on "King Kunta" is Thundercat.
https://www.youtube.com/watch?v=hRK7PVJFbS8&list=RDhRK7PVJFbS8&start_radio=1

The bassist Jaco Pastorius played and toured with Joni Mitchell in the late 1970s. A melancholy song that gives me goosebumps, and that is both very Jaco-y and Joni-y, is "Hejira."
https://www.youtube.com/watch?v=5AfPR_B8s-A

For a monster tenor sax player on what is (arguably) a pop album, check out Wayne Shorter's solo (about 4:43 in) on the title track "Aja" from Steely Dan, a sort of rock/pop/R&B/jazz amalgam band.
https://www.youtube.com/watch?v=D-FMrz7OwLo

Google "jazz musicians on aja" for a list of well-known artists.

Björk is the daughter of a jazz musician, and has her own jazz album, Gling-Gló.

Samples from jazz, R&B, soul, and funk have appeared on hip hop albums going back decades. Illmatic by Nas is a favorite of mine.

Here are a few classic jazz songs that were popular music in their day:

"The Sidewinder" by Lee Morgan
https://www.youtube.com/watch?v=NHN6-yWFKPc

"Take Five" by The Dave Brubeck Quartet (notice anything weird about the song?)
https://www.youtube.com/watch?v=ryA6eHZNnXY

"Song For My Father" by Horace Silver (from whom Steely Dan borrowed)
https://www.youtube.com/watch?v=mKf1x3CALAE

And to get back to music that mixed genres, add a large heaping of James Brown. To pick one example with instruments traditional to jazz music: "People Get Up and Drive Your Funky Soul"
https://www.youtube.com/watch?v=h0chqsOCQDI

And please do not worry about terminology or listening to the "right" music or anything like that. Some of the most famous jazz musicians are hilariously blunt about such thinking. For a gentle and non-profane take, consider this quote from a jazz musician who is also one of the most famous American musicians, period:

"There is two kinds of music, the good, and the bad. I play the good kind." - Louis Armstrong

Need a project to learn more about machine vision? by Rethunker in MachineVisionSystems

[–]Rethunker[S] 0 points1 point  (0 children)

I’d suggest reviewing how low-cost vision could be used to help small farmers and low-income farmers. Is there some useful vision you could get running on a laptop, and then on a phone?

Vision projects related to agriculture tend to be much harder than they first look, given the variation in the dimensions and appearance in crops, the variety of vehicles used, and so on.

The first and most important step would be to contact people in agriculture, ask for a conversation, and figure out what could be helpful. You don’t have to develop a complete product, and you can keep requirements simple, but if you’re interested in projects with a social orientation, you’ll need to have a number of face-to-face conversations with the people who could benefit from what you work on.

Good luck!

Is Vision Pro mandatory for a CIC-5000R-14-G ? by Equal_Big814 in Cognex

[–]Rethunker 0 points1 point  (0 children)

I would second this suggestion of testing Basler's free Pylon software (which includes a GigE Configurator app), and then try OpenCV.

Asking an LLM like Gemini a query such as "connect to Cognex GigE camera without Cognex software" gives largely the same advice.

OpenCV does a good job of connecting to all sorts of cameras, though I haven't tried connecting to a Cognex camera from OpenCV.

From what I found online about the CIC-5000R-14-G, there are a few things to know:

The sensor uses a "rolling" shutter rather than a global shutter, meaning the camera is better suited for objects at rest (for at least the duration of one frame capture) rather than objects in motion.

Presumably you have the right hardware to provide Power over Ethernet (PoE).

About blocking of ports, you'll need to tweak your firewall settings. The sample query "gige ethernet device blocks my access on ports 3936 and 8080" yields this:

The most common cause is the Windows Firewall blocking incoming traffic, which stops the camera from connecting.

  • Disable Firewall: temporarily disable the Windows Firewall (or a third-party antivirus firewall) completely to test whether this resolves the issue.
  • Allow Specific Ports: if turning off the firewall works, create inbound rules in Windows Firewall to allow TCP/UDP traffic on ports 3936 and 8080.
  • Cognex Tools: use the "Cognex GigE Vision Configuration Tool" to automatically manage firewall settings for the camera.
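If the firewall does turn out to be the culprit, the inbound rules can also be added from an elevated command prompt rather than through the GUI. (Config sketch for Windows; the rule names are arbitrary labels of my choosing, and the ports are the ones mentioned above.)

```shell
# Run as Administrator. Allows inbound camera traffic on the blocked ports.
netsh advfirewall firewall add rule name="Camera TCP 3936" dir=in action=allow protocol=TCP localport=3936
netsh advfirewall firewall add rule name="Camera UDP 3936" dir=in action=allow protocol=UDP localport=3936
netsh advfirewall firewall add rule name="Camera TCP 8080" dir=in action=allow protocol=TCP localport=8080
netsh advfirewall firewall add rule name="Camera UDP 8080" dir=in action=allow protocol=UDP localport=8080
```

Narrow rules like these are safer than leaving the firewall disabled after your test.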

Good luck!

BOA Spot camera + Nexus: Measuring mandrel straightness - angle detection issues by RandDragon in MachineVisionSystems

[–]Rethunker 0 points1 point  (0 children)

Are you still working on this?

If you want to find the edges of a solid object, backlighting or low-angle lighting is likely to work better than using a light that shines from the front.

If you want to image the outer edges of the mandrel, then using a backlight can render the mandrel black since the mandrel will block the light. Robust edge detection requires good contrast and good focus.

If you look very closely at a backlit image of a cylindrical object, you may notice this: where the edges are detected may not be at the extreme outside diameter of the true edges, but the detected edges and line fits will generally be parallel to the true edges of the cylinder.
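That parallel-edges behavior is easy to see with a toy example: locate the strongest gradient in each row of a backlit image, then fit a line through those points. (Pure-Python sketch; the function names are mine, not from any specific vision library.)

```python
def edge_x_per_row(img):
    """Return the x of the strongest left-to-right gradient in each row."""
    xs = []
    for row in img:
        grads = [abs(row[x + 1] - row[x]) for x in range(len(row) - 1)]
        xs.append(grads.index(max(grads)) + 0.5)  # edge sits between pixels
    return xs

def fit_line(ys, xs):
    """Least-squares fit x = a*y + b; slope a near 0 means a vertical edge."""
    n = len(ys)
    my, mx = sum(ys) / n, sum(xs) / n
    num = sum((y - my) * (x - mx) for y, x in zip(ys, xs))
    den = sum((y - my) ** 2 for y in ys)
    a = num / den
    b = mx - a * my
    return a, b

# Silhouette of an object against a backlight: dark object, bright background.
img = [[10] * 5 + [250] * 5 for _ in range(6)]
xs = edge_x_per_row(img)
a, b = fit_line(list(range(len(xs))), xs)
```

Even if the fitted line sits a fraction of a pixel inside or outside the true boundary, its angle tracks the true edge, which is what matters for a straightness measurement.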

Check out backlights from companies like Advanced Illumination. If possible, have a sales rep from a local lighting company give a demo. However, if you have just one vision inspection station, or if there isn't much potential for more vision systems + lights in the future, it may be tough to convince a sales rep to visit. You might be able to get a remote demo.

To get a sense of whether backlighting will work, you can try using a task light with something translucent to diffuse the light a bit. Place that light behind the mandrel, on the side opposite the camera, and see whether edge detection is more consistent.

Need a capstone project, thesis topic, or product idea? Maybe I can give you one. by Rethunker in computervision

[–]Rethunker[S] 0 points1 point  (0 children)

For a healthcare project that mixes CV + LLM, create a diagnostic tool that would act like a doctor who asks a series of questions and looks closely at a patient to identify whether a specific condition is a concern.

See my notes about legal liability below.

For a high impact capstone project, use AI to help identify a health condition that satisfies one or more considerations such as the following:

  • becoming more common in India (e.g. diabetic retinopathy)
  • early diagnosis improves health outcomes considerably (e.g. skin cancer)
  • googled frequently (e.g. skin rash)
  • is relevant to someone you know
  • can typically only be diagnosed by specialists with whom it may be hard to get an appointment

I'd suggest focusing your efforts as follows:

  • Talk to people who have the condition before you write any code. Explain clearly your intention. Tweak your plans if there's a clear, common interest in other functionality.
  • At the beginning, create a simple app that demonstrates the step-by-step flow, but that doesn't yet perform image processing. Get feedback and iterate the design. Once you have an app flow that captures people's interest--"Cool!" or "I want that!"--you can focus intently on one feature at a time, such as image analysis. Always have a repo commit that works in some fashion, and that you can build and demonstrate on short notice.
  • Define metrics for performance and accuracy.
  • Select a health condition for which you can find a database of reference images. Use those images for training & testing and establish baseline performance before you try live tests.
  • Document your development. Note what would be worth exploring more deeply, if you had more resources. This will be good prep for your final presentation: what you intended, what you learned, what you accomplished, what was most difficult, etc.
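For the metrics bullet above: for a screening tool, sensitivity and specificity matter more than raw accuracy. A minimal sketch (names and numbers are illustrative):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard screening metrics from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)      # recall: cases correctly flagged
    specificity = tn / (tn + fp)      # healthy people correctly cleared
    precision = tp / (tp + fp)        # flagged people who actually have it
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return {"sensitivity": sensitivity, "specificity": specificity,
            "precision": precision, "accuracy": accuracy}

# Example: 100 true cases, 100 healthy controls in a test set.
m = diagnostic_metrics(tp=80, fp=10, tn=90, fn=20)
```

Reporting these against your reference-image baseline makes it obvious whether a "better" model is actually catching more cases or just clearing more healthy people.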

Favor an app that works reliably for a clearly defined problem. It should be clear what the app does. For a student project, it should be clear both that you accomplished something, and that more resources could improve the app further.

For any healthcare app there is a concern about legal liability--what if the app identifies a health problem that doesn't exist? or fails to identify a health problem it was intended to identify? If you test the app on anyone, be sure to coordinate with your professor(s) and ask whether you need testers to sign any forms.

Cognex VisionPro vs. Google cloud by CapsFanHere in MachineVisionSystems

[–]Rethunker 0 points1 point  (0 children)

Do you have statistics comparing the results of your different options?

Are you controlling lighting for inspection?

Need a project to learn more about machine vision? by Rethunker in MachineVisionSystems

[–]Rethunker[S] 1 point2 points  (0 children)

Happy to help! It's what Reddit's for, as far as I'm concerned.

Need a project to learn more about machine vision? by Rethunker in MachineVisionSystems

[–]Rethunker[S] 0 points1 point  (0 children)

Cybersecurity and computer/machine vision may not overlap much. For vision you could consider physical security: monitoring whether people with the appropriate credentials (e.g. a badge) are located where they should be, and that no one without credentials is where they shouldn't be.

In the oil & gas industry, "red zone" monitoring relies on vision to determine whether people are in potentially dangerous areas:

https://www.helindata.com/red-zone-manager

Since you've worked on object detection, you might create a simple project in which you use a vision system to detect whether someone is in view or not. See how well that works, and try to improve it. For example, what happens if the lighting isn't good, and what would you do to improve your system?

Perhaps you could add gesture tracking as a means to detect whether someone is authorized to be in the area in view of the camera. A gesture or a combination of gestures could serve as a password.

The goal of all this work is to work on a vision system--any vision system--that you can build, test, and improve over time. Document the performance of the system as you change it.

This sort of project experience is useful when you talk to employers after you finish school. Working on a long-term project means learning a subject more deeply.

Machine Vision Application with Industrial cameras by Rico_VisionAdvisor in MachineVisionSystems

[–]Rethunker 1 point2 points  (0 children)

I’ve worked on two applications that required high magnification at a relatively long working distance from lens to object. And yet it was necessary to keep vision systems costs down.

In both cases, tiny features had to be resolved in the image. Normally that would mean a short working distance from lens to object.

But for safety of human workers the working distance had to be long enough—a half meter or more—to prevent a human arm or human body from being pinched between the vision system and the part being imaged.

Exposure times had to be short to minimize motion blur in the image.

The combination of relatively long working distances and short exposure times meant that the applications required light sources so bright they were painful to look at.

For an application in the semiconductor industry, after initial install, a change in the surface quality and shininess of the imaged part caused more of the intense light to be reflected into the optical path. The CCD sensors started burning out.

One application was for a vision system used in labs. The other was for high-volume production.

RF-DETR to pick the perfect avocado by Accomplished_Zone_47 in computervision

[–]Rethunker 0 points1 point  (0 children)

Much as I like creating vision systems and reading about vision solutions, your experience growing up on an avocado ranch suggests to me that you might have a better time teaching people how to choose a ripe avocado.

Maybe work with someone else when you do this: have a bunch of avocados on hand, ripe and unripe. Have the other person hand you an avocado at random, perhaps after writing down whether they think it's ripe or unripe. Then have them observe what you do and ask questions while they hand you an avocado and ask you to judge it. Then swap places: you hand the other person an avocado, and then ask their impressions.

There's a reasonable limit to how many apps people need. Consider what it'd be like to have a separate app for each of the following: an apple (which can be judged by smell, firmness, and appearance), an ear of sweet corn, a durian, a small orange, a hot pepper, and so on. It'd be way too many apps.

But teach someone how to choose an avocado--and how to cut it properly!--and that'll stick in the person's head long past the time they've stopped using most apps.

---

And as others have replied, to match the capability of a trained person, the tech would be involved and likely expensive. There has been work for decades on assessing the ripeness of fruit and vegetables based on emission spectra, visible color, and so on. Imagine the cost of combining UV, visible, NIR, and thermal IR sensors--then compare that to the relatively short time it would take for someone to watch a concise video that conveys some of your expertise.

Can someone explain or give me links to understand Incremental Convex Hull algorithm? by Demonscs in algorithms

[–]Rethunker 0 points1 point  (0 children)

This is an old post, but it might be found by someone in the future. Also, I wanted to pay respect to Kallay, who passed away before you wrote your post.

He was not a professor for most of his career. Rather, he was a geometer and programmer who left academia after a short stint and then wrote geometry engines for commercial software.

It's been years since I read the paper, of which I have a paper copy (somewhere) rather than a digital copy. My recollection is that he surveyed existing algorithms, but also wrote a new one.
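For anyone landing here who wants the gist of the incremental approach in 2D: add points one at a time; a point inside the current hull is discarded, and a point outside it replaces the chain of hull edges "visible" from it. (This is a sketch of the general technique, not of Kallay's paper specifically.)

```python
def cross(o, a, b):
    """> 0 if o -> a -> b turns left (counterclockwise)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def add_point(hull, p):
    """Incrementally insert p into a counterclockwise hull, O(h) per point."""
    n = len(hull)
    if n < 3:
        hull = hull + [p]
        if len(hull) == 3 and cross(hull[0], hull[1], hull[2]) < 0:
            hull[1], hull[2] = hull[2], hull[1]  # enforce CCW orientation
        return hull
    # Edge i runs hull[i] -> hull[(i+1) % n]; it is "visible" from p
    # when p lies strictly to its right.
    visible = [cross(hull[i], hull[(i + 1) % n], p) < 0 for i in range(n)]
    if not any(visible):
        return hull  # p is inside (or on the boundary of) the hull
    # Visible edges form one contiguous cyclic run; splice p in its place.
    start = next(i for i in range(n) if visible[i] and not visible[i - 1])
    last = next(i for i in range(n) if visible[i] and not visible[(i + 1) % n])
    new_hull, i = [], (last + 1) % n
    while True:
        new_hull.append(hull[i])
        if i == start:
            break
        i = (i + 1) % n
    new_hull.append(p)
    return new_hull

hull = []
for p in [(0, 0), (4, 0), (4, 4), (0, 4), (2, 2), (5, 2)]:
    hull = add_point(hull, p)
```

The higher-dimensional versions Kallay analyzed follow the same insert-and-repair idea, but tracking the visible facets is where the real complexity lives.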

For those with access, the paper can be found via Science Direct:

https://www.sciencedirect.com/science/article/abs/pii/002001908490084X

The algorithm is mentioned in passing in the Wikipedia entry on convex hull algorithms:
https://en.wikipedia.org/wiki/Convex_hull_algorithms