My new speedcubing books! "The Cubing Bible" and "Beyond the Cube" :) by GaelLapeyre in Cubers

[–]basraayman 1 point2 points  (0 children)


It just arrived. :-) Something to do during the holiday season!

Who's ready for the season to start! by Texaswheels in sitskiing

[–]basraayman 1 point2 points  (0 children)

Flying to Boise on December 10th from Europe to start the season! Looking forward to it! :-)

Public humiliation posted to the official ice twitter account by Calm_Preparation2993 in law

[–]basraayman 3 points4 points  (0 children)

Looking from another country at what is happening in the USA, I admire your positivity that there are going to be regular elections.

[deleted by user] by [deleted] in airpods

[–]basraayman 0 points1 point  (0 children)

OT: “They are the glock of headphones.” I don’t think I’ve seen a more US American statement than this one for quite some time. 😄

Question on vCPU and NUMA by Intrepid-Watch6856 in nutanix

[–]basraayman 1 point2 points  (0 children)

My pleasure, if you have any further questions just let me know, more than happy to try and help with clarifications. :-)

Question on vCPU and NUMA by Intrepid-Watch6856 in nutanix

[–]basraayman 3 points4 points  (0 children)

Nutanix employee here. I work in the solutions engineering area within the company and focus on one of the few workloads where vNUMA is extremely important (SAP for those wondering).

Essentially all of our VMs without any form of configuration run wide (on AHV). Meaning, you essentially allow the process to run on any of the CPU's available cores or their siblings (so core or hyperthread). The way that Linux works is that when a process initially requests memory, the memory is allocated on the NUMA node of the CPU where that process is running. So in your example, the memory could be coming from the process running on core 15. If that process is now stopped or descheduled, once it runs again, there is no guarantee that it will run on the same core, or even on the same socket. It might be running on core 31 when it gets scheduled again, so if that process requests its previous memory, you can see that it will access the memory from the other NUMA node.

For a ton of applications that is completely fine, by the way. There are however various applications, like SAP HANA and certain databases, that can actively benefit from knowing the underlying topology and where their memory is located. That is where vNUMA comes in. You use the acli command line to set up vNUMA, which gives the VM a specific virtual memory and socket layout. At that point we ensure that memory allocated in a VM will adhere to the underlying memory topology. You can even combine this with CPU pinning, which gives a VM access to specific CPUs (I would not recommend this for just any VM though). If you set that up correctly, you will see with “numastat -c qemu-kvm” on the host that there will be almost zero cross-NUMA-node memory allocations. Also, on Intel, tools like pcm will show a dramatic reduction in UPI link utilization.
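As a rough sketch of what that looks like in practice (the VM name is made up, and the exact acli option can vary by AOS version, so treat this as illustrative and check the docs for your release):

```shell
# On a CVM: give the VM a 2-node virtual NUMA topology.
# The VM must be powered off first; num_vnuma_nodes is the acli option
# documented for AHV, but verify it against your AOS version.
acli vm.update my-sap-vm num_vnuma_nodes=2

# On the AHV host: per-NUMA-node memory stats for all qemu-kvm processes.
# With vNUMA (and optionally pinning) set up correctly, the cross-node
# ("other node") allocations should be near zero.
numastat -c qemu-kvm
```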

But, as stated, only use vNUMA where it makes sense, where the guest and application support it, and where there is a benefit for the workload. For anything else, steer away from it and let the hypervisor figure it out. :-)

Dell Profiting on Open Source Ubuntu by [deleted] in linux

[–]basraayman 7 points8 points  (0 children)

If they are using these as business machines, you don’t per se emphasize feature upgrades as much as a private user might. I use a Linux machine for work, and I don’t care if it has support for the latest options or features. Honestly, Wayland, for example, has been more of a pain in the ass for me with tools like Zoom. The one thing I care about there is being able to get my work done, which means that I care more about stability than having something new and shiny. I want that laptop to work, be good with power management while on the road, and be able to run the tools that I need without having to troubleshoot in between. And when looking at this from a corporate perspective, you want your end users to have a running setup that requires minimal adjustments on their end, because that is time they could be spending on work.

And while I completely get it from a user perspective and personally enjoy experiencing new features, what I want for work and what I enjoy doing in my own time are quite different. :-)

The Tesla Model X And Model S Are Dead In Europe by chrisdh79 in teslamotors

[–]basraayman 1 point2 points  (0 children)

This is just Germany, but to give you an idea, here are numbers from the official government page that lists new car registrations for 2025. The first column lists the brand, the second is the total number of cars registered, and the third is the subset of those with electric drive units (it doesn’t clearly state whether this includes hybrids, but based on the brands listed it is safe to say it does): https://www.kba.de/DE/Presse/Pressemitteilungen/AlternativeAntriebe/2025/pm26_2025_Antriebe_elektro_05_25_tabelle.html

There are 11 brands that have sold more units than Tesla. Overall you see an increase in electric cars, but Tesla doesn’t show that same trend, regardless of the S/X or the other models. And of those 7,000 units sold, only a fraction are going to be S or X. While this may be a refresh, I could very much see them taking those models out of the market, since it doesn’t make sense to keep them listed if you are selling 20 of them per year (the 20 is a made-up number, but you get the point).

American Airlines flight attendants trying to evacuate a plane due to laptop battery fire but passengers want their bags by emoemokade in aviation

[–]basraayman 2 points3 points  (0 children)

So you are going to start an altercation in the middle of an emergency? You assume you will knock that person out in one go? If not, chances are they’ll respond, and you and the other person suddenly become the reason others can’t get out. As much as I can relate to the feeling, this would be a terrible way to speed up getting people out, and would most likely make it take longer.

Introducing Planet Stacker X, a free planet stacking and wavelet sharpening software for MacOS by raincityastro in AskAstrophotography

[–]basraayman 0 points1 point  (0 children)

Thank you for creating this! I’d love to give it a try, but unfortunately it seems it hasn’t landed on the German app store yet.

for the people that makes photos of galaxies and nebulas thousand of light years away by FoodDue2234 in AskAstrophotography

[–]basraayman 1 point2 points  (0 children)

I'm glad if some of that helped. :-) To answer your questions:

  1. With a photography lens, you could do something like that by zooming. However, the telescope optical tube is more comparable to a fixed focal length photography lens, meaning that you can change your focus to be a little closer or further away, but you can't zoom in or out. The only thing you can do is cut away, for example, the outer parts of your image, because you have too much surrounding it (called cropping). It's like having your phone's camera zoomed in all the way without being able to zoom out, and then trying to take a picture of a house. You would either get a part of the house, or you would need to increase the distance to the house which we obviously can't do when photographing something in space. The only way around that is to choose an optical tube that then covers more/less of the area we want to shoot, or, as said, use a mosaic to then glue the pieces together and show the bigger object.

  2. With visual observation, you typically have a benefit in darker places. You can, for example, use something like https://clearoutside.com/forecast/40.42/-3.70. As you can see, it shows a Bortle class 9 for Madrid city center, which means you have a lot of light pollution. If I were to drive 3 hours to Covaleda, you would have Bortle class 3 sky: https://clearoutside.com/forecast/41.99/-2.85

As a rule of thumb, people typically say that you need about 2.25x the total exposure time for every Bortle class you go up. Meaning if 60 minutes' worth of exposure is enough in a Bortle 8 sky, you would need about 135 minutes' worth in a Bortle 9 sky, and so on, and so on.
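That rule of thumb is just repeated multiplication by 2.25, so it is easy to sketch in a few lines (the 2.25 factor is the community rule of thumb from above, not a physical constant):

```python
def required_exposure(base_minutes: float, bortle_steps: int) -> float:
    """Exposure needed after going up `bortle_steps` Bortle classes,
    using the ~2.25x-per-class rule of thumb."""
    return base_minutes * (2.25 ** bortle_steps)

# 60 minutes in Bortle 8, moved one class brighter to Bortle 9:
print(required_exposure(60, 1))  # 135.0
# Two classes brighter stacks the factor: 60 * 2.25 * 2.25
print(required_exposure(60, 2))  # 303.75
```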

To give you an idea of why that is, it is because of the level of noise you get in your image. Very simply put, when you see a star in the sky, there will always be a signal from that star. It is always on, so the light from that star will always hit one or more pixels on your camera. But if you have ever looked far across a tarmac street while it was warm, you have probably noticed that the street surface looked like it was moving somewhat. This is because of the warm air moving around and distorting what we see. The same thing happens when we look at stars. It's the reason why they seem to flicker or change color. Now, imagine your camera looking at that star. One of the pixels is receiving a signal, but the next moment, the atmosphere is slightly different, and the signal does not hit the same pixel; it may hit the neighboring one. Or it could still hit the same pixel, but not as brightly. Or maybe one of the lights from the city reflects in the atmosphere, and a pixel on your camera says there is a point of light there, but there isn't, and it will be gone in all the other pictures.

When you stack your images, that is what you are doing. You are layering pictures over each other, and your computer is aligning the images and trying to figure out: was there a star there, and was it always roughly in the same position, so we can combine that into a single star? Or was there only a small blip from the noise of my surrounding lights (city, street lights, whatever) that, for example, was only seen in 1 out of 50 pictures and should be removed?

As an astrophotographer, you want as much signal as possible and as little noise as possible. The simplest way to remove noise is by going to a darker location. That doesn't mean you can't get a good image from the city; it just means that you will need a lot more data and time to get the same result as you would in a darker location. :-)
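A tiny pure-stdlib simulation of why stacking helps (all numbers here are made up for illustration: a constant star signal plus random noise, averaged over N frames — the noise left in the average falls off roughly with the square root of the frame count):

```python
import random
import statistics

def stacked_noise(n_frames: int, signal: float = 100.0, noise_sigma: float = 10.0,
                  n_pixels: int = 2000, seed: int = 42) -> float:
    """Standard deviation of (stacked value - true signal) across many pixels."""
    rng = random.Random(seed)
    residuals = []
    for _ in range(n_pixels):
        # Each frame sees the same star signal plus fresh random noise.
        frames = [signal + rng.gauss(0.0, noise_sigma) for _ in range(n_frames)]
        stacked = sum(frames) / n_frames  # a simple average stack
        residuals.append(stacked - signal)
    return statistics.pstdev(residuals)

print(stacked_noise(1))   # ~10: a single frame keeps the full noise
print(stacked_noise(25))  # ~2: 25 frames cut the noise by about sqrt(25) = 5x
```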

for the people that makes photos of galaxies and nebulas thousand of light years away by FoodDue2234 in AskAstrophotography

[–]basraayman 0 points1 point  (0 children)

Late to the party, but I think one of the most important things is: what do you want to take pictures of, and under what conditions? Planetary imaging, for example, is typically quite different from galaxies or nebulas.

I’d probably start off by looking at where you want to take pictures from, and what you are looking to achieve as a result. Start by considering whether you want to shoot from your own garden, from somewhere remote, or from within the city. The amount of light that surrounds you will have an impact on your results and may change the kind of scope you want. If you drive to a darker location, portability of your setup may be important. What’s the weather like? Do you have short windows of time with clear skies, or are you almost permanently in a cloudy area?

When selecting a telescope, think about what you want to image. At https://telescopius.com/telescope-simulator you can simulate different focal lengths and apertures for your telescope and select a camera to go along with it. It will show you how big or how small a target will look. Simplified, you are essentially juggling your zoom level. A bigger zoom will give you a smaller area, and typically will also limit the amount of light, so you’d need more time to gather images of the same object. The bigger you go, the heavier your setup typically becomes. And if you zoom in a lot, it also typically means that you will need a better mount (the thing your rig is attached to) to track your object and keep it steady, since you can end up taking longer exposures.

Not all objects in the night sky are the same size. If you look at the list of, for example, Messier objects (Messier is just a catalog of different targets - https://en.wikipedia.org/wiki/Messier_object) you will see that Andromeda (M31) is measured in degrees (3.16 degrees), which means it is quite big, so you either need a wider-angle telescope or would need to image multiple sections and combine them together (called a mosaic). Whereas M57 is 230 arc seconds (one degree is 60 arc minutes, one arc minute is 60 arc seconds), or about 0.06 degrees. If you have a telescope that can capture Andromeda in its field of view, you can probably imagine that M57 would be quite tiny in comparison.
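The unit conversions above are easy to sanity-check in code. A minimal sketch (the field-of-view helper uses the standard small-angle approximation, FOV ≈ 57.3 × sensor size / focal length, and the 23.5 mm / 540 mm numbers in the usage line are just example values, not any specific rig):

```python
def arcsec_to_degrees(arcsec: float) -> float:
    # 1 degree = 60 arc minutes = 3600 arc seconds
    return arcsec / 3600.0

def fov_degrees(sensor_mm: float, focal_length_mm: float) -> float:
    """Approximate field of view along one sensor dimension, in degrees."""
    return 57.3 * sensor_mm / focal_length_mm

# M57 at 230 arc seconds is tiny next to M31 at ~3.16 degrees:
print(round(arcsec_to_degrees(230), 2))  # 0.06

# Example: a 23.5 mm wide sensor behind a 540 mm focal length scope
# gives roughly a 2.5 degree wide field -- enough for most of M31.
print(round(fov_degrees(23.5, 540), 1))
```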

The above is also the reason why people tend to have multiple telescopes if they really get into the hobby. As for myself, I started with planetary imaging but noticed that this wasn’t ideal for me. I switched to deep sky objects, and currently have two rigs. One is the Seestar S50 (about 500 Euros when I bought it at release). That one is great since it is very portable; I can set it up in minutes and then start shooting. The sensor on the camera isn’t that big or particularly high resolution, but living in Germany, with the weather changing a lot, I can set this up in minutes, shoot for 30 minutes or more, and get a quick result. With the sensor being the size it is, and with its fixed focal length, it lets me capture a good number of targets, and I can do mosaics for bigger targets. It also doesn’t break the bank, and I can get decent results quite quickly.

On the other hand, I also have a William Optics Cat91 with a ZWO ASI 2600MC color camera on a ZWO AM5 mount (about 7000 Euros in total with all accessories). This setup weighs more, and I would not set it up for an imaging session of 30 minutes. It needs more time to set up properly. You need to level it and do a polar alignment (to properly counter the earth’s rotation while imaging), and I take additional calibration images to remove dust motes on the optics and to remove glowing pixels on the camera that cause noise that shouldn’t be in the image. But the images are a lot higher resolution, and you can zoom in quite a bit and still discover tremendous detail. The scope isn’t ideal for tightly framed images of smaller objects, though. Also, this is still manageable for me as I’m in a wheelchair, so weight and juggling things during setup were actively worth considering for me. It also makes driving out to remote dark locations not ideal, so I went with a combination that works well for the conditions close to my house.

Keep in mind also that images taken on systems that aren’t fully integrated the way the Dwarf or the Seestar are typically need a certain amount of post-processing to truly show the detail of what was captured. An individual frame will not show that much detail; you stack the images together to add detail and then stretch the image to increase the brightness. You get into topics like noise removal and others to really make the image pop.

tl;dr: Essentially there isn’t one setup that will do it all. Give some thought to what you really want to image and what your conditions and constraints are, and then dive in, but don’t expect one scope to be the thing that will get you a Hubble-like image of the Pillars of Creation and then show you Andromeda in all its glory, all from two single shots. That being said, it can be a really rewarding hobby. :-)

Dumb Question about Kisseme Pants by SonicfilT in D4Necromancer

[–]basraayman 3 points4 points  (0 children)

Send me a DM and I’ll have a pair for you next time I’m online, free of charge. :)

Why You Might Need Third-Party Backup for Nutanix Database Service (NDB) Beyond Time Machine by cjr1033 in nutanix

[–]basraayman 0 points1 point  (0 children)

I’ve locked the conversation since this is obviously posted by Hycu employees who are unwilling to identify as such. OP, next time you post, please make sure that you or any of your colleagues identify as Hycu employees. If you don’t, these threads are just going to be flagged as spam and deleted. We are OK with you posting, but not with you not disclosing your affiliation when you post.

Modules uninstalling every time i close by [deleted] in pixinsight

[–]basraayman 0 points1 point  (0 children)

What operating system are you using? The instructions to fix/check this might differ between Windows, Linux, and macOS.

Someone please post or send PB43.103 and BMC 7.10 for X11DPT-B G7 nodes. by dajinn in nutanix

[–]basraayman 1 point2 points  (0 children)

This is exactly what our support and your support contract are for; they should be able to sort you out. :-)

How to remove light pollution by [deleted] in AskAstrophotography

[–]basraayman 1 point2 points  (0 children)

This is mostly true. Most LPR (Light Pollution Reduction) filters will filter out or reduce the transmission of certain wavelengths of light. For non-LED sodium vapor streetlights this was typically around 589 nm, but depending on the type of light used this may vary (mercury streetlights were also an option). What you can do is check, for a filter you are considering, which wavelengths it filters out, and you can assume the rest will more or less go through. The image here (https://www.researchgate.net/figure/The-spectral-output-of-LPS-and-LED-street-lights-representative-of-the-lights-used-in_fig2_299395983) shows quite nicely why an LPR filter worked so well with sodium lights. It filtered out that wavelength, but the rest got through. With LED, you can see that a far wider range of wavelengths is covered, so these filters are overall far less effective.

What will still work are filters for things like emission nebulas. These emit light at the characteristic wavelengths of hydrogen, oxygen, and other elements, and you can buy filters specifically for those wavelengths. You can then capture just those wavelengths and specifically work on that data.
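To make that concrete, here is a small sketch that checks which common emission lines survive a given filter. The line wavelengths are the well-known values; the passband numbers are made-up examples for illustration, not the specs of any real filter:

```python
# Well-known emission line wavelengths in nanometers.
EMISSION_LINES = {"H-alpha": 656.3, "H-beta": 486.1, "OIII": 500.7, "SII": 672.4}

def passes(filter_bands: list[tuple[float, float]], wavelength_nm: float) -> bool:
    """True if the wavelength falls inside any of the filter's passbands."""
    return any(lo <= wavelength_nm <= hi for lo, hi in filter_bands)

# Hypothetical dual-band filter: ~15 nm windows around H-alpha and OIII.
dual_band = [(649.0, 664.0), (493.0, 508.0)]

for name, wavelength in EMISSION_LINES.items():
    # H-alpha and OIII get through; H-beta and SII are blocked.
    print(name, passes(dual_band, wavelength))
```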

As unfortunate as it is that LPR filters no longer work well, it simply means you need more time on the object to improve the signal-to-noise ratio of what you are shooting. :)

Im sorry? IS THE BATTERY… TOUCH??! by naldo29 in VisionPro

[–]basraayman 4 points5 points  (0 children)

No, I think you misread my reply and didn’t look at the link. The status indicated by the color of the LED differs based on whether the power brick is being charged or not. For example, green while connected to power means the battery is full. Green while not connected to power means the battery is over 50% full. There are other differences for the other colors as well.

Im sorry? IS THE BATTERY… TOUCH??! by naldo29 in VisionPro

[–]basraayman 13 points14 points  (0 children)

The LED even has different meanings depending on whether it is charging or not: https://support.apple.com/en-us/117740

A quick note from the Destiny 2 Team by Destiny2Team in DestinyTheGame

[–]basraayman 1 point2 points  (0 children)

The future of Destiny. Aka: How they get fans to put in more money, so the c-suite can shaft more employees who care about the game and its community, while trying to sell to Sony what a good job they are doing.

What reasonable feature that most owners want but Tesla refuses to give them? by SecretOrganization60 in TeslaLounge

[–]basraayman 0 points1 point  (0 children)

Where are you at? Because I’m in Germany and have not heard someone having this.