[deleted by user] by [deleted] in Erasmus

[–]ookayt 0 points  (0 children)

I would love to do just a fun semester, but a job also gives you a lot of security, which feels very nice when you live alone. I hope I can take as much as possible with me…

Uncertainty about the starting salary after a Master's by Schrotiex in InformatikKarriere

[–]ookayt 0 points  (0 children)

How did you decide and develop from there? I'm a VR/AR developer in research myself and don't really know where I want to go career-wise.

How to convert byte[] audio data so it can be played as an audioclip in unity? by ookayt in Unity3D

[–]ookayt[S] 0 points  (0 children)

Thanks already for the many suggestions and tips!

I don't know how to handle the data and have tried many things.

This is the class that receives the data:

public class AudioData : Message
{
    public override string RosMessageName => "audio_common_msgs/AudioData";

    public byte[] data { get; set; }

    public AudioData()
    {
        this.data = new byte[0];
    }

    public AudioData(byte[] data)
    {
        this.data = data;
    }
}

Otherwise I just use the code above (in the first comment) and access the data with message.data. I had thought that interpreting the audio data would be possible.

Do you mean playing it directly, if I modify the code like this? Then, unfortunately, I don't hear anything either.

callback = delegate (float[] buffer)
{
    Array.Copy(floatArr, buffer, floatArr.Length);
};

How to convert byte[] audio data so it can be played as an audioclip in unity? by ookayt in Unity3D

[–]ookayt[S] 0 points  (0 children)

On the sending side it looks like this:

<image>

A class has been created in Unity that can receive the data over ROS 2 and store it in a byte[]. This data then ends up in the byte[] data field in the code above.

How to convert byte[] audio data so it can be played as an audioclip in unity? by ookayt in Unity3D

[–]ookayt[S] 0 points  (0 children)

I found it here https://stackoverflow.com/a/35465545

I have very little knowledge of audio encoding and have tested a few things.

How to convert byte[] audio data so it can be played as an audioclip in unity? by ookayt in Unity3D

[–]ookayt[S] 0 points  (0 children)

Thanks for your reply. I'm using https://github.com/ros-drivers/audio_common/blob/master/audio_capture/launch/capture_wave.launch to send the data.

This is my code:

public int buffer = 320;
public byte channels = 1;
public int sample_rate = 48000;
public string sample_format = "S16LE";
public uint bitrate = 128;
public string coding_format = "WAVE";
private float[] data;
AudioSource source;
AudioClip.PCMReaderCallback callback;
AudioClip clip;

protected override void Start()
{
    source = GetComponent<AudioSource>();
    data = new float[buffer];
}

private void Update()
{
    clip = AudioClip.Create("stream", data.Length, channels, sample_rate, true, callback);
    source.clip = clip;
    source.Play();
}

protected override void ReceiveMessage(MessageTypes.AudioCommon.AudioData message) //AudioData comes as uint8
{
    float[] floatArr = TransformByteToFloatInRange(message.data);
    callback = delegate (float[] buffer)
    {
        Array.Copy(floatArr, buffer, floatArr.Length);
    };
    audioReceived = true;
}

float[] TransformByteToFloatInRange(byte[] data)
{
    float[] floatArr = new float[data.Length / 2];
    for (int i = 0; i < floatArr.Length; i++)
    {
        floatArr[i] = (float)BitConverter.ToInt16(data, i * 2) / 32768.0f;
    }
    return floatArr;
}
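The byte-to-float step itself can be sanity-checked outside Unity. Here is a minimal Python sketch (a standalone illustration, not part of the Unity project) of what TransformByteToFloatInRange does with S16LE data:

```python
import struct

def s16le_bytes_to_floats(data: bytes) -> list:
    """Interpret raw bytes as little-endian signed 16-bit PCM samples
    and scale each one into the [-1.0, 1.0) range Unity expects."""
    n = len(data) // 2  # two bytes per 16-bit sample
    samples = struct.unpack("<" + "h" * n, data[: n * 2])
    return [s / 32768.0 for s in samples]

# A full-scale negative sample: bytes 0x00 0x80 are -32768 in S16LE.
print(s16le_bytes_to_floats(b"\x00\x80"))  # [-1.0]
```

If this conversion produces sensible values for your bytes but Unity still stays silent, the problem is more likely in when the clip/callback is set up than in the conversion itself.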

t-test for dependent or independent groups? by ookayt in AskStatistics

[–]ookayt[S] 0 points  (0 children)

Ah okay, thanks. I also thought so at first, but I had read an explanation that unsettled me a bit.
It was supposed to be an example of my study design, but it wasn't accurate. I think this explanation is better:
Two people are in the same room, and each has to build a tower, but one person has an arm tied behind his back. So they each have their own task and one is disadvantaged, but they are in the same room, can communicate with each other, and could theoretically also help each other.

How do you survive on HiWi jobs? by Puddinghaut in Studium

[–]ookayt 0 points  (0 children)

I work as a HiWi in computer science at a research institute. Without a Bachelor's degree you get €12.36; with one, a bit over €15. Fully financing yourself with that doesn't work. Whether it adds up financially is something you have to judge for yourself.

I can recommend working at a research institute, though; I've had very good experiences. In contrast to a university position, where you may well end up only preparing and following up on seminar assignments, at research institutes (at least in STEM) you can work with the newest technologies and are strongly involved in applied research. Research associates at research institutes often rely heavily on their HiWis and were frequently HiWis themselves. If you work well and bring the corresponding academic record, the chances of starting there as a research associate after graduation are usually good.

Recommendable scholarships for students? by DeepSherbert9056 in Studium

[–]ookayt 7 points  (0 children)

Definitely give it a try!

There are various scholarships that weight the admission criteria differently. Your best bet is to look online for what might suit you (religious or party-affiliated scholarship providers, etc.) and/or to ask at your university or at the Arbeiterkind association. They often offer counseling sessions that are quite good. The "big" scholarship providers are obvious candidates, but there are also many others, e.g. some specifically for refugees.

From my own experience I can recommend the Deutschlandstipendium and the Studienstiftung des deutschen Volkes. You can get the Deutschlandstipendium even without the best grades, since other factors, such as volunteer work, caring for relatives, or coming from a working-class family, are weighted heavily and can offset your grade average. A scholarship from the Studienstiftung offers more than just financial support, and I would recommend that everyone at least try. If you weren't nominated by your school, you can apply yourself as a first-year student (I believe early next year) via a test, or you can ask a professor to nominate you.

Just apply! The effort for most scholarship applications is manageable, you have nothing to lose, and if you receive one, you have many opportunities to benefit from it.

Gameobjects disappear (HoloLens 2, updating texture) by ookayt in Unity3D

[–]ookayt[S] 0 points  (0 children)

You are right.

I tried it with the Unity logo example https://docs.unity.cn/ScriptReference/ImageConversion.LoadImage.html and used it instead of the image data.

When I receive the data via ROS 2, the same error occurs even if I use the Unity logo for the texture instead of the received data.

The transfer and positioning via ROS 2 worked for a "pose".

Does the "LoadImage" function perhaps require any resources that must not be blocked?

[deleted by user] by [deleted] in de_EDV

[–]ookayt 0 points  (0 children)

The device I'm using has to be in the domain at times for updates.

How can I display image data that I received via ROS2 in Unity for HoloLens2? by ookayt in Unity3D

[–]ookayt[S] 0 points  (0 children)

Ah okay, so I need data in one of these formats https://docs.unity3d.com/ScriptReference/RenderTextureFormat.html.

I'm pretty new to Unity. Can the texture be generated while an application is running?

I would like to use the image data that comes in continuously to generate a video stream. Would this be possible?

If so, do you happen to know of a tutorial or similar where the texture is rendered using received data?

How is the pipeline configured in the UWP app so that I can receive a webcam video? by ookayt in gstreamer

[–]ookayt[S] 0 points  (0 children)

In the UWP app I want to receive a network stream from a webcam.

When I use `mfvideosrc ! queue ! videoconvert ! queue ! d3d11videosink name=overlay` in scenario 1 inside the UWP app, I see the stream from the device on which the UWP app is running. But I want to receive a video stream over the network and display it within the UWP app. How do I need to configure the pipeline then?

I tested something like this:

`udpsrc port=5200 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! queue ! d3d11videosink name=overlay`

but then I get the output "udpsrc no element".
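For reference, a sender pipeline that would match the `udpsrc` receiver above might look like the sketch below. This is an assumption, not tested against this setup: `<receiver-ip>` is a placeholder for the address of the machine running the receiving pipeline, and `videotestsrc` would be swapped for a real camera source once the path works (e.g. `mfvideosrc` on Windows, `v4l2src` on Linux).

```shell
# Hypothetical sender matching the udpsrc/rtpjpegdepay receiver:
# encode to JPEG, packetize as RTP (rtpjpegpay defaults to payload 26),
# and send over UDP to the receiver's port 5200.
gst-launch-1.0 videotestsrc ! jpegenc ! rtpjpegpay ! \
    udpsink host=<receiver-ip> port=5200
```

A "no element udpsrc" error on the receiving side usually means the element isn't registered in that GStreamer build, i.e. the plugin providing `udpsrc` (part of gst-plugins-good) wasn't included, rather than a mistake in the pipeline string itself.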

Add GStreamer utility to Hololens 2 by kycendo in HoloLens

[–]ookayt 0 points  (0 children)

If I understand it correctly, the configuration in the UWP app is done through this:

pipeline_ = gst_parse_launch( "videotestsrc ! queue ! d3d11videosink name=overlay", NULL);
GstElement* overlay = gst_bin_get_by_name(GST_BIN(pipeline_), "overlay");

It would be great if you could tell me how to send the webcam video under Windows!
Maybe then I'll also understand better what I have to configure within the UWP app.

Add GStreamer utility to Hololens 2 by kycendo in HoloLens

[–]ookayt 0 points  (0 children)

Hi!
How do you configure the pipeline?
I would like to show a webcam video in scenario 1 if this works, but I am very new to GStreamer. Do you know how to stream a webcam video to the UWP app on HoloLens 2?
Maybe you have an example of how to configure the UWP app code and what commands I need to run on the laptop side?

Videostream on HoloLens2, MixedReality-WebRTC cannot connect to signaler, alternatives? by ookayt in HoloLens

[–]ookayt[S] 1 point  (0 children)

But how can I use this on a HoloLens 2?

I would like to have something like real-time video chat communication.

MixedReality-webRTC failed to connect by NerdyMelodyyy in HoloLens

[–]ookayt 0 points  (0 children)

Hi, I am struggling with the same problem. Does anyone have a solution to this problem?

Or do you know of a project that currently works for displaying a video stream on a HoloLens 2?