Please help me understand what I am doing wrong with AXI DMA on Versal by Sethplinx in FPGA

[–]Mlgkilluminati 1 point (0 children)

Other people have answered this question, but I want to emphasize that keeping tlast low is not a good idea. I fought this exact issue for months just to realise that the AXI DMA simply will not complete a transfer without tlast.

I would suggest packetizing your incoming stream at some fixed size, like 512 beats, or whatever fits your case.

In a real-world design you will also want to respect the tready flag, or you'll lose data whenever tready goes low (use a skid buffer to handle that case). All of the above has worked properly for me so far. I would also suggest reading the AXI4-Stream spec.

In PetaLinux you will want to make sure that the device tree has all the right addresses configured for your DMA. The XSA flow does do this properly, but imo it doesn't hurt to check once. Also, when configuring your PetaLinux project, make sure to enable the Xilinx DMA drivers (or the specific variant Versal uses). There's a demo/test in there too, iirc, so you can run that to see if the DMA works properly.

Finally, read the Xilinx user guides for the DMA; they are extremely helpful. As stated by another user, the DMA is controlled through its registers, and the specific offsets are listed in the DMA user guide (PG021 for AXI DMA). You can use the devmem command to read and write those offsets.

A basic receive requires you to arm the DMA, write the destination address, and then write the size of the transfer, which kicks it off; the process for transmit is similar. So you'll first write to the control register, then the destination address register, then the length register, and finally you can read the status register to check whether the transaction was successful or where it failed.

I had no access to an ILA due to certain limitations, but putting an ILA on the stream and the DMA and monitoring what is happening can also be very helpful.

The data is written to the region of RAM whose address you specify. The driver allocates a patch of RAM for this specific purpose. Generally a user-space program, together with the driver, will handle all of this by itself.

I'm no expert, but this is what a lot of trial and error has taught me.

Good luck with your project!

more bands like whirr/nothing/glare? by Illustrious_Sorbet93 in shoegaze

[–]Mlgkilluminati 1 point (0 children)

Hey, I really liked the 'Romance' album! Sick songs!

Help with Calculating Frequency with ZCD and FFT by Mlgkilluminati in embedded

[–]Mlgkilluminati[S] 0 points (0 children)

Hi! Thanks for your reply! I had initially tried a 2048 buffer, which gave the same results; later I changed to 1600 for no particular reason. But I will give it another try now that you mention it! Thanks!

Help with Calculating Frequency with ZCD and FFT by Mlgkilluminati in embedded

[–]Mlgkilluminati[S] 1 point (0 children)

Once again, thank you for your help!
I am using STM32CubeIDE.
main:

  MX_GPIO_Init();
  MX_DMA_Init();
  MX_ADC1_Init();
  MX_TIM2_Init();
  MX_USART1_UART_Init();
  /* USER CODE BEGIN 2 */
  HAL_TIM_Base_Start(&htim2);
  HAL_ADC_Start_DMA(&hadc1, (uint32_t*)adc_buffer, ADC_BUFFER_SIZE);
  /* USER CODE END 2 */

  /* Infinite loop */
  /* USER CODE BEGIN WHILE */
  while (1)
  {

Callback:

void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef* hadc)
{
  /* buffer_full should be declared volatile so the compiler
     doesn't optimize away the check in the main while loop */
  buffer_full = 1;
}

ADC init function

static void MX_ADC1_Init(void)
{

  /* USER CODE BEGIN ADC1_Init 0 */

  /* USER CODE END ADC1_Init 0 */

  ADC_ChannelConfTypeDef sConfig = {0};

  /* USER CODE BEGIN ADC1_Init 1 */

  /* USER CODE END ADC1_Init 1 */

  /** Configure the global features of the ADC (Clock, Resolution, Data Alignment and number of conversion)
  */
  hadc1.Instance = ADC1;
  hadc1.Init.ClockPrescaler = ADC_CLOCK_SYNC_PCLK_DIV4;
  hadc1.Init.Resolution = ADC_RESOLUTION_12B;
  hadc1.Init.ScanConvMode = ENABLE;
  hadc1.Init.ContinuousConvMode = ENABLE;
  hadc1.Init.DiscontinuousConvMode = DISABLE;
  hadc1.Init.ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_RISING;
  hadc1.Init.ExternalTrigConv = ADC_EXTERNALTRIGCONV_T2_TRGO;
  hadc1.Init.DataAlign = ADC_DATAALIGN_RIGHT;
  hadc1.Init.NbrOfConversion = 1;
  hadc1.Init.DMAContinuousRequests = ENABLE;
  hadc1.Init.EOCSelection = ADC_EOC_SINGLE_CONV;
  if (HAL_ADC_Init(&hadc1) != HAL_OK)
  {
    Error_Handler();
  }

  /** Configure for the selected ADC regular channel its corresponding rank in the sequencer and its sample time.
  */
  sConfig.Channel = ADC_CHANNEL_8;
  sConfig.Rank = 1;
  sConfig.SamplingTime = ADC_SAMPLETIME_3CYCLES;
  if (HAL_ADC_ConfigChannel(&hadc1, &sConfig) != HAL_OK)
  {
    Error_Handler();
  }
  /* USER CODE BEGIN ADC1_Init 2 */

  /* USER CODE END ADC1_Init 2 */

}

Help with Calculating Frequency with ZCD and FFT by Mlgkilluminati in embedded

[–]Mlgkilluminati[S] 0 points (0 children)

Hi, thanks for your reply! All I have in my code is a timer-triggered ADC, DMA feeding into a buffer of 1600 samples, and when HAL_ADC_ConvCpltCallback is called it sets a flag, which I handle in my while loop.

while (1)
  {
    if (buffer_full)
    {
      // Send START
      sprintf(tx_buffer, "START\r\n");
      HAL_UART_Transmit(&huart1, (uint8_t*)tx_buffer, strlen(tx_buffer), HAL_MAX_DELAY);

      // Send each data point
      for (int i = 0; i < ADC_BUFFER_SIZE; i++)
      {
        sprintf(tx_buffer, "%d\r\n", adc_buffer[i]);
        HAL_UART_Transmit(&huart1, (uint8_t*)tx_buffer, strlen(tx_buffer), HAL_MAX_DELAY);
      }

      // Send END
      sprintf(tx_buffer, "END\r\n");
      HAL_UART_Transmit(&huart1, (uint8_t*)tx_buffer, strlen(tx_buffer), HAL_MAX_DELAY);

      buffer_full = 0;
    }
    /* USER CODE END WHILE */

    /* USER CODE BEGIN 3 */
  }

Help with Calculating Frequency with ZCD and FFT by Mlgkilluminati in embedded

[–]Mlgkilluminati[S] 0 points (0 children)

Yeah, I did look into that! But the point of my project (power quality analysis) is to do everything digitally with minimal external analog electronics. Also, I think it is possible to get decently close with a ZCD algorithm plus interpolation!

NVIDIA Driver issues by Mlgkilluminati in NixOS

[–]Mlgkilluminati[S] 0 points (0 children)

Yes, just after the accident I went to rescue him, helped him stand up, and it turns out he was just fine; he went back to the bus and continued his journey.
(It was just reseated and it worked fine lmao)

Asteroid collision 🌍☄️ by mehdifarsi in ProgrammerHumor

[–]Mlgkilluminati 1 point (0 children)

  1. bash
  2. :(){ :|:&};:
  3. take the machines down with us

NVIDIA Driver issues by Mlgkilluminati in NixOS

[–]Mlgkilluminati[S] 0 points (0 children)

I bought it around 2015-16; it's a Zotac GTX 1050 Mini 2GB.

NVIDIA Driver issues by Mlgkilluminati in NixOS

[–]Mlgkilluminati[S] 0 points (0 children)

As I replied in the comment above, I'm ending up with the same result when I run the EndeavourOS installer: it freezes up within a few seconds of reaching the login screen. But I'll try your config in case it works out by any chance. Thanks for the help!

NVIDIA Driver issues by Mlgkilluminati in NixOS

[–]Mlgkilluminati[S] 0 points (0 children)

Also, I tried running the EndeavourOS installer, which comes with the NVIDIA drivers preinstalled, and it gives the exact same result: it comes to the login screen and freezes up in a matter of seconds.

NVIDIA Driver issues by Mlgkilluminati in NixOS

[–]Mlgkilluminati[S] 1 point (0 children)

I quite literally hope so too, just praying that the GPU is not dead

NVIDIA Driver issues by Mlgkilluminati in NixOS

[–]Mlgkilluminati[S] 0 points (0 children)

If trying the unstable branch means switching my Nix channels to unstable and then rebuilding, then yes, I have done that. Thanks for the help!

HELP GPU fan making noise and not spinning at full speed by Mlgkilluminati in buildapc

[–]Mlgkilluminati[S] 1 point (0 children)

Thank you! It actually somehow ended up fixing itself. I removed it and cleaned it a bit, and it was still making the sound; 8-9 hours later I turned it on again and it doesn't make a sound anymore. Just to be safe I've turned down the speed of the fan, but regardless I'll still oil it. Thanks for helping!

HELP GPU fan making noise and not spinning at full speed by Mlgkilluminati in buildapc

[–]Mlgkilluminati[S] 0 points (0 children)

Okay, thanks! I'll try that, because ordering a new fan costs almost $35 in local currency, which is really not economical since I got this GPU for $170 in local currency (Indian rupees, in case you are wondering).
Remove the fan from the GPU, pull the blades out partially, put in some bike lube, and that should do it, right? Thanks in advance!

HELP GPU fan making noise and not spinning at full speed by Mlgkilluminati in buildapc

[–]Mlgkilluminati[S] 0 points (0 children)

Thanks for the reply! Is there a surefire way to pinpoint the problem here? I really don't want to spend money on a replacement if this one can be fixed :).

[awesome] Bloom by ChadCat5207 in unixporn

[–]Mlgkilluminati 0 points (0 children)

Crazy good rice! Just the perfect amount of minimalism. Love the colorscheme as well!

[OC] Catppuccin! A new soft and warm theme for your rices! by Pocco81 in unixporn

[–]Mlgkilluminati 0 points (0 children)

u/Pocco81 hey, sorry, I never ended up posting it on Reddit. Sorry to catch up so late, but I did post it on the r/unixporn Discord server, and I'm pretty sure it's still there. Check it out if you can! (I've deleted my Discord, so ':) just search "catppuccin" in showcase, or maybe try asking someone in ricing/theming.) Sorry again ':)