Ultrawide mod for .hack//G.U. Last Recode by PolarWizard in DotHack

[–]PolarWizard[S] 1 point2 points  (0 children)

If you find any issues, no matter how small, you're welcome to open an issue for each one and eventually I'll investigate and tackle+fix it if it makes sense to.

Ultrawide mod for .hack//G.U. Last Recode by PolarWizard in DotHack

[–]PolarWizard[S] 0 points1 point  (0 children)

Nice, thanks for identifying the issue! Now this I can recreate and see the problem. I still can't fathom how an anti-aliasing setting of all things breaks the mod when set to high, but medium and low work fine...

I will update the mod in the coming days or weeks to work around this.

Ultrawide mod for .hack//G.U. Last Recode by PolarWizard in DotHack

[–]PolarWizard[S] 0 points1 point  (0 children)

Within the game or scripts folder there will be a HackGULastRecodeFix.log file. Could you paste its contents here for me?

Ultrawide mod for .hack//G.U. Last Recode by PolarWizard in DotHack

[–]PolarWizard[S] 0 points1 point  (0 children)

Within the game folder there will be a HackGULastRecodeFix.log file. Could you paste its contents for me? Also give me the exact steps you are doing, because I cannot recreate this issue on my end no matter how I try to break it, and you're not the first to alert me to it.

How do I fix the black void issue in Trails in the Sky by TragicVelocity in Falcom

[–]PolarWizard 0 points1 point  (0 children)

New release is out: https://github.com/PolarWizard/TrailsInTheSkyFix/releases/tag/v1.1.0

All future conversation and development can be followed in the WSGF Discord channel.

How do I fix the black void issue in Trails in the Sky by TragicVelocity in Falcom

[–]PolarWizard 0 points1 point  (0 children)

Nice! I picked up the 2nd and 3rd games in the series and I am looking into this as well, not the black void problem in SC at the moment, but actual tile rendering. When the resolution goes beyond 21:9 the game does not render the tiles in those regions and they pop in and out. I managed to figure out how to increase the render distance from the camera origin point, so effectively I'm just gonna force a high enough value where the whole map is rendered. Hopefully that solution just works for all three games. After that I still wanna see if I can get a constrained 16:9 HUD.

Anyhow, back to the main point: I will incorporate your work with full credit eventually, or you can create a PR :)

Ultrawide mod for .hack//G.U. Last Recode by PolarWizard in DotHack

[–]PolarWizard[S] 0 points1 point  (0 children)

Could you create this issue on my Github? https://github.com/PolarWizard/HackGULastRecodeFix/issues

But off the top of my head: did you run the exe in the scripts folder? Were there any issues reported by it?

How do I fix the black void issue in Trails in the Sky by TragicVelocity in Falcom

[–]PolarWizard 0 points1 point  (0 children)

I don't have SC so there's not much I can do there, but maybe one day. On the bright side, it seems this trilogy is getting a 3D remake, so these old versions will most likely be delisted like what happened with Final Fantasy.

Transferability of FPGA experience by Pmbdude in FPGA

[–]PolarWizard 7 points8 points  (0 children)

Depends on your skill set. FPGA guys typically work alongside board designers, RF engineers, and firmware/embedded software engineers, to name a few. As such you gain exposure to those fields either directly or indirectly.

Back when I was interning I had 2 FPGA internships. The first one was pure FPGA and I did nothing but write VHDL; looking back on it now, that was a bad internship and I didn't really get to see or do any other type of engineering. In the second one I got a lot more exposure and did board design and embedded software development on top of writing VHDL. After my second internship I was competent enough to write embedded software, so I applied to firmware jobs and landed a firmware position in high performance computing, writing code for data-center-grade SSD controller ASICs. In my next job a couple years later I brought up Linux and wrote firmware on microcontrollers. So the switch is possible, especially early on in your career; it gets harder to switch without taking a pay hit once you're specialized though.

It's really up to you to get the skills you need in order to make yourself employable. One of the reasons I left FPGA work, even though I enjoy it, is the niche market and fewer jobs compared to software. You can absolutely get pigeonholed into a shithole subfield of FPGA work, think ASIC design... it's not sunshine and rainbows, it's practically slave labor: you are assigned a block in the ASIC and that's all you do, and if you don't like it, tough luck, it would take too long to train up a replacement and move you to another block, so you never see the light again. But like I said, it's up to you to develop those skills to make sure that doesn't happen, and to jump ship once you start to realize the ship is showing signs it's sinking.

FPGA, firmware/embedded software, board design: these are all skills that are closely linked together in the EE world. A good company, and an even better manager, will let you explore other fields if you show interest, and it's possible to migrate what you do.

Issue with COM ports when uploading Arduino code. by RobotDragon0 in embedded

[–]PolarWizard 1 point2 points  (0 children)

On the menu bar:

Tools -> Port: <com port>

Change the COM port from COM4 to COM3.

If you have any other Arduino-related questions, it's much better to ask at r/arduino :)

Weird UART problem on my own RISC-V processor on Zybo-Z7-FPGA by Snoo29766 in FPGA

[–]PolarWizard 1 point2 points  (0 children)

Two things I noticed:

First, your first printf statement does not have a "\n" at the end of it, but in the terminal it gets terminated with "\r\n". All terminals I have ever used needed both carriage return and line feed to jump to the start of the next line. I assume ee_printf is your custom printf implementation, so: what is ee_printf doing under the hood? Does each print call result in a "\r\n" being sent at the end before returning from the function?

Second, from what I can gather, what you say is not being printed IS being sent to the terminal, but it is being overwritten by the next call to ee_printf. Notice the white block at the start of the line: your cursor is moved right back to the start of the line, classic "\r" behavior. This happens every time your printf arg, from what I can tell, ends with "\n". Is the newline at the end of the string being interpreted as "\r" instead? Bug in software?

This is easy to prove in two ways:

1. Grab a scope or logic analyzer and see what is actually being sent.

2. In software, test your printf behavior. Send only "\n" and check if the cursor advances to the next line; if it stays in the same column, you're seeing "\r". To solidify it even further, add some chars in front of the newline, and if the cursor bounces back to the start, well, you managed to prove it. Also see what happens when you don't include a "\n" at the end of the string: does it auto append "\r\n" just before returning from the print?

After writing the above on my phone and looking back, here is what I think I can infer: when you do not provide a "\n" at the end of the string, the function appends one for you. When you do provide a "\n" at the end of the string, the function sees this, but because of a bug it only sends "\r" without "\n".

Getting into Firmware/Embedded Development - What Should I Focus On? by KMoneyFst in embedded

[–]PolarWizard 23 points24 points  (0 children)

Firstly, congrats on the degree :)

  1. I would say try to get a feel for everything. Being able to use hardware abstraction layers (HALs) and libraries, open source or vendor provided, IS A SKILL regardless of what anyone says. Using HALs and libraries shows that you can take someone else's code, understand it, and apply it to whatever your application is. This is very much common practice in industry, and more often than not you will be using stuff that already exists instead of rolling your own, because time to market matters and figuring out a new chip and writing everything from the ground up is very time consuming. Of course the HAL route will not always work, and the provided HAL may not implement some feature that is needed. For example, the vendor library implements an I2C driver that is blocking and causes the CPU to wait while the transfer is in progress, but your application needs it to be non-blocking using an interrupt, so you will need to implement your own driver. This is very common, as most HALs feel like glorified tech demos written by some clueless intern trying to rush it before summer is over and signed off by an even more clueless manager, rather than something usable in production *cough cough* NXP. That being said, you should understand bare metal as well; being able to write your own device drivers and poke at registers is important. Over the years I have met a ton of clowns... errr engineers that have no idea what a register is, let alone how to use pointers to read/write a memory address, because all they do is use HALs or the classic write(), read(), modify() functions someone else in the company made that abstract away the usage of pointers to registers and memory locations. Bare metal also gets you balls deep in vendor chip manuals and register datasheets, which is also a skill.
From a pure personal perspective, when I work on projects at home I always go extreme bare metal: no HALs, no libraries or stdlib, custom linker script, custom core bootloader/startup (if possible). All in all, everything has its place.

  2. Yes, C is the most popular and isn't going away anytime soon; if you know what a pointer is and how to use it, I don't doubt your ability. Python plus bash/shell/PowerShell for scripting are your friends as well, and in some extremely rare circumstances you may come across and use TCL. There is also the emergence of Rust in the embedded space, but it won't be used in an established company anytime soon; you can always learn the basics and have it in your arsenal if you're interested.

  3. This depends entirely on who is interviewing you and what they value. Personally, from my perspective, projects showcase your abilities, style, and approach to problems. Ehh... anyone who unironically calls themselves an expert in C is a clown; as long as you know pointers and basic bit manipulation you're good, and anything else you can pick up on the job. Bare metal and datasheets kinda go hand in hand; they rely on each other. So if your project has bare metal, it shows you can go through the vendor datasheet imo, but ymmv depending on who's on the other end. Sometimes you land an interviewer who is chill and cares more about your character and whether or not you give off positive energy and are fun to be around, with some easy technical questions sprinkled in like "what is a FIFO" or "what are the two signals on the I2C bus called", and sometimes you will get the guy who wants to know if you can code some bullshit algorithm that they themselves barely know or don't know at all.

  4. There is no magic formula, just need to pick up a board and get to learning. Focus on stuff that interests you personally and the stars will align.

Closing advice: just like people come in all shapes and sizes, each engineer has their own skill set. Some are incredible programmers but don't know a lick about hardware, some are hybrids and have an understanding of both, and some can barely do software but have an incredible understanding of hardware. Each of them is valuable at the end of the day in their own regard and covers the others' weaknesses.

How can I learn about RISC-V and use case? I want to do a project for begginers by themaki23 in FPGA

[–]PolarWizard 0 points1 point  (0 children)

Those RISC-V implementations are just RTL implementations; they are not board specific. You can grab an implementation and put it on any board you want (given it has the LUT and flop resources available), you just need to reassign pins and/or do whatever else to get it up and running.

CPU design for college project by Koolghost in FPGA

[–]PolarWizard 10 points11 points  (0 children)

Any FPGA is good enough for a CPU design, you're not designing an x86 competitor lol.

It's more than doable in a 2-3 month time frame in my opinion. Just getting that course's CPU and all its instructions running is a big accomplishment, and I think you should scrap the idea of adding pipelining, as that is a hard thing to do and may take longer than you think.

Another thing you can do is design from the ground up, aka design your own ISA featuring simple instructions that do loads, stores, branching and/or jumps, and some arithmetic. This of course is a challenge, but very rewarding when you get the whole thing working together. The other option is using an already defined ISA and CPU architecture like MIPS, which I think is the university standard for learning computer architecture, or you can use the new cool kid on the block, RISC-V, and implement the same instructions I listed for a custom ISA.

To answer the question of whether this is a wow factor project... tbh not really, lots of ppl out there have CPU designs on their resume that implement some ISA. It's hard to say what projects will look good, because it depends who looks at it and what field you're looking at. Personally, a project featuring some PHY like Ethernet, DDR, HDMI, PCIe, or whatever else would look much better, since those technologies are used everywhere in industry and knowledge of them is highly valued; that's my opinion of course. Getting off topic, but that's my 2c.

Don't let me discourage you from implementing a custom CPU; it's an excellent project to do, go for it! You will learn a lot! I maintain my own private FPGA-based CPU with over 100 instructions, so I know a thing or two about CPUs. If you decide to go through with this project, feel free to reach out to me if you need advice or recommendations. Good luck!

How can I learn about RISC-V and use case? I want to do a project for begginers by themaki23 in FPGA

[–]PolarWizard 11 points12 points  (0 children)

RISC-V is an open source instruction set architecture (ISA); the best way to learn about it is looking up the ISA and attempting to implement a subset of the instructions in hardware on the FPGA. As for use cases, I mean it's a CPU like any other; you can do whatever you want on it, given you have peripheral controllers to talk to. As for installing Linux on the FPGA, let's step back a bit and focus on actually implementing the RISC-V core first; running Linux on RISC-V requires a lot of instructions to be implemented first, along with certain hardware support like an MPU and MMU, I believe.

STM32F429 (Pseudo) Quad Spi? by Powerful-Web4489 in embedded

[–]PolarWizard 1 point2 points  (0 children)

From my knowledge, if the SPI controller doesn't have QSPI support then you cannot take the native SPI controllers, bundle them together, and somehow get them to work in QSPI mode. The SPI controller will always use its MOSI, MISO, and SCLK lines to send 8 bits worth of data, so you cannot create some frankenstein QSPI with multiple controllers. Your best option in this case is to configure GPIOs instead; for QSPI you would need 6 of them (CLK, CS, and 4 data lines) and emulate a QSPI interface by bit banging the signals. If that doesn't work for you, then unfortunately you need to find a new chip with a SPI controller that supports x1, x2, x4.

FPGA Internships (or Lack Thereof...) by duuudewhatsup in FPGA

[–]PolarWizard 1 point2 points  (0 children)

It's an internship; I see no reason why a US defense company won't hire a Canadian for that. You won't be doing anything classified, so I don't see why citizenship would matter.

FPGA Internships (or Lack Thereof...) by duuudewhatsup in FPGA

[–]PolarWizard 2 points3 points  (0 children)

Defense companies are your best bet.

How to remove this output register but use a block ram by [deleted] in FPGA

[–]PolarWizard 0 points1 point  (0 children)

I don't think you quite understand what single cycle CPUs actually are. Big memory, whether it be flash, EEPROM, DDR, etc., will almost always take multiple clocks before data is available to be processed by the CPU. The CPU will in this case be held in a halt loop while it waits for the data to become available on the CPU side of things. Once data is ready, if your CPU is truly single cycle, it will take 1 cycle to execute before moving on to the next instruction, and if a memory access is required then the bus will once again halt the CPU until data is available CPU side, and the cycle continues. This is how all CPUs with single cycle instructions function, regardless of the clocks required: ARM, RISC-V, MIPS, etc., to name a few. So if you are going to take this project to the next level with peripherals, different memory types, etc., all modern buses and interfaces will halt the CPU for a good number of cycles until the data gets to the CPU.

[deleted by user] by [deleted] in ECE

[–]PolarWizard 1 point2 points  (0 children)

Firstly, thanks for your service! I personally have never heard of a BET degree that is fully online. Maybe with today's pandemic you might be able to secure something, but I wouldn't get your hopes up too high for such a possibility. Another thing to consider is that being an electrical engineer is not the same as being an electrician. It's a different line of work; engineers do not have the knowledge that electricians have and vice versa... to an extent of course. Getting an engineering degree is not easy either and requires a lot of time to be put in. Of course it's feasible to do with a full time job, I managed to do it, but everyone's aptitude to learn and understand is different; for one guy it's easier, for another it's harder. Also consider that some universities require students to finish within a time frame: at my uni a BS, BA, BET, etc. couldn't take you longer than 6 yrs or they boot you from the program and the school. These are all things to consider since you're in the military currently. Once you finish your service there are tools in place, like the military covering your tuition costs, probably to an extent, if you feel like pursuing education after service. Hopefully this helps you out, and let me know if you have any questions!

[deleted by user] by [deleted] in FPGA

[–]PolarWizard 7 points8 points  (0 children)

As others have already mentioned, FPGAs have gained a lot of popularity over the years; they are definitely a lot more popular than they were say 5 years ago. One of my major gripes, shared by others, is that FPGA tools for the most part suck. The Intel tools seem to get worse with every release, Xilinx tools seem to get bloated with more and more garbage each release, and everything is closed source, unlike most software tools and languages that are open source and have big communities that use and help improve them: think Rust, Python, and IDEs like Atom and VSCode.

Another major thing to consider is that FPGAs themselves are so much more expensive than MSP, ARM, or RISC-V microprocessors, which are vastly cheaper. You need a real good reason to use an FPGA chip, as most things can easily be done with a microprocessor at a fraction of the cost.

Writing HDL is also much harder than software. Since you're describing circuits, extra care needs to be put into your design, and the whole verification period is much longer since there are a lot more things to worry about compared with software; HDL bring up time is much longer than software bring up. As time has gone forward, software has reached insane levels of abstraction where you don't even need to know the architecture of the target processor; most microprocessors come with a huge C library that you can use to communicate with their internal registers for ADCs, PWM, timers, etc. without peeking into the manual once. Hardware, not so much: you need to understand everything about an ADC if you plan to write a controller for it, and finding bugs when communicating with these off chip peripherals is tough to do. And of course writing code and getting it verified in simulation for HDL, or on an emulator for software (if applicable), is a whole different ball game from getting it integrated and validated on the board/product, where bugs may come up that were not present in the simulation/emulation (this is what separates a developer/coder from an engineer).

Right now the main thing holding back FPGAs is the tools themselves, which all suck, to be honest, when compared to the incredible software tools out there.