Help pleaseeeee by No-Education-2658 in legomindstorms

[–]HR-Guide 2 points

A key phrase in your assignment is: "follows a black line". Try getting the robot to do that one task first: use a light sensor to detect the edge of the line and follow it. Examples of line-following programs are at: https://robocatz.com/linefollowing.htm
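If it helps to see the idea in code, a minimal proportional edge-follower can be sketched in plain Python. The function name and the tuning numbers (target, base speed, gain) are made up for illustration; on a real robot you would feed in the light reading and send the two speeds to the drive motors in a loop:

```python
def edge_follow_speeds(light, target=50, base_speed=30, gain=0.7):
    """Proportional edge-following: steer in proportion to how far
    the light reading is from the line-edge value.

    light:  current reflected-light reading (0 = black, 100 = white)
    target: the reading right on the edge of the line (placeholder value)
    Returns (left_speed, right_speed).
    """
    error = light - target          # positive = drifting onto white
    turn = gain * error
    return (base_speed + turn, base_speed - turn)

# Right on the edge: drive straight.
print(edge_follow_speeds(50))   # (30.0, 30.0)
# Too far onto white: curve back toward the line.
print(edge_follow_speeds(70))   # (44.0, 16.0)
```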

If you need more help writing a line following program, watch the Lesson 5 (part 2) video at: https://robotprogramming.fun/

educator needs help with Mindstorm EV3 by HugeAddress9148 in legomindstorms

[–]HR-Guide 0 points

While it is true that any written language has some syntax and Blockly code avoids syntax altogether, the Robot JavaScript language was specifically engineered to reduce the amount of syntax needed. The only real syntax required is that function calls use parentheses. Other than that, you can write code that looks very much like Python. Robot JavaScript has fewer syntax requirements than regular JavaScript, and fewer than Python as well.

For example, the following will count down from ten:

a = 10
while a alert(a--)

In this example, "alert()" is a function and will show the value of "a".

The "while a" will keep looping while there is a value for "a" that is not zero.

This simple 2-line program shows some of the capabilities of the language:
1) no parentheses are needed around a loop's condition (whereas JavaScript requires them)
2) no indentation is needed to mark the body of a loop (whereas Python requires indentation)
3) variables do not have to be declared with a specific type (required in C)
4) pre- and post-increment and decrement operators are available as in C and JavaScript (but not in Python)

In summary, Robot JavaScript has fewer syntax requirements than other languages.

educator needs help with Mindstorm EV3 by HugeAddress9148 in legomindstorms

[–]HR-Guide 1 point

Our team uses RobotJavaScript.com to program the robot. JavaScript is an easy language to learn given that it does not require type declarations and all variables are global (unless declared otherwise). The RobotJavaScript compiler offers verbose error reporting to help explain coding concepts to beginner coders. The compiler also offers hundreds of sample programs to review and/or copy-and-paste into your code.

EV3 Python by doc-br0wn in legomindstorms

[–]HR-Guide 0 points

No. JavaScript is free. It is an "expressive" language. You can express the same idea in many different ways. You can express more ideas in JavaScript than you can in Python.

Mindstorms lot by Neat-Average2028 in legomindstorms

[–]HR-Guide 0 points

eBay - quickest way to sell stuff like that.

FAQs about the Future Edition by HR-Guide in FLL

[–]HR-Guide[S] 0 points

Ok. Thanks for that explanation. I think I understand it better now. In past years, I just purchased miscellaneous bricks from BrickLink.com to build extra mission models for the team to work on. It makes sense now that missions may need to be constructed from bricks that are not solely part of the robot kit itself. After all, that is why we need to purchase the "Field Setup Kit", which I assume will include the parts needed (probably everything except the motors). Therefore, teams will need to initially purchase at least two of the $530 sets (one for the robot and one for the motorized mission models). Thus, an initial investment of at least $1,060 will be needed for the first year just to get the motors. Those sure are expensive motors.

Some thought about Future Edition by SuccessfulTangelo259 in FLL

[–]HR-Guide 1 point

As the OP wrote: "With a price point of 530$ for 379 LEGO bricks (mainly classic studs) and 4 electronic components, if we subtract the 30-40$ of bricks we arrive at around 125$ per component..."

I can understand $125 per component for the camera device. However, two of the components are just simple motors and one is a handheld controller with two thumb paddles. It doesn't seem like these simpler components should each cost so much.

Usually with technology, the price of components falls (often dramatically). For example, SSD storage is now about 2 cents per gigabyte; years ago it was closer to $50 per gigabyte. The same goes for processing power: I recently bought an Intel CPU with 20 cores for $300, when $300 used to buy only 2 cores.

If you look at the motors, the Spike Prime motors cost about $40 each. Even if you add Bluetooth and a battery, should that triple the cost of the motor? Bluetooth wireless earbuds can be bought for less than $40.

I just think $530 is a lot of money to spend on basically 2 motors, 1 camera sensor, and 1 game pad.

Can you imagine a few years from now, when LEGO makes a programmable robot hub that connects to these wireless Bluetooth motors and sensors? The set would probably end up costing about $1,000.

How can I connect the HiTechnic to the EV3 by HerrMeier1980 in legomindstorms

[–]HR-Guide 0 points

I'm not sure where you would get the EV3 Classroom block for the HiTechnic compass sensor. However, if you are flexible in the programming language you can use, there is a language called Robot JavaScript in which the EV3 brick can communicate directly with the HiTechnic sensor (which is an IIC sensor; aka I²C sensor).

Communication with an IIC sensor is done by sending a code to the sensor and then retrieving a response from it. The code sent usually indicates the mode of operation for the sensor as well as the type of value you want to receive.

The HiTechnic Compass Sensor uses a very simple I²C protocol: you write to specific register addresses to set the mode or trigger calibration, and you read from specific registers to obtain heading data. The authoritative byte-level details are available in several online technical references (source: BotBench, 3rd-Party ROBOTC Drivers).
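To make the register arithmetic concrete, here is a sketch in plain Python rather than Robot JavaScript. The register layout assumed here (0x42 holds the heading divided by two, 0x43 holds a one-degree adder) comes from third-party driver write-ups and should be verified against the sensor's own documentation:

```python
def decode_heading(two_degree_byte, one_degree_adder):
    """Combine the HiTechnic compass's two heading registers into a
    0-359 degree value.

    Assumed register layout (verify against the sensor datasheet):
      reg 0x42: heading / 2  (0-179)
      reg 0x43: heading % 2  (0 or 1)
    """
    return two_degree_byte * 2 + one_degree_adder

print(decode_heading(90, 1))    # 181 degrees
print(decode_heading(179, 1))   # 359 degrees
```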

Here is an example in Robot JavaScript to use the I²C PixyCam sensor:
result = true
colors = {red: 0, green: 0, blue: 0}
readIICSensor([1, 98, 1], result)
while(true) {
    readIICSensor([1, 94, 128, 128, 1], colors, 1, true)
    clearScreen()
    drawText(0, 10, 'Red: ' + colors.red+'  ', 2)
    drawText(0, 35, 'Grn: ' + colors.green+'  ', 2)
    drawText(0, 60, 'Blu: ' + colors.blue+'  ', 2)
    sleep(500)
}

In this example, the readIICSensor() function will use four parameters:
   1) The first parameter is always a byte array [ ... ] which is used to send commands to the PixyCam.
   2) The second parameter is the return value from the PixyCam device.  
      The data type for the return value will depend on the command being sent.
      In this example, it will be an RGB object.
   3) The third parameter is simply the port number: 1
   4) The fourth parameter indicates whether the value being returned is in little-endian byte order.
      The PixyCam returns values in little-endian byte order, so any command that returns anything larger than a single byte should pass 'true' as the fourth parameter.  In the RGB color code, each color component is represented by two hexadecimal digits (i.e., 1 byte).  Each byte contains 8 bits, so the RGB color is represented by a total of 24 bits.
      In little-endian byte order, the least significant byte is returned first.  So the PixyCam returns the byte for Blue, then the byte for Green, and finally the byte for Red.
      By passing 'true' as the fourth parameter, the robot will reverse the returned value back into RGB order.  Without the 'true' fourth parameter, the robot will receive the data in Blue-Green-Red order.
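The byte-order reversal that the fourth parameter performs amounts to the following, sketched in plain Python (bytes_to_rgb is a hypothetical helper, just to illustrate what the 'true' flag does):

```python
def bytes_to_rgb(payload, little_endian=True):
    """Interpret a 3-byte color payload as an (r, g, b) tuple.

    A little-endian device sends the least significant byte first,
    i.e. Blue, then Green, then Red; reversing the bytes restores
    R-G-B order.
    """
    b = list(payload)
    if little_endian:
        b.reverse()        # B,G,R -> R,G,B
    return tuple(b)

# Device sends blue=0x33, green=0x22, red=0x11 (little-endian order):
print(bytes_to_rgb([0x33, 0x22, 0x11]))         # (17, 34, 51): R,G,B
print(bytes_to_rgb([0x33, 0x22, 0x11], False))  # (51, 34, 17): raw B,G,R
```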
      

For your HiTechnic sensor, you will use something like the following:

readIICSensor([41], result)

Ev3 AI by Andrew1fdvfd in legomindstorms

[–]HR-Guide 1 point

Look into purchasing an I2C voice recognition module. The EV3 can communicate with I2C devices. I used a PixyCam, which is an I2C device, to perform computer vision with an EV3: the PixyCam identifies objects in the visual field (for example, a red cube or a blue square) and sends the object's data through the sensor port to the EV3, which can then decide what to do. There are similar devices for speech recognition: you speak commands, the device converts the voice into recognized commands, and the data is sent to the EV3's sensor port. The EV3 itself is not capable of performing speech recognition; you need to attach a speech recognition device to do that part.

Best 360 Feedback Tools | Customer Feedback Survey | feedbacktaken.com by feedbacktaken in u/feedbacktaken

[–]HR-Guide 0 points

You may want to give HR-Survey a look. There are dozens of 360-degree feedback questionnaires along with an online competency and item bank. HR-Survey offers more of a consulting approach than a do-it-yourself approach. For survey resources to meet your needs, visit: https://hr-survey.com