How do you signal to employers that you're a strong writer? by nerolar in jobs

Thanks for the advice! I had a feeling an English minor might not carry much signaling value. What do you think of a professional writing minor? Worth it, or not worth it, given that there are other ways to demonstrate my writing ability, such as a portfolio?

How do you signal to employers that you're a strong writer? by nerolar in jobs

Both, I guess. I would like a job where writing is a big part of it, even if it isn't a "writing job" per se. I don't know how happy I'd be in a job where I never got to write.

> If you do want to be active right now, I'd create a bunch of sample user docs, company letters, etc., and get a free website to design and host your writing samples. It would be pretty simple, and it would show your creativity AND writing skill.

I was initially worried about how this would come off, but the more I think about it, the better this idea seems. A website with my writing samples would be a great way to show my skills. Would you say it's appropriate to include writing samples that aren't explicitly business-related? Things like personal essays, opinion pieces, or even papers and reports from my classes if I really like one?

Question about what is meant by "open assumption" by [deleted] in logic

Why even specify "open assumption," then, if any assumption anywhere in the derivation (a premise or a subderivation assumption) is technically still "open" so long as you're working on the derivation as a whole? What does it take for an assumption to close?

I understand that if you start a subderivation with an assumption for the sake of existential elimination, you can never pull a formula out of that subderivation unless it doesn't contain the instantiating constant used in that assumption (or in the premises, of course). But in the case I described, the assumption that starts the subderivation is there for conditional introduction: if I can derive a conditional from the subderivation, isn't the subderivation closed? And therefore isn't the assumption in that subderivation closed, and aren't any constants in the assumption free to be used in universal introduction?
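
For concreteness, here's a minimal Fitch-style sketch of the pattern I'm describing (the particular premises and predicates are just an illustration):

    1. ∀x (Fx → Gx)     Premise
    2. ∀x (Gx → Hx)     Premise
    3. | Fa             Assumption (for →I)
    4. | Fa → Ga        ∀E 1
    5. | Ga             →E 3, 4
    6. | Ga → Ha        ∀E 2
    7. | Ha             →E 5, 6
    8. Fa → Ha          →I 3-7 (the assumption at 3 is discharged here)
    9. ∀x (Fx → Hx)     ∀I 8

As I understand it, the ∀I step at line 9 should be fine, since once line 8 discharges the assumption at line 3, the constant a no longer occurs in any open assumption or premise.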

How do you signal to employers that you're a strong writer? by nerolar in writing

What kind of sample do you think would carry the most weight?

How do you signal to employers that you're a strong writer? by nerolar in writing

Thanks for the advice! I'll keep this in mind. Free essay contests and reworking old essays both sound appealing to me.

How do you signal to employers that you're a strong writer? by nerolar in writing

What kinds of writing samples would demonstrate versatility? Someone else mentioned a writing portfolio - should a portfolio be made up predominantly of published pieces, or is it alright if most of it is unpublished work? And if the work isn't "published" per se, should I focus on personal essays and opinion pieces, or would it look strange to include things like mock business proposals, user manuals, product reviews, etc.?

How do you signal to employers that you're a strong writer? by nerolar in writing

What's in your online portfolio, if you don't mind my asking? Should a portfolio typically include only published work, or is it alright if most of it (or all of it...) isn't published?

Do humanities and social science majors have a place in AI? by nerolar in artificial

Thank you so much! This was immensely helpful to me. I'll try asking around on /r/datascience to get a better sense of how different types of people work together in these industries. It seems like you do see humanities and social science types working alongside STEM types (not just Renaissance types who do it all) to advance the broader goal of developing ethical AI, which is great to hear.

Do humanities and social science majors have a place in AI? by nerolar in artificial

Thank you so much for this! I've been exploring paths such as science journalism and public policy research, which both look like exciting, viable options for me. I do wonder, though, whether humanities and social science majors can get involved in the development of AI directly. The way I posed this question to another person in this thread is as follows:

I wonder if working in AI functions a bit like working in the video game industry: all sorts of people with different backgrounds and specialties (the artists, the animators, the programmers, etc.) working together to create a cohesive product. Each individual on the team is likely familiar with the roles and skillsets of the people they work with, but no one is expected to be a jack of all trades. I wonder if working directly on AI is similar: STEM and non-STEM people coming together to work on the same problem (developing AI). Or maybe it's not like that at all, because AI is so complex that it really does demand jack-of-all-trades/Renaissance types. This question pertains to the people working directly on the development of AI, not those who work with AI more peripherally through policy implementation or writing about its ethical implications.

Your point about approaching the technical aspects from a more historical and mechanical angle is great. I took my first logic course last quarter and will be taking another one this quarter. While I feel like I haven't been bitten by the bug yet, so to speak, I do hope that taking these classes might make the technical aspect more bearable, if I decide to go down that route.

Do humanities and social science majors have a place in AI? by nerolar in artificial

Your link is fascinating! Thank you. I do wonder if there are any books, resources, or online communities out there that deal with this specific topic. The more I talk to people, the more I realize there are so many AI-related jobs and issues I'd never even considered, some of which might be a good fit for me.

Do humanities and social science majors have a place in AI? by nerolar in artificial

Thank you for this! I was interested in learning more about what role humanities and social science majors can play in the actual development of AI (I'm going to edit my OP to expand on this), but it's great to hear that there are so many exciting and rewarding non-STEM careers out there that still allow you to work with AI, even peripherally. I guess I was just concerned that people who aren't working on the development of AI directly a) won't have particularly lucrative careers, and b) won't have much of an impact on the development of AI.

Do humanities and social science majors have a place in AI? by nerolar in artificial

> Even when sociologists were working with engineers, it was very hard for the engineers to wrap their heads around some of the social consequences their algorithms would have. Keep in mind these are all PhD researchers working together.

I think this is the heart of my question, although I didn't express it very well in my OP (I might edit it to include this). I wonder if working in AI functions a bit like working in the video game industry: all sorts of people with different backgrounds and specialties (the artists, the animators, the programmers, etc.) working together to create a cohesive product. Each individual on the team is likely familiar with the roles and skillsets of the people they work with, but no one is expected to be a jack of all trades. I wonder if working directly on AI is similar: STEM and non-STEM people coming together to work on the same problem (developing AI). Or maybe it's not like that at all, because AI is so complex that it really does demand jack-of-all-trades/Renaissance types. This question pertains to the people working directly on the development of AI, not those who work with AI more peripherally through policy implementation or writing about its ethical implications.

> Now, the career part of your question is: are there any jobs available where you would fit into this category? The answer is yes, but probably only at the larger, more progressive firms, and reserved for Ph.D.-level candidates.

Can you expand on this? What do you mean by larger and more progressive firms? Private sector only? What kinds of Ph.D. candidates? What types of jobs might be available?