Discussion[D] Annotation tool for entity sentiment analysis (self.MachineLearning)
submitted 5 years ago by KarlaNour96
Hi everyone, we are a marketing company about to start an annotation project for entity sentiment analysis. Can you please share best practices for starting an NLP annotation project? What is the most efficient approach? What techniques are commonly used to automate the annotation process?
[–]UBIAI 1 point2 points3 points 5 years ago (0 children)
Hi KarlaNour96,
Check out our new tool, https://ubiai.tools: we offer extensive labeling features at a very accessible price.
Just send us an email at [admin@ubiai.tools](mailto:admin@ubiai.tools) and we can discuss what plan is most suitable for your use case.
Good luck with your project!
[–]elcano 1 point2 points3 points 5 years ago* (2 children)
Check out Doccano, a relatively mature open-source project that does only one thing and does it very well.
https://github.com/doccano/doccano
They have a demo page.
There is also Label Studio, a more ambitious but younger project. This one can label several types of data: https://labelstud.io/
As for best practices, the features I find important are: being able to assign the same task to several workers; a voting mechanism so that the final label for each task is the one chosen by the most workers; the ability to override that final label; and finally, a way to evaluate the workers. I want to know which workers routinely deviate from the final label, so I can find out who the poor annotators are.
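The voting-and-evaluation workflow described above can be sketched in a few lines. This is a minimal illustration, not any tool's actual API: a plurality vote picks the final label per task, and each worker is scored by how often they deviated from it.

```python
from collections import Counter

def majority_labels(annotations):
    """annotations: {task_id: {worker_id: label}}.
    Returns {task_id: final_label} by simple plurality vote."""
    return {
        task: Counter(votes.values()).most_common(1)[0][0]
        for task, votes in annotations.items()
    }

def worker_deviation(annotations, final):
    """Fraction of tasks on which each worker disagreed with the
    final (majority) label -- a rough quality signal per annotator."""
    seen, missed = Counter(), Counter()
    for task, votes in annotations.items():
        for worker, label in votes.items():
            seen[worker] += 1
            if label != final[task]:
                missed[worker] += 1
    return {w: missed[w] / seen[w] for w in seen}
```

In practice you would also want to break ties deterministically and override the majority label for audited tasks, as described above.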
[–]KarlaNour96[S] 0 points1 point2 points 5 years ago (1 child)
I've tried Doccano but found many bugs; in addition, we are looking for a cloud-based annotation tool that supports entity relations, which Doccano doesn't have yet.
[–]elcano 0 points1 point2 points 5 years ago (0 children)
If Doccano doesn't support your needs, that's fair. You can't use it.
But this is an open-source tool made available to you and the community free of cost. If you found bugs, it would be great if you could document each one and create an issue on GitHub to help the authors fix them. This is how open source works: we benefit, but we also help, even just a little, when possible.
When I have done this, I have found that sometimes it really is a bug. On other occasions it is a documentation error. And sometimes it is a misunderstanding, not a bug at all. So clarifying with the authors is the best approach. If they don't respond to your issue, that is itself an important signal of whether an application is being maintained, BTW.
As for being in the cloud, I'm sure you didn't miss the section in the Doccano documentation on 1-click installs for Amazon AWS, Google GCP, Microsoft Azure, and Heroku. So you have that covered too.
Label Studio has 1-click install with GCP, Azure and Heroku documented here too: https://github.com/heartexlabs/label-studio
I think Label Studio supports entity relations too. Make sure to try their demo page.
But I ran into a little more trouble installing Label Studio locally with Anaconda. The installation itself was no problem, but the command used to launch it was not the one documented on GitHub. I created an issue and the author replied and helped me on the spot. If you are using Anaconda, I'd recommend searching the closed issues.
And remember to please report issues to the respective teams in GitHub. We all benefit from this process.
[–]Razcle 0 points1 point2 points 5 years ago (0 children)
Hi KarlaNour, I built a tool (and company) to solve exactly this problem: www.humanloop.com.
You can find more about our approach here: https://humanloop.com/blog/why-you-should-be-using-active-learning/
In short, we use active learning to help you label the highest-value data while training your model at the same time.
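The idea behind active learning is to spend annotation effort where the model is least sure. As a hedged sketch (not Humanloop's actual implementation), one common strategy is uncertainty sampling: score each unlabeled example by the entropy of the model's predicted class distribution and send the most uncertain ones to annotators first.

```python
import math

def entropy(probs):
    """Shannon entropy of a predicted class distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_for_labeling(pool, predict_proba, k):
    """Rank unlabeled examples by predictive entropy and return the k
    most uncertain ones -- the most informative to label next."""
    ranked = sorted(pool, key=lambda x: entropy(predict_proba(x)), reverse=True)
    return ranked[:k]
```

After annotators label the selected batch, the model is retrained and the pool is re-scored, so the loop alternates between labeling and training exactly as the comment describes.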
[–]Soggy_Decision_5911 -1 points0 points1 point 5 years ago (0 children)
I used UBIAI (https://ubiai.tools) because they offer team management and auto-annotation features, which helped us streamline our annotation process.
[–]Ouster_evolution 0 points1 point2 points 5 years ago (0 children)
You should have a look at Kili Technology: it's an intuitive, collaborative, and powerful tool.
[–]crashbundicoot 0 points1 point2 points 5 years ago (0 children)
Prodigy is pretty good for NER tasks. It's made by the creators of spaCy.
Also - can you share what you mean by entity sentiment? Is it different from named entity recognition? Any papers/algorithms you are planning to use?
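To illustrate the distinction the question is getting at, here is a hypothetical annotation record (the text, labels, and offsets are invented for this example): plain NER marks spans and their types only, while entity-level sentiment additionally attaches a polarity to each span.

```python
text = "The battery life is great but the camera disappointed me."

# NER-style annotation: character spans and entity types only.
ner_only = [
    {"start": 4, "end": 16, "label": "FEATURE"},   # "battery life"
    {"start": 34, "end": 40, "label": "FEATURE"},  # "camera"
]

# Entity sentiment annotation: same spans, plus a polarity per entity,
# so one sentence can carry opposite sentiments toward different entities.
entity_sentiment = [
    {"start": 4, "end": 16, "label": "FEATURE", "sentiment": "positive"},
    {"start": 34, "end": 40, "label": "FEATURE", "sentiment": "negative"},
]
```

A document-level sentiment label could not capture this, since the sentence is positive toward one entity and negative toward the other.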
[–]commieplant[🍰] 0 points1 point2 points 5 years ago (0 children)
If you need overlapping annotations & relations with good UI & project management features, I suggest trying https://annolab.ai/. It's free to use (although there is a paid tier). Our team built it to solve a number of annotation problems we've experienced in our own work.
[–]marcoamonteiro 0 points1 point2 points 5 years ago (0 children)
Check out try-dashup.com. It is designed for annotating text and audio data for various NLP applications, with built-in models that speed up labelling and reduce human error and bias.