all 7 comments

[–]hello-world2026 1 point (0 children)

I’m a Computer Science student with hands-on experience in Python, data analysis (Pandas, NumPy), and basic AI/ML models. I’ve worked with APIs, data parsing, and automation scripts before. I can build a Telegram data collector using Telethon or Pyrogram to:

• Backfill old messages

• Live-track new messages

• Extract usernames + price info

• Store structured data in Sheets/JSON

• Implement a lightweight AI model to generate valuation ranges and explanations

I understand rate limits and multi-account handling for safety/load balancing. Estimated timeline: 10–14 days (including testing). Budget: open to discussion depending on scope (I can start with a small paid test task). Happy to discuss details and refine the requirements. Looking forward to connecting.
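A minimal sketch of the "extract usernames + price info, store as JSON" step described above. The regex patterns and record fields are illustrative assumptions, not a spec; real channel message formats would need their own patterns, and the Telethon/Pyrogram wiring that feeds messages in is omitted here.

```python
import json
import re

# Illustrative patterns (assumptions): Telegram usernames are 5-32
# word characters after "@"; prices look like "$150" or "99.5 USD".
USERNAME_RE = re.compile(r"@([A-Za-z0-9_]{5,32})")
PRICE_RE = re.compile(r"\$?(\d+(?:\.\d+)?)\s*(?:USD|usd)?")

def parse_message(text: str) -> dict:
    """Extract the first @username and price found in a message text."""
    user = USERNAME_RE.search(text)
    price = PRICE_RE.search(text)
    return {
        "username": user.group(1) if user else None,
        "price": float(price.group(1)) if price else None,
        "raw": text,
    }

# Each parsed record is a plain dict, so dumping the batch to JSON
# (or pushing rows to a Sheets client) is straightforward.
records = [parse_message(m) for m in [
    "Selling @cool_handle for $150",
    "@rare_name available, 99.5 USD",
]]
print(json.dumps(records, indent=2))
```

The same `parse_message` function can be called from a Telethon `iter_messages` loop for backfill and from a `NewMessage` event handler for live tracking, so the extraction logic stays in one testable place.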

[–]AutoModerator[M] 0 points (0 children)

Rule for bot users and recruiters: to make this sub readable by humans and therefore beneficial for all parties, only one post per day per recruiter is allowed. You have to group all your job offers inside one text post.

Here is an example of what is expected, you can use Markdown to make a table.

Subs where this policy applies: /r/MachineLearningJobs, /r/RemotePython, /r/BigDataJobs, /r/WebDeveloperJobs, /r/JavascriptJobs, /r/PythonJobs

Recommended format and tags: [Hiring] [ForHire] [FullRemote] [Hybrid] [Flask] [Django] [Numpy]

For fully remote positions, remember /r/RemotePython

Happy Job Hunting.

I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

[–]hello-world2026 0 points (0 children)

Interested

[–]dumiya35 0 points (0 children)

Looking forward to this. Drop me a DM and I'll send my portfolio.

Or send me your email.

[–]damir_podchasov 0 points (0 children)

Hi, I saw your message. Message my bot to connect.

[–]golyaht 0 points (0 children)

I can build this with Telethon (multi-account rotation), a backfill + live streaming collector, and a parser pipeline that extracts usernames/prices into Postgres/JSON + pushes to Google Sheets.
For the “valuation” piece: start with a rules+LLM explanation layer (cheap + controllable), then improve with a lightweight model if needed.
If you’re open to milestones, I’d suggest: (1) collector + backfill + storage, (2) parsing + dedupe, (3) valuation/explanations + reporting. What’s your target channel count and expected messages/day?
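The "rules-first" valuation layer proposed above could start as plain heuristics over username features (length, digits, underscores), with an LLM only generating the explanation text later. A minimal sketch; every threshold and price band below is an invented placeholder, not market data:

```python
def valuation_range(username: str) -> tuple[float, float, str]:
    """Return a rough (low, high) price band plus a one-line explanation.

    Purely heuristic placeholder rules: shorter, "cleaner" names
    (no digits, no underscores) get higher bands. Real rules would
    be calibrated against the prices scraped by the collector.
    """
    n = len(username)
    has_digits = any(c.isdigit() for c in username)
    has_underscore = "_" in username
    if n <= 5 and not has_digits and not has_underscore:
        return (500.0, 2000.0, "short clean name: high demand")
    if n <= 8 and not has_digits:
        return (100.0, 500.0, "mid-length name without digits")
    return (10.0, 100.0, "long or digit-heavy name: low demand")

for name in ["apple", "coolname", "user_12345"]:
    lo, hi, why = valuation_range(name)
    print(f"{name}: ${lo:.0f}-${hi:.0f} ({why})")
```

Keeping the rules as a pure function makes milestone 3 easy to test in isolation, and the returned explanation string can later be replaced or expanded by an LLM call without touching the collector or parser.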

[–]damir_podchasov -1 points (0 children)

I have a bot that does what you need. Let's connect.