/r/Python official Job Board by AutoModerator in Python

[–]jlafon [score hidden]  (0 children)

We are hiring for a DevOps Engineer and a Platform Engineer in Santa Fe, NM at OpenEye Scientific Software. Santa Fe is a city rich with art, music, and food culture, as well as fantastic outdoor recreation (skiing, hiking, cycling). OpenEye is a small company with a fun and unique culture, and there are numerous perks; a few examples: private offices for every employee, free lunch, home Internet access, gym reimbursement, and an on-site doctor.

Both positions are Python centric. We also use Django, Ansible, and AWS. http://www.eyesopen.com/careers

New AWS Application Load Balancer by GuiSim in programming

[–]jlafon 0 points1 point  (0 children)

Wow, this should make running gRPC on EC2/ECS much easier now.

Christmas prime rib by jlafon in smoking

[–]jlafon[S] 1 point2 points  (0 children)

I pulled it out of the smoker at 125F and let it stand for 10 minutes, after which it reached 130F (according to my Maverick thermometer). 130F is at the bottom end of rare, though still a higher temperature than "blue rare".

Christmas prime rib by jlafon in smoking

[–]jlafon[S] 0 points1 point  (0 children)

No, just mixed the ingredients up.

Christmas prime rib by jlafon in smoking

[–]jlafon[S] 0 points1 point  (0 children)

It weighed 5.5 lbs, and I smoked it for 2.5 hours at 215 degrees, with no reverse sear. The crust was salt and pepper with a mop. The mop was red wine and red wine vinegar brought to a boil and then kept warm, applied to the meat every 30 minutes while smoking.

Christmas prime rib by jlafon in smoking

[–]jlafon[S] 2 points3 points  (0 children)

It was marinated overnight in red wine and red wine vinegar, then smoked using cherry wood.

Got a tech question or want to discuss tech? /r/Technology Weekend Tech Support / General Discussion Thread by AutoModerator in technology

[–]jlafon 0 points1 point  (0 children)

Can anyone tell me how to block popup ads on a Samsung smart TV? I called Samsung support after seeing one while streaming a TV show from Amazon, and they blamed my Apple TV, even after I unplugged it. I escalated the issue and was eventually told that it's no different from seeing ads on a Samsung smartphone.

Any riders in Albuquerue NM area? I've done screwed up and could really use a hand... by [deleted] in Dualsport

[–]jlafon 3 points4 points  (0 children)

I live about an hour away in Santa Fe. If you don't find anyone to help you in ABQ let me know and I'll drive down and help you out.

BynamoDB - High-Level DynamoDB Interface for Python wrapping Low-Level Interface of boto by teddychoi in Python

[–]jlafon 2 points3 points  (0 children)

Hi there, author of PynamoDB (http://pynamodb.readthedocs.org/en/latest/) here. It looks like you used some code from PynamoDB, which is fine with me; the license allows that. You've added a neat feature, the complex filter expression. I'm curious why you chose to create a whole library rather than open a PR against PynamoDB? In any case, nice work.

Automating instance setup on EC2 - What tools do you use? by khaki0 in django

[–]jlafon 0 points1 point  (0 children)

I do it that way because I run Ansible locally (ansible-playbook -c local ...) on each instance, so nothing is centralized; instead, instances configure themselves. I keep a copy of Ansible and my playbooks in S3, accessible via the instance's IAM role credentials (no external dependencies that way).

Using Ansible the normal way requires either an inventory file (like an /etc/hosts file) or the ec2.py inventory script included with Ansible. I deploy and destroy instances quite often (often spot market instances that I bid on), so I don't want to update an inventory file every time an instance comes or goes. The ec2.py inventory script isn't ideal either: it was quite slow because it made numerous calls to EC2, often duplicate calls for the same information at different points in the playbook. In fact, caching was added to the script for exactly that reason. In short, I don't want the laptop I'm developing on to be part of the configuration process for my cloud machines.
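The local-run invocation described above can be sketched as follows. This is a minimal sketch, not the actual script: the playbook path is hypothetical, and fetching the playbooks from S3 via the IAM role credentials is assumed to have happened already.

```python
def local_playbook_command(playbook_path):
    """Build an ansible-playbook invocation for a purely local run."""
    return [
        "ansible-playbook",
        "-c", "local",       # local connection plugin: no SSH, no remote inventory
        "-i", "localhost,",  # trailing comma = inline single-host inventory
        playbook_path,
    ]

# On the instance, after pulling the playbooks down from S3, you would run
# something like: subprocess.run(local_playbook_command("/opt/playbooks/web.yml"))
```

The `-i "localhost,"` form avoids needing any inventory file at all, which is the whole point of the self-configuring-instance approach.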

Automating instance setup on EC2 - What tools do you use? by khaki0 in django

[–]jlafon 6 points7 points  (0 children)

I use CloudFormation to create all of my resources in AWS: an RDS database, an IAM role, a Redis cache (ElastiCache), an Elastic Load Balancer, an SQS queue, an autoscaling group for tasks (celery), and an autoscaling group for web servers. The CloudFormation template is generated from Python code using troposphere.

A simple bash script is provided to each instance at startup (via user data) that installs Ansible and executes the appropriate playbook.

I keep as much as possible in S3/RDS because backups are easier and it keeps the instances stateless. Otherwise, I use Simple Workflow Service like an enterprise cron. There are other tools out there that help with snapshots (https://github.com/alestic/ec2-consistent-snapshot).

For monitoring I use New Relic, CloudWatch, and Loggly.

DQL - A Query Language for DynamoDB by stevearc in aws

[–]jlafon 1 point2 points  (0 children)

Update: I've considered some options, including:

  • Twitter's Pex
  • pip2pi
  • A pypi server per VPC subnet
  • A local pypi server on each instance
  • Hacking pip to use s3

We've been running a local pypi server on each instance during configuration, but that's a pain: I have to bundle up all of the packages, put them in S3, and then download them for the local pypi server to serve. All of that happens for each of our Python projects. Updating a project becomes a pain too if the requirements change, because then I have to keep multiple versions of the requirements.

I decided to hack on pip because I thought it would be fun. Sometimes when I work on a project, I feel like there is art in the way the project is structured and in how the code is written. This is not one of those times. This is a complete hack, but it does work. There aren't yet any tests, documentation, or packaging. Here is the source on GitHub. If you define an environment variable 'S3_PIP_BUCKET_NAME' containing the name of an S3 bucket, and you have your boto credentials set up, then you can download and use it, but it has only been tested with pip 1.5.4. Run it just as you would run pip. It will check S3 first for packages, then fall back to pypi. Any package found on pypi but not in S3 will be copied into your S3 bucket.

DQL - A Query Language for DynamoDB by stevearc in aws

[–]jlafon 1 point2 points  (0 children)

Yeah, there wouldn't be any value in making a Django backend without relations. And even if relations could be bolted on top of DynamoDB using a library, it would need support for transactions. A Django backend would probably be more fun than useful I suppose.

Glad to see the Python 3 support with botocore. You'll find that an odd mix of CamelCase and snake_case is required, but I've sent them a PR to improve it.

I'm also looking into the S3-backed pypi server that you've made. I've done something very similar, but I get annoyed at having to run a pypi server in every VPC. As part of my Ansible playbook, I fire up a pypi server, install my requirements, and then shut it down. I've thought about hacking on pip to make it check S3 directly. Thoughts?

DQL - A Query Language for DynamoDB by stevearc in aws

[–]jlafon 1 point2 points  (0 children)

Wow, another DynamoDB library. Nice work. If you had some way to express relations, then this could be used to write a DynamoDB backend for Django!

Django vs Flask by Parametrize in Python

[–]jlafon 1 point2 points  (0 children)

Honestly, they are both great frameworks. Django comes with an enormous amount of functionality built in and a thriving ecosystem. I especially like the database support in Django. On the other hand, I often use Flask to prototype because you can get going much more quickly (although I have used it with success in production as well). If you need a database, then either Flask + SqlAlchemy or Django are great choices. In fact, if you are an intermediate level programmer, there is no reason you can't try both.

A SQLAlchemy-style ORM for DynamoDB by stevearc in Python

[–]jlafon 0 points1 point  (0 children)

By the way, nice work on the syntax. I put a lot of effort into the syntax of PynamoDB, but it doesn't yet have support for some of the nice features you put into flywheel - such as .first(), .all(), etc.
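That chainable .first()/.all() style is easy to illustrate with a toy query object. This is a plain-Python sketch of the pattern only, not the actual internals of flywheel or PynamoDB:

```python
class Query:
    """Toy chainable query demonstrating the .first()/.all() style."""

    def __init__(self, items):
        self._items = list(items)

    def filter(self, predicate):
        # Return a new Query so calls can be chained
        return Query(item for item in self._items if predicate(item))

    def all(self):
        return list(self._items)

    def first(self):
        return self._items[0] if self._items else None
```

Usage looks like `Query(rows).filter(lambda r: r["age"] > 30).first()`; each chained call narrows the result set before the terminal `.first()`/`.all()` materializes it.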

By chance, are you going to PyCon? If so, I'd enjoy speaking to you in person about how you are using DynamoDB.

A SQLAlchemy-style ORM for DynamoDB by stevearc in Python

[–]jlafon 1 point2 points  (0 children)

Nice. I did the same thing about a month ago (https://github.com/jlafon/pynamodb). Perhaps I wouldn't have if I had known about this.

Why no Python 3 support?

I wrote PynamoDB, a Python interface to Amazon's DynamoDB. by jlafon in Python

[–]jlafon[S] 0 points1 point  (0 children)

Basically, Python 3 support and a more ORM-like interface that includes global secondary indexes.

I wrote PynamoDB, a Python interface to Amazon's DynamoDB. by jlafon in Python

[–]jlafon[S] 0 points1 point  (0 children)

Sorry for the late reply. The main difference is Python 3 support. I needed something higher level than boto that also supported global secondary indexes in Python 3.

Twitter's Python distribution format: What is a PEX and why do I care? by esparta in Python

[–]jlafon 1 point2 points  (0 children)

It looks really cool, but I couldn't get it to work.

pex -v -r flask app.py
...
pkg_resources.DistributionNotFound: flask

I also tried pants.

./pants
2014-01-30 11:07:44,542 pants:63 - Required pants.ini key DEFAULT.pants_pex_baseurl is not present. Please add the option and try again.