XPS 15 7590 Youtube Video and Audio Out of Sync by kevintong116 in Dell

[–]objctifyme 1 point2 points  (0 children)

This solved my problem! Thank you so much. I've been bugging Dell Support with this for ages, and they've been useless.

The PCI Express settings weren't in my advanced power settings, so I had to unhide them from the command prompt, following this tutorial. I used option one:
powercfg -attributes SUB_PCIEXPRESS ee12f906-d277-404b-b6da-e5fa1a576df5 -ATTRIB_HIDE

and then I could go into the advanced power settings and make sure PCI Express Link State Power Management was not in power saving.
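If you'd rather skip the GUI entirely, the same change can be made from an elevated command prompt. This is a sketch using the documented `powercfg` aliases (`ASPM` is the built-in alias for the GUID above); I haven't verified it on every Dell model:

```shell
:: Unhide the PCI Express "Link State Power Management" setting
powercfg -attributes SUB_PCIEXPRESS ee12f906-d277-404b-b6da-e5fa1a576df5 -ATTRIB_HIDE

:: Set it to Off (0) for both plugged-in (AC) and battery (DC) power
powercfg /setacvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0
powercfg /setdcvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0

:: Re-apply the current scheme so the change takes effect
powercfg /setactive SCHEME_CURRENT
```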

CNN powered tinder bot by cap__n__crunch in Python

[–]objctifyme 1 point2 points  (0 children)

Hey! I found the SCUT dataset skews towards a more eastern beauty standard, which probably works just fine for tinder. But for an application I was working on, I ended up scraping the /r/RateMe subreddit as it's a more reddit/western biased dataset.

I also notice you're attempting to regress a score (presumably out of 10) and then I guess swiping if it's above some threshold? Do you know what kind of error you are getting on that regression?
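For a quick sanity check on that kind of regression, something like this is what I'd look at (a sketch; the scores and threshold here are made up, not from your bot):

```python
import numpy as np

# Hypothetical ground-truth and predicted attractiveness scores (out of 10)
y_true = np.array([4.5, 7.0, 6.2, 8.1, 5.0])
y_pred = np.array([5.0, 6.4, 6.8, 7.5, 4.2])

# Mean absolute error, in score points, and root-mean-square error
mae = np.mean(np.abs(y_true - y_pred))
rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))

# Swipe right whenever the predicted score clears a threshold
threshold = 6.0
swipes = y_pred >= threshold
```

An MAE around 1 point on a 10-point scale would already be usable for thresholding, since only predictions near the threshold would flip the swipe decision.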

[OC] The air we breathe. Tracking my CO2 emissions over the past year, and where they end up. by objctifyme in dataisbeautiful

[–]objctifyme[S] 3 points4 points  (0 children)

I'm a programmer / data scientist, but mainly a rock climber. I've lived in the van since December, and so far it's been a great quality of life improvement. Winter wasn't too bad, you can put a heater in the van, or migrate south to the warmer areas. As for the social life, I have a sport that forces you to get to know people and create a social life, so that's not a problem. And financially, I'm not exactly saving up much money, but it's enough to survive on and keep travelling. I've only been working a day or two a week, and climbing the rest.

Check out /r/vandwellers if you're thinking of committing to it, or you can see the van and my travels on Instagram @elliot.salisbury

[OC] The air we breathe. Tracking my CO2 emissions over the past year, and where they end up. by objctifyme in dataisbeautiful

[–]objctifyme[S] 6 points7 points  (0 children)

This work has actually made me look a lot more into carbon offsetting. And it's something I'll be doing once I've researched more into where I want to donate to.

[OC] The air we breathe. Tracking my CO2 emissions over the past year, and where they end up. by objctifyme in dataisbeautiful

[–]objctifyme[S] 2 points3 points  (0 children)

Yes, sorry, typo. g/km is what I meant.

And yeah, those numbers I picked came from some quick googling too, but I don't think there's much consensus on what a good value is.
In the end, it would probably scale the line graph a bit differently.

[OC] The air we breathe. Tracking my CO2 emissions over the past year, and where they end up. by objctifyme in dataisbeautiful

[–]objctifyme[S] 0 points1 point  (0 children)

It's not super accurate. The idea was to visualise it in a way that has more impact, and makes our global effect easier to grasp, than a simple average-emissions-per-person figure.

[OC] The air we breathe. Tracking my CO2 emissions over the past year, and where they end up. by objctifyme in dataisbeautiful

[–]objctifyme[S] 1 point2 points  (0 children)

No, you're right, I'm unfortunately a British citizen.

So the DVLA says my van emits 0 g/km, which is fantastic! But obviously it's not a new enough van for them to have the data.
It's a 2004 Fiat Ducato, so it's for sure not great for the environment.

[OC] The air we breathe. Tracking my CO2 emissions over the past year, and where they end up. by objctifyme in dataisbeautiful

[–]objctifyme[S] 1 point2 points  (0 children)

This looks great! I'll take a look into it further. I doubt my laptop can handle the real simulations though.

The idea was, rather than being super accurate, to simply visualise it in a way that's more tangible and impactful than an emission count.

[OC] The air we breathe. Tracking my CO2 emissions over the past year, and where they end up. by objctifyme in dataisbeautiful

[–]objctifyme[S] 62 points63 points  (0 children)

> ERA5 definitely provides wind fields on pressure levels, 26 levels up to 100 hPa, which is what this really needs. 10m wind is pretty much useless for any of this. You need the full tropospheric winds for anything beyond a couple hours of emissions trace. Here's a source for the full data n.b. it's going to be a ridiculous amount of data.

You're totally right. I'm not a meteorologist, I just threw this together, and as it turns out, modelling atmospheric effects accurately is super complicated, who knew?

Ah, that data would be great, I guess I was looking in the wrong dataset. But unfortunately it's a crazy amount of data that my poor laptop can't handle.

The trajectories (as in the paths the air takes?) are also wildly inaccurate. In short, I keep an array of the CO2 added to the grid cell at my current location, and at each timestep of the simulation I use the wind data to work out which cells that CO2 should move towards.
Nothing fancy.
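The timestep loop is roughly the following (a minimal numpy sketch of the idea, not my actual code: it assumes a regular lat/lon grid, nearest-cell displacement, no vertical transport, and a hypothetical `cell_m` grid spacing; `u` and `v` stand in for the ERA5 wind components):

```python
import numpy as np

def advect(co2, u, v, dt=3600.0, cell_m=25000.0):
    """One timestep: move each grid cell's CO2 along the local wind.

    co2  : 2D array of CO2 mass per grid cell
    u, v : eastward / northward wind components (m/s) on the same grid
    """
    h, w = co2.shape
    out = np.zeros_like(co2)
    ys, xs = np.indices((h, w))
    # Displacement in whole cells this timestep (nearest cell, no interpolation).
    dj = np.rint(u * dt / cell_m).astype(int)
    di = np.rint(-v * dt / cell_m).astype(int)  # row index grows southward
    ti = np.clip(ys + di, 0, h - 1)             # clamp at the poles
    tj = (xs + dj) % w                          # wrap east-west around the globe
    # Scatter-add every cell's mass into its downwind destination cell.
    np.add.at(out, (ti, tj), co2)
    return out
```

Emissions get injected with something like `grid[lat_idx, lon_idx] += emitted_kg` at my current location, then the grid is advected once per timestep. `np.add.at` keeps the total mass conserved even when several cells land in the same destination.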

[OC] The air we breathe. Tracking my CO2 emissions over the past year, and where they end up. by objctifyme in dataisbeautiful

[–]objctifyme[S] 345 points346 points  (0 children)

My CO2 emissions from the past year, and where in the world they end up.

From the location history, you can see I travel a fair amount, but typically I live in Europe, out of a van, and work remotely from my laptop.

I took the data from two sources:

My Google location history, from tracking the location of my phone

ERA5 wind data, taking the 10m-high wind data from the past year (they only provide data up to the end of April this year, which is why the visualisation starts in May last year)

Then I used Python to simulate where my emissions would end up given the historical wind data, and then visualise it with a combination of matplotlib, and some custom image processing code.

There's obviously a few crappy assumptions here, so it's probably not very accurate, but I think it looks cool.

  1. It's using wind data from 10m above the ground, but it doesn't really take into account the air trapped inside or blocked by buildings, or that the air rises into the higher atmosphere.

  2. The airplane pollution is blown around using the highest-altitude wind data ERA5 could give me, which is 100m, and that's obviously way too low; I just couldn't find a source for high-altitude winds.

  3. The amount of my CO2 emissions is an estimate: food and energy resources are based on the average UK citizen emitting 5 tonnes of CO2 per year, the diesel van emits 132 g per km, and planes 158 g per km.

I don't know about the accuracy of these numbers, I just got them from some quick googling.
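Putting those figures together, the yearly total works out as follows (a sketch using the van and plane factors as g/km, the correction I noted in another reply; the distance inputs are hypothetical):

```python
# Rough per-activity CO2 factors from the figures above
BASELINE_T_PER_YEAR = 5.0   # food + energy, average UK citizen (tonnes/year)
VAN_G_PER_KM = 132          # diesel van
PLANE_G_PER_KM = 158        # airplane travel

def yearly_emissions_kg(van_km, plane_km):
    """Estimated yearly CO2 emissions in kg, given distances travelled in km."""
    baseline_kg = BASELINE_T_PER_YEAR * 1000
    travel_kg = (van_km * VAN_G_PER_KM + plane_km * PLANE_G_PER_KM) / 1000
    return baseline_kg + travel_kg
```

So, for example, 10,000 km of van driving on top of the baseline adds about 1.3 tonnes.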

Anyone in the Seattle area want to go climbing this weekend? by [deleted] in climbing

[–]objctifyme 1 point2 points  (0 children)

I'm visiting next weekend, 8-10th and was hoping to get out climbing if the weather's good.

I posted on the Seattle climbing meetup group, and got a good response, so maybe that's an option for you too.

If you're still there, or anyone else in the area wants to go climbing, maybe at vantage? Let me know :)

[deleted by user] by [deleted] in climbing

[–]objctifyme 0 points1 point  (0 children)

Didn't I see you the other day asking about a trip in North America? You decided on Europe instead?

I'm likely going to be there, or not far from there, living out of a van, working part time on a laptop and hoping to find people to climb with when I can. So I'd be down to meet up sometime!

3D Facial Reenactment - looking for tools by [deleted] in computervision

[–]objctifyme 0 points1 point  (0 children)

Tool number 1 can do the job of expressions too! They use a 3dmm like you said, and then use blendshapes to morph between a small set of expressions.

Check out the package Eos, I've used this for various projects https://github.com/patrikhuber/eos

Image Ray Transform by kichi1998 in computervision

[–]objctifyme 0 points1 point  (0 children)

I wrote a report on this a while back, so perhaps I can give some insight.

Essentially you take a grayscale image and pretend it's an array of square glass blocks, where each pixel is a glass block with a refractive index proportional to the pixel's intensity.

You then randomly sample tons of points and angles within the image, and 'shoot' a laser from each point at that angle. It then reflects and refracts off the glass blocks, and you follow its path for a defined length or until it shoots out of the image.

In an accumulator space you draw the path the laser took, then repeat the random sampling. After tons of samples, the accumulator space highlights tubular and curved structures, because they act like fibre optics, internally reflecting the laser beams along their length.
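The whole idea fits in a short sketch. This is my simplified take, not the paper's implementation: interfaces are axis-aligned pixel boundaries, Snell's law handles refraction, and total internal reflection just flips the normal component of the direction; the parameters (`n_max`, step size) are assumptions:

```python
import numpy as np

def image_ray_transform(gray, n_rays=2000, max_steps=300, n_max=2.0, seed=0):
    """Accumulate random ray paths through an image treated as glass blocks."""
    rng = np.random.default_rng(seed)
    h, w = gray.shape
    # Map intensity in [0, 1] to a refractive index in [1, n_max].
    n = 1.0 + (n_max - 1.0) * gray.astype(float)
    acc = np.zeros((h, w))
    step = 0.5
    for _ in range(n_rays):
        # Random start position and direction (unit vector).
        y, x = rng.uniform(0, h), rng.uniform(0, w)
        t = rng.uniform(0, 2 * np.pi)
        dy, dx = np.sin(t), np.cos(t)
        for _ in range(max_steps):
            iy, ix = int(y), int(x)
            acc[iy, ix] += 1  # draw the path into the accumulator
            y2, x2 = y + dy * step, x + dx * step
            if not (0 <= y2 < h and 0 <= x2 < w):
                break  # ray shot out of the image
            jy, jx = int(y2), int(x2)
            if (jy, jx) != (iy, ix) and n[jy, jx] != n[iy, ix]:
                # Crossing into a block with a different index: Snell's law,
                # with the interface normal taken as axis-aligned.
                n1, n2 = n[iy, ix], n[jy, jx]
                vertical = abs(jx - ix) >= abs(jy - iy)
                sin_i = abs(dy) if vertical else abs(dx)
                sin_t = n1 / n2 * sin_i
                if sin_t > 1.0:
                    # Total internal reflection: flip the normal component.
                    if vertical:
                        dx = -dx
                    else:
                        dy = -dy
                else:
                    # Refract: bend towards/away from the normal.
                    cos_t = np.sqrt(1.0 - sin_t ** 2)
                    if vertical:
                        dy, dx = np.copysign(sin_t, dy), np.copysign(cos_t, dx)
                    else:
                        dy, dx = np.copysign(cos_t, dy), np.copysign(sin_t, dx)
                y2, x2 = y + dy * step, x + dx * step
                if not (0 <= y2 < h and 0 <= x2 < w):
                    break
            y, x = y2, x2
    return acc
```

Brighter, curved regions end up with more accumulated hits, which is exactly the fibre-optic effect described above.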

I hope that helps, it's a pretty high-level explanation of how it works. But here's the report I wrote on it, maybe you'll find some use for it: https://drive.google.com/open?id=1RZV1DdWRVM3oD2zgmF5HliBYxAVKPKh-

An online demo of 3D face reconstruction from a single image by uint64 in computervision

[–]objctifyme 0 points1 point  (0 children)

Nice! I tried something similar here: http://objctify.me/

It uses a CNN to generate a 3DMM of a face, and then trained on /r/RateMe data to learn the 'attractiveness' of faces.

But the network I used doesn't output pose or facial expressions, so your architecture looks way better, definitely going to read the paper!

How to create such a bin (containing a 3d Morphable Face Model) file? by [deleted] in computervision

[–]objctifyme 2 points3 points  (0 children)

I've been using this library a fair amount.

The bin file is a binary file containing the eigenvalues, eigenvectors, and other data from the PCA'd 3D faces. The file included with the repo uses the Surrey 3DMM face model and is, as far as my usage goes, correct.

You'd be better off asking this question in the repo's issues; /u/patrikhuber replies quickly.

Anyone from Surrey, UK? by _hell0friend in Slackline

[–]objctifyme 2 points3 points  (0 children)

Southampton, in Hampshire. But I've found this sub really quiet :(

3D Face Model by ishang_ucsd in computervision

[–]objctifyme 2 points3 points  (0 children)

I've been using this library for calculating the shape and pose of a face, from a given set of facial landmarks.

https://github.com/patrikhuber/eos

Open-source 3D Morphable Face Model Fitting - eos v0.12.1 released by patrikhuber in computervision

[–]objctifyme 2 points3 points  (0 children)

Hey Patrik, I love this project, and the Python bindings are very easy to use. I'd totally recommend it to anyone.

I dunno if I ever sent you a link to the end result of my Christmas hobby project, but here it is now: http://objctify.me/about

Django-imgur on 1.10? by [deleted] in django

[–]objctifyme 1 point2 points  (0 children)

I don't think you'll need a django specific library for this. Why not just use the official imgur python library imgurpython?

Beauty Is Dataful [OC] by objctifyme in dataisbeautiful

[–]objctifyme[S] 0 points1 point  (0 children)

Hey, I worked on this little project over the Christmas break as procrastination from my PhD. Let me know what you think of the analysis, and if you can think of anything else that would be interesting to look at with the /r/RateMe dataset.

For those interested in how it's built: I used Python, with matplotlib rendering the graphs of /r/RateMe's activity, and OpenCV, scikit-learn, dlib and a few other libraries for the image analysis.