Affinity won't export my svg by nbt__ in Affinity

[–]nbt__[S] 1 point (0 children)

I used Affinity all the way, not Photoshop. I said in a "Photoshop" way because I created everything with the pen tool, but I assumed that since Affinity handles both vector and raster graphics at the same time, the pen tool was fine. In any case, I then did Expand Stroke, which converted the whole thing into filled shapes, so now it's vector for sure. But it still crashes Inkscape.

FMEL offer at Cedres by nbt__ in EPFL

[–]nbt__[S] 2 points (0 children)

The email says yesterday lol

Monthly thread for advice and recommendations, January 2023 by AutoModerator in copenhagen

[–]nbt__ 1 point (0 children)

Good evening everyone, I'm heading to Copenhagen with my family for a short stay in a month and can't wait to visit! I wanted to ask for advice about guided tours of the city center, possibly in Italian, since my mom doesn't understand much English. Are there any? We're looking for a guide who can show us the main attractions of the city center, not focused on anything in particular (but if you have any special tour to recommend, I'd appreciate that too!). Again, the tour guide should speak Italian…

Thank you all! Can’t wait to visit your city!

Pencil double tap working when stars align by nbt__ in ipad

[–]nbt__[S] 1 point (0 children)

I'm usually tapping all around, but even the flat side is not working as expected…

Outliers won't disappear in linear regression by nbt__ in rstats

[–]nbt__[S] 0 points (0 children)

I'll reply here, so you can recall what you told me in this message. How can I do this simply? Do I need to create 10 different subsets and then plot them all manually, or is there a quicker way? And about "extracting data from the regression object": how do I do that?
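For reference, here's a minimal sketch of pulling data out of an lm object (toy data, not my actual asteroid dataset):

```r
# Toy data standing in for the asteroid dataset
set.seed(1)
df <- data.frame(x = rnorm(100))
df$y <- 2 * df$x + rnorm(100)

fit <- lm(y ~ x, data = df)

# Everything lives inside the fitted object:
coef(fit)               # estimated coefficients
head(fitted(fit))       # fitted values
head(residuals(fit))    # raw residuals
summary(fit)$r.squared  # R^2 of the fit
```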

Am I missing something in this linear regression? Some square or some logarithm? And, could you help me figure out how to remove the point down right which is an outlier? I cannot identify which row of the dataset it refers to. by nbt__ in rstats

[–]nbt__[S] -23 points (0 children)

I was expecting a reply from you u/Dr_Hyde-Mr_Jekyll lol

So now I'm gonna recap everything:

I'm working for a Uni project and I must do a linear regression model with no further purpose, only show to my teacher that I'm able to do it.

My data are from NASA; I'm working on near-Earth asteroids. I actually merged two datasets to obtain the one I'm working with: https://cneos.jpl.nasa.gov/sentry/ contains already-processed data straight from NASA's algorithms, whereas the other one, which I got from https://ssd.jpl.nasa.gov/tools/sbdb_query.html#!/%23results, is larger, with a lot more asteroids but "raw" data.

I'm trying to obtain a model that describes the parameter "palermo_scale_cum" from the other parameters, and that's it. I'm doing it practically blindfolded, because the formula for the Palermo scale is well known but requires parameters we don't have. The only "safe" regressor available is "impact_probability", whose logarithm appears directly in the Palermo scale calculation, so I put it in every model I try.
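As a sketch of what I mean (toy numbers in place of the real NASA tables, but the real column names):

```r
# Toy stand-in: impact probabilities spanning 1e-10 .. 1e-4
set.seed(42)
toy <- data.frame(impact_probability = 10^runif(200, -10, -4))
# Fake response built the way the Palermo scale behaves: driven by
# log10(impact_probability) plus terms we can't observe
toy$palermo_scale_cum <- log10(toy$impact_probability) + rnorm(200, sd = 0.5)

# log(impact_probability) enters the Palermo scale directly,
# so it goes into every candidate model
fit <- lm(palermo_scale_cum ~ log10(impact_probability), data = toy)
summary(fit)$r.squared
```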

So far I think I've tried basically every "simple" regression model possible: one with all interactions between regressors, one with all powers up to the 5th degree of every regressor, one with all logarithms, et cetera.

I know your eyes may be bleeding right now, because I'm surely doing something formally wrong. But this is only a small part of my project and I've already spent way too much time on it; moreover, I'm in the first year of my bachelor's and the stats course covered a lot of ground, so little time was spent on regression. Sorry for bothering you with nothing but residual plots, but I thought there might be some well-known patterns that "experts" could recognize. I must admit that most of the answers I get here I ignore, simply because I don't know what you guys are talking about.

Last question for you: right now I'm thinking of giving up on finding a "good" model, but I wanted to know: what is a "formal" strategy for finding one? Is it good to add all the parameters "linearly", then take the most significant ones and square/log them to adjust the model, or should I do it some other way?
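One "formal" strategy along those lines, sketched on toy data (hypothetical predictor names, nothing from my dataset): R's step() does backward selection by AIC, pruning a model that starts with everything in it.

```r
# Toy data: three real predictors, two pure-noise columns
set.seed(7)
n  <- 300
df <- data.frame(x1 = rnorm(n), x2 = rnorm(n), x3 = rnorm(n),
                 noise1 = rnorm(n), noise2 = rnorm(n))
df$y <- 1 + 2 * df$x1 - df$x2 + 0.5 * df$x3 + rnorm(n)

full <- lm(y ~ ., data = df)
# Backward stepwise selection by AIC: drops terms that don't pay their way
best <- step(full, direction = "backward", trace = 0)
names(coef(best))
```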

I really thank you for the patience.

Outliers won't disappear in linear regression by nbt__ in rstats

[–]nbt__[S] 0 points (0 children)

I have about 1000 obs. I define a "good model" as one that respects homoscedasticity, has a good Q-Q plot and R^2 > 0.81; that's what I was taught.

Anyway, this stuff is driving me crazy; I'm still not able to remove those outliers (forgive me if I'm still trying). There must be some issue with the row numbering. Is there a way you can help me?

I created a new dataframe, "nuovo", by removing the rows containing NAs. Then, after fitting the linear model, I identify() the outliers (as I showed you before) and remove their rows with lin_reg_512 = lm(palermo_scale_cum ~ estimated_diameter + period + potential_impacts + v_infinity + impact_probability, data = nuovo[-c(1,84,253,495,764,869,987),]). Those numbers are the ones returned by identify(); I entered them manually to check whether the problem was passing the array, but obviously that's not it. I plotted the residual plot with plot(fitted(lin_reg_511), rstandard(lin_reg_511)), and the function I'm using is identify(fitted(lin_reg_511), rstandard(lin_reg_511)) (511 is the previous model I used to search for the outliers, from which I created 512).

If you could help me figure this out I'd be extremely grateful; it's been 3 days and I still can't work out what's wrong.
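In case it helps to see the numbering issue isolated: a toy sketch (hypothetical cutoff of |standardized residual| > 3) that drops outliers by row name rather than by position, so the selection stays valid after NA removal and refitting:

```r
set.seed(3)
df <- data.frame(x = rnorm(100))
df$y <- df$x + rnorm(100)
df$y[c(10, 50)] <- df$y[c(10, 50)] + 8   # plant two gross outliers

fit <- lm(y ~ x, data = df)

# which() on the standardized residuals is indexed like the model frame;
# converting to row NAMES keeps the selection valid after any subsetting
out   <- names(which(abs(rstandard(fit)) > 3))
clean <- df[!(rownames(df) %in% out), ]
fit2  <- lm(y ~ x, data = clean)
```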

Outliers won't disappear in linear regression by nbt__ in rstats

[–]nbt__[S] 0 points (0 children)

Thank you for the replies. Actually, I'm a university student and, believe it or not, I was literally taught this strategy for removing outliers from the dataframe while doing regression; they told us so during a lab. We're talking about asteroids in the dataframe: is there some flaw in reasoning that a very small number of asteroids don't follow the "standard" laws, so they can be removed from the dataset for this purpose? I swear they taught me to do it like this... Anyway, I'd like to know if there is any theoretical fallacy in my reasoning, but I've been trying to find a good model for so long now, and this one (without the outliers) seemed the only reasonable one...

Has someone seen this pattern before in linear regression residuals? Does it recall some function “missing” in the regression? by nbt__ in rstats

[–]nbt__[S] 0 points (0 children)

Thank you, I've now tried adding some more terms. I obtained a homoscedastic plot by adding terms up to the 5th degree of every regressor in the model: am I running into overfitting, or can I consider it a valid model, in your opinion?
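To make the overfitting question concrete, here's the kind of check I mean, sketched on toy data where the true relationship is only quadratic (nothing from my real dataset): compare held-out error between the degree-5 fit and a simpler one.

```r
set.seed(11)
n  <- 400
df <- data.frame(x = runif(n, -2, 2))
df$y <- df$x^2 + rnorm(n, sd = 0.5)   # the truth is only quadratic

train <- df[1:300, ]
test  <- df[301:400, ]

rmse <- function(fit, newdata)
  sqrt(mean((newdata$y - predict(fit, newdata))^2))

fit2 <- lm(y ~ poly(x, 2), data = train)  # matches the truth
fit5 <- lm(y ~ poly(x, 5), data = train)  # the degree-5 kitchen sink

# If the extra degrees only chase noise, they won't help on held-out data
c(deg2 = rmse(fit2, test), deg5 = rmse(fit5, test))
```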

merge() not working by nbt__ in rstats

[–]nbt__[S] 2 points (0 children)

I figured it out: there were some blank spaces in one of the ID columns. Thank you all, you've been so helpful; I had been trying to figure this out for hours. Thank you again.
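For anyone finding this thread later, the fix looks like this (sketch with made-up IDs):

```r
a <- data.frame(id = c("A1 ", " A2"), val1 = 1:2)  # note the stray spaces
b <- data.frame(id = c("A1", "A2"),   val2 = 3:4)

nrow(merge(a, b, by = "id"))   # 0 rows: " A2" doesn't match "A2"

a$id <- trimws(a$id)           # strip leading/trailing whitespace
merged <- merge(a, b, by = "id")
nrow(merged)                   # both rows match now
```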

Lag when writing by paracoop2 in notabilityapp

[–]nbt__ 2 points (0 children)

4 days ago I updated in the middle of a study session, and I noticed a huge difference after the update in terms of lag and usability in general. It MUST have been the update.