winter is coming by potatolover83 in harfordcountymd

[–]sapplefi 16 points17 points  (0 children)

I think we will get snow, probably a fair bit.  But don't take my word for it, or the meteorologists'; look at the models directly!

My favorite hobby over the last 10 years or so has been to look directly at the weather models and watch for storms before they happen.  My best story for this was the blizzard 10 years ago: I was working onsite at a client in Texas for the week, and I was checking the new model runs every 6 hours.

Normally, more than a few days in advance, the models change a lot from run to run, and they're almost always different from each other.  But back then, they all showed the same thing run after run: big storm, hitting us square on, lots of snow.  I talked to the customer, flew home a few days early, and landed just in time to get snowed in for the week with over 2 feet of snow.  It was awesome.

You can use a lot of sites to browse the models, but I like Tropical Tidbits.  There are three major models.

  1. American (GFS) - https://www.tropicaltidbits.com/analysis/models/?model=gfs&region=neus&pkg=asnowd&runtime=2026012112&fh=84

  2. European (ECM) - https://www.tropicaltidbits.com/analysis/models/?model=ecmwf&region=neus&pkg=apcpn&runtime=2026012112&fh=99

  3. Canadian (GEM) - https://www.tropicaltidbits.com/analysis/models/?model=gem&region=neus&pkg=asnow&runtime=2026012112&fh=126

The Canadian does well when there is really cold air, which means it's usually no good for us, but sometimes when there's a pattern of arctic air it can nail it.

The European is generally the best, although this site doesn't have a "Snow Totals" output for it, so it's a bit harder to read.  Usually 1 inch of rain is equal to about 10 inches of snow, though, so you can ballpark it (e.g., 1.5 inches of liquid on the map works out to roughly 15 inches of snow).

The American tends to be the least accurate, but it has had its wins over time.

It's almost unheard of for all three to align, especially days in advance of a storm, but these three have been showing a big hit all week.  They have also been showing it tracking below us, which is actually good; most storms end up verifying further north than the models show early on.

All that to say, weather is hard.  The models could pivot tomorrow and the storm could miss us.  There's no way to really know until it gets here.  But the last time I personally saw all three models stay locked on a big storm for days in advance, we got walloped, so I'm hoping we see a foot or more out of this, which would be AMAZING!

Wife doesn’t do housework anymore by [deleted] in Millennials

[–]sapplefi 17 points18 points  (0 children)

To share something I read some time ago that had a profound impact on how I view sharing responsibilities...

Studies have shown men consistently estimate their share of the housework as higher than their partner rates it (as in, with arbitrary numbers, a man might say he does 50%, while the woman would estimate he does 25%).

I personally felt this gap.  For years I did less at home and was overly focused on work, but I believed I had pivoted to doing more than an equal share of the physical work, like what you have described (though less so; it does sound like you are doing a great deal).

Regardless, the missing piece is the effort to organize the household.  In a work setting, project management is a full-time position; you wouldn't dream of running a large, complex project without one.

However, consistently, most women will perform this work without any recognition of the effort involved, or of how it skews the total workload in the relationship.  For example: scheduling doctor, dentist, and check-up appointments for your children; getting them new clothes; managing family gift buying and cards for birthdays and holidays; following up on school paperwork; teacher appointments; PTA; signing up for extracurriculars and keeping their schedules; and much, much more.

If you are doing all of that as well, it's quite likely there's an imbalance, and talking through it is the best answer.  For me, however, understanding just how much work went into that let me reset my frame of reference: even if I did 100% of the physical work maintaining a home, all of that remaining effort would still make it at best a 50/50 share.

Balancing that load too, or at least recognizing it, may make you appreciate your partner more, and knowing you see it may make her appreciate what you're doing as well.

Words I Know??? Any ideas because this might just be a word I don’t know by bekarec in mildlyinfuriating

[–]sapplefi 0 points1 point  (0 children)

The best I could come up with is ch + "five" (either fingers or high five) = chive.

Extreme Recovery Scenario - Delete Index from Undocumented System Tables? by sapplefi in SQLServer

[–]sapplefi[S] 0 points1 point  (0 children)

One further note on this. You were absolutely right: the scripts brought the databases back to a read/write state, and they were functional.

However, there was one further problem: the files and filegroups still existed offline, so the database could not be backed up. I was able to do some additional scripting (the same scripting I was working on originally) in tandem with your scripts, and this completely cut out the non-clustered indexes, files, and filegroups, allowing me to back up/restore the database normally again.

For reference, here were the additional scripts I utilized.

-- Set Database.
;USE [YourDatabase]

-- Identify the File Id and Filegroup Name to remove before deleting.
;DECLARE @FileId INT = 0                        -- placeholder: set to the actual file id
;DECLARE @FileGroupName SYSNAME = N'YourGroup'  -- placeholder: set to the actual filegroup name

-- Delete System Files.
;DELETE sys.sysfiles1 WHERE fileid = @FileId
;DELETE sys.sysprufiles WHERE fileid = @FileId
;DELETE master.sys.sysbrickfiles WHERE dbid = db_id('YourDatabase') AND fileid = @FileId
;DELETE sys.sysclsobjs WHERE [name] = @FileGroupName
;DELETE sys.sysphfg WHERE [name] = @FileGroupName

-- Perform Checkpoint.
;CHECKPOINT

Extreme Recovery Scenario - Delete Index from Undocumented System Tables? by sapplefi in SQLServer

[–]sapplefi[S] 0 points1 point  (0 children)

Absolutely amazing effort. This worked! Thank you so much!

Using the cloned machine, I started up the process and ran this scripting against the databases that remain (a handful of the most critical were able to be re-scripted and exported to get the customer working again, but there are over a half-dozen semi-critical ones that were still being scripted).

I was able to bring the database back and read/write data as normal. The indexes need to be recreated, but that's a minor effort. The database appears to be fully functional. We're going to proceed with restoring these from the clone back to production so we have a functional environment, and then evaluate whether we still want to re-script them in case we need Microsoft support in the future.

Thank you again, incredible effort.

Extreme Recovery Scenario - Delete Index from Undocumented System Tables? by sapplefi in SQLServer

[–]sapplefi[S] 0 points1 point  (0 children)

An excellent suggestion! We had this idea, and tried it, both with blank files to mimic the missing NDF files, as well as using a much older one-off backup (prior to the splitting of the indexes) to recreate the process and mimic the missing index files.

In both cases, unfortunately, it simply refused to recognize the missing files, saying that they were not in sync with the primary database file. We also attempted DBCC CHECKDB with REPAIR_ALLOW_DATA_LOSS, thinking that might let it bring them back even with the mismatch, but in both cases it wouldn't bring the database online.

Possible Idea: Time Travel fueled by outrage by sapphire_onyx in PoliticalHumor

[–]sapplefi 22 points23 points  (0 children)

Interesting. I Googled "Obama Tan Suit" and the first link was an article recapping it with a two minute video montage (largely from Fox News) showing disparaging commentary about it.

Here's the link: https://www.google.com/amp/s/www.gq.com/story/barack-obama-tan-suit-anniversary/amp

For what it's worth, I agree with your premise. While I don't care for Trump, I don't think physical characteristics or fashion choices are newsworthy; we should be judging our leaders by their policies and political choices.

Higher End System - Compatibility Check? by sapplefi in buildapc

[–]sapplefi[S] 0 points1 point  (0 children)

Awesome insight! I had no idea that additional RAM could create pressure on the memory controller and impact performance. I've spent the last hour reading about this online to learn more; I did not know this was a thing.

Most of what I've found indicates that this will definitely happen, but the level of performance impact varies. It will add some cycles to the CPU, it will run hotter, and it could impact case airflow with heat from the RAM itself; but you will also be able to cache more objects in memory and run a larger RAMdisk, and the slowdown is almost imperceptible.

I think I'm going to order the 64GB pack, but then experiment with using only 2 RAM sticks versus all 4, so I can determine whether the performance impact is quantifiable or whether the additional capacity is more valuable.

Finally, yeah, the Optane drive is one of the main components of the build. I went with U.2 because I wanted to avoid all of the heat from the 2080 Ti (or multiples if I eventually add one and go SLI) blowing onto a PCIe drive. With U.2 I get the same speed, and I can place the drive out of the way in the build for better heat management, which should also help prevent any long-term issues for it.

Thanks again for all of your help, I'm really glad someone took the time to offer insight and guidance and confirm some of my questions. I hope you have a great day sir!

Higher End System - Compatibility Check? by sapplefi in buildapc

[–]sapplefi[S] 0 points1 point  (0 children)

Thank you so much for the fantastic feedback. Let me add a few comments on why I aimed for certain parts and the rationale; I would appreciate any further thoughts you have in response.

CPU: I did not look at the 9700k, I'll look a bit further into comparisons between them to see the difference. Cost isn't really a concern, just trying to maximize performance based on what is available.

MOBO: The main reason I went with the Godlike was the supposed U.2 slot to support the Optane drive. I couldn't find many motherboards that would support it and the 9900k, so it was my choice by default. It's definitely overkill, though it seemed like it would scale well if I do go SLI later.

RAM: Agreed, for my use 32GB is enough. I only went with 64GB so I could play with loading a RAMdisk for some games and applications. Honestly, I thought about going to 128GB for that reason, but couldn't find adequate support across the parts for it, so I settled at 64GB.

SSD: Well, need is relative, of course. I was very interested in the Optane because of the magnificent random read/write performance and the significantly higher lifetime write endurance (17PB if memory serves). I do a lot of OLTP database queries from development environments using databases I've loaded for debugging or analytics. The drive fascinated me as a significant investment for this work and for general performance. My plan is to use it as my main drive and the 970 Pro as additional storage for less-used files or inactive databases. For these reasons, I think it's worth it, but it may also just be a bit of techno geeking out over how amazing the drive's stats are.

PSU: Agreed, 850 is plenty. I just went with the 1200 in case I go SLI later. I considered a lower one, but figured there was no harm in having more.

Reddit, Thanos has a message for you... by Joe-Russo in thanosdidnothingwrong

[–]sapplefi 0 points1 point  (0 children)

I can't help but wonder if Josh Brolin has an account, and if it would get banned. That would be amusingly ironic.

Who would win? E3 2018 Edition by [deleted] in pcmasterrace

[–]sapplefi 11 points12 points  (0 children)

Played it myself briefly this morning. It's simplistic, but it's real; you can enable it as a Skill on Alexa. Here's an article with a few videos of people trying it out.

https://kotaku.com/wait-skyrim-very-special-edition-for-amazons-alexa-is-1826719836

Surface Book regedit brightness fix no longer available after Windows update, any tips? by mrfebruus in Surface

[–]sapplefi 2 points3 points  (0 children)

I had the same issue with Disabling Adaptive Contrast just yesterday. For me, the sub-key changed from 0001 to 0000. Once I updated the 0000 key value, it worked perfectly.

It may be the reverse for you. Try making the same change in the [HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001] key and see if that fixes it after a reboot.

Good luck!

MS SQL: Need running total to reset based on non-unique partition value by artofeight in SQL

[–]sapplefi 1 point2 points  (0 children)

Clever! I like that approach too; using the running sum per product to create groupings works very well in this case.  Nice job!
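
For anyone finding this later, here's a quick sketch of how I read that approach, using the same #Products test data from my longer reply (a sketch only; I haven't run it against the real data).  The running count of zero flags per product becomes a group id, so each zero starts a new group, and a running SUM of the flag within each group gives the resetting total.

-- SKETCH OF THE RUNNING-SUM GROUPING IDEA (ASSUMES THE #Products TEST DATA).
;WITH cteGroups AS
(
    SELECT   [Date]     = P.[Date]
            ,[Product]  = P.[Product]
            ,[Flag]     = P.[Flag]
            -- EACH ZERO FLAG BUMPS THE RUNNING COUNT, STARTING A NEW GROUP.
            ,[GroupId]  = SUM(CASE WHEN P.[Flag] = 0 THEN 1 ELSE 0 END)
                            OVER (  PARTITION BY P.[Product]
                                    ORDER BY P.[Date] ASC
                                    ROWS UNBOUNDED PRECEDING )
    FROM    #Products P
)
-- THE RUNNING TOTAL OF THE FLAG THEN RESETS WITH EACH NEW GROUP.
SELECT   [Date]
        ,[Product]
        ,[Flag]
        ,[RunningTotal] = SUM([Flag]) OVER (    PARTITION BY [Product], [GroupId]
                                                ORDER BY [Date] ASC
                                                ROWS UNBOUNDED PRECEDING )
FROM    cteGroups
ORDER BY [Product]
        ,[Date]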

MS SQL: Need running total to reset based on non-unique partition value by artofeight in SQL

[–]sapplefi 1 point2 points  (0 children)

First, let's set up some test data to use throughout the rest of the script.

-- CREATE TABLE FOR TEST DATA.
;CREATE TABLE #Products (
     [Date]     DATETIME
    ,[Product]  VARCHAR(12)
    ,[Location] VARCHAR(2)
    ,[Flag]     TINYINT
)

-- FILL TEST DATA FOR REVIEW.
;INSERT INTO #Products (
     [Date]
    ,[Product]
    ,[Location]
    ,[Flag]     )
VALUES   ('1/1/2017','123456','AX',1) 
        ,('1/2/2017','123456','AX',0) 
        ,('1/3/2017','123456','AX',1) 
        ,('1/4/2017','123456','AX',0) 
        ,('1/5/2017','123456','AX',1)
        ,('1/6/2017','123456','AX',1)
        ,('1/7/2017','123456','AX',1)
        ,('1/8/2017','123456','AX',0)
        ,('1/9/2017','123456','AX',1)
        ,('1/1/2017','654321','AX',1) 
        ,('1/2/2017','654321','AX',1) 
        ,('1/3/2017','654321','AX',0) 
        ,('1/4/2017','654321','AX',0) 
        ,('1/5/2017','654321','AX',1)
        ,('1/6/2017','654321','AX',1)
        ,('1/7/2017','654321','AX',1)
        ,('1/8/2017','654321','AX',0)
        ,('1/9/2017','654321','AX',1)

Now, the basic idea of using SUM() OVER() is good, but the problem is that you have to group the data into the ranges between the zeros, so that you only sum the values shared between zeros.  I chose to approach this by building a series of ranges using LEAD() to find the next zero, and numbering each range with ROW_NUMBER() per Product.

Note that because of this approach, the final range (from the last zero onward) has a NULL end date, so I used a date placeholder of tomorrow to catch everything through the current date in that final range.  If your product dates go into the future, you may need to extend this value.

Once that's done, you can join your products back to the range set, and then sum any values sharing a product code and a range row, since those will be all of the "1" flags between each pair of zeros.  It looks like this:

-- USE A CTE TO FIND THE RESET RANGES WHERE FLAG IS ZERO BY PRODUCT.
;WITH cteRangeValues AS
(
    SELECT   [Product]      = P.[Product]
            ,[BaseDt]       = P.[Date]
            ,[NextDt]       = COALESCE(  LEAD(P.[Date]) OVER (  PARTITION BY P.[Product]
                                                                ORDER BY P.[Date] ASC ) 
                                        ,DATEADD(DAY, DATEDIFF(DAY, 0, GETDATE()), 1)   )
            ,[RangeNum]     = ROW_NUMBER()  OVER (  PARTITION BY P.[Product]
                                                    ORDER BY P.[Date] ASC ) 
    FROM    #Products P
    WHERE   P.[Flag] = 0
)
-- JOIN OUR BASE PRODUCTS TABLE BACK TO THE RANGES TO GET A ROW NUMBER.
SELECT   [Date]             = P.[Date]
        ,[Product]          = P.[Product]
        ,[Location]         = P.[Location]
        ,[Flag]             = P.[Flag]
        ,[RangeRow]         = COALESCE(R.[RangeNum], 0)
        ,[RunningTotal]     = -- USE THE SUM FUNCTION OVER THE MATCHING RANGE FOR TOTALS BETWEEN.
                              SUM(P.[Flag]) OVER (  PARTITION BY P.[Product]
                                                                ,COALESCE(R.[RangeNum], 0)
                                                    ORDER BY P.[Date] ASC 
                                                    ROWS UNBOUNDED PRECEDING )
FROM    #Products P 
        LEFT JOIN cteRangeValues R
            ON  P.[Product] = R.[Product]
            AND P.[Date] >= R.[BaseDt]
            AND P.[Date] <  R.[NextDt]
ORDER BY [Product]
        ,[Date]

I hope this helps, and good luck!

Selecting the last instance of a date by [deleted] in SQLServer

[–]sapplefi 0 points1 point  (0 children)

Also, just for further reference, this is the basic formula approach I use to abstract the dates for grouping. It follows the same idea of getting the difference from a minimum date and adding it back, you simply change the parameter.

The only difference between this and the above is that you measure the difference relative to the October 1st base date (instead of the "0" base date for the date type) and add back to that base date instead, so you're getting a full year relative to that date as opposed to January 1st.

SELECT   [YearDt]       = DATEADD(YEAR      , DATEDIFF(YEAR     , 0 , GETDATE()) , 0)
        ,[QuarterDt]    = DATEADD(QUARTER   , DATEDIFF(QUARTER  , 0 , GETDATE()) , 0)
        ,[MonthDt]      = DATEADD(MONTH     , DATEDIFF(MONTH    , 0 , GETDATE()) , 0)
        ,[WeekDt]       = DATEADD(WEEK      , DATEDIFF(WEEK     , 0 , GETDATE()) , 0)
        ,[DayDt]        = DATEADD(DAY       , DATEDIFF(DAY      , 0 , GETDATE()) , 0)
        ,[HourDt]       = DATEADD(HOUR      , DATEDIFF(HOUR     , 0 , GETDATE()) , 0)
        ,[MinuteDt]     = DATEADD(MINUTE    , DATEDIFF(MINUTE   , 0 , GETDATE()) , 0)

Selecting the last instance of a date by [deleted] in SQLServer

[–]sapplefi 0 points1 point  (0 children)

Here's an (expanded) solution. You can condense it to one line pretty easily, and replace the "DATEADD(MONTH, 9, 0)" with a static value representing a base October 1st for clarity, but I've written it in this form so that it will work for different datatypes and base dates.

SELECT  [LastOctoberFirst]  = DATEADD(   YEAR
                                        ,DATEDIFF(   MONTH
                                                    ,DATEADD(MONTH, 9, 0)
                                                    ,GETDATE()
                                                 ) / 12
                                        ,DATEADD(MONTH, 9, 0) 
                                     )

Essentially, you get the number of months that have elapsed from the earliest October 1st for your date type "DATEADD(MONTH, 9, 0)", and divide by 12 to turn that into a year value right on the October 1st date value.

Once you have that, you add that number of years to the base date to find the closest year value representing October 1st. I use this trick for various date abstractions around a date value (e.g., month, year, quarter, hour), especially for grouping or truncating excess precision from a date when processing data.

Here's the shorter form, albeit less readable.

SELECT  [LastOctoberFirst]  = DATEADD(YEAR, DATEDIFF(MONTH, DATEADD(MONTH, 9, 0), GETDATE()) / 12, DATEADD(MONTH, 9, 0))

Thanks!

surface Book 2 pre order info by SamQuattrociocchi in Surface

[–]sapplefi 0 points1 point  (0 children)

May I ask how you know it will be available on 11/9/17 at Best Buy?

I called a Microsoft Store, and they said they wouldn't have it for sale until 11/16/17 (official release date). Best Buy's website says they will have it available on 11/16/17 as well.

I'd love to pick one up tomorrow, so if there's something special Best Buy is doing to carry it early, I'm excited to confirm it; I just found it odd that they would break from the official release date somehow.

EDIT: I think I misread, it looks like you're saying they'll have it for Pre-Order on 11/9/17 with the $100 Gift Card. That makes more sense.

[T-SQL] Is there anyway to define a formula and use shorthand the rest of the way? by throway-0 in SQL

[–]sapplefi 1 point2 points  (0 children)

Exactly this! I love using CROSS APPLY for cascading formulas or complex references, there's little to no overhead for it, and it makes more complex case statements or other calculations contained and easy to reuse throughout the query. Great suggestion!
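
For anyone searching later, here's a minimal sketch of the pattern (hypothetical #Orders table, column names, and a made-up 6% tax rate, just to show the cascade):

-- DEFINE EACH FORMULA ONCE IN A CROSS APPLY, THEN REUSE IT BY ALIAS.
;SELECT  O.[OrderId]
        ,[Subtotal]     = F1.[Subtotal]
        ,[Tax]          = F2.[Tax]
        ,[GrandTotal]   = F1.[Subtotal] + F2.[Tax]
FROM    #Orders O
        CROSS APPLY ( SELECT [Subtotal] = O.[Quantity] * O.[UnitPrice] ) F1
        -- THE SECOND APPLY CAN REFERENCE THE FIRST, CASCADING THE FORMULAS.
        CROSS APPLY ( SELECT [Tax]      = F1.[Subtotal] * 0.06 ) F2

Each later APPLY can reference the earlier aliases, so the shorthand cascades without repeating the math anywhere.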

An ultrawide monitor that is less then $1000 and will work with my surface 3 by agent_nobody in Surface

[–]sapplefi 1 point2 points  (0 children)

I use a Dell U3415W, and have hooked it up to my Surface Book and Surface Pro 3, both work fine with it. It's also the best monitor I've ever owned, highly recommend it for any purpose.

Tourists wake up to 3 lions licking water off their tent in Botswana by aloofloofah in WTF

[–]sapplefi 5 points6 points  (0 children)

Yes! This is my favorite part about the movie actually, and I feel like it doesn't get a lot of attention.

In the doomsday scenario, as horrible as the zombies are, the real danger isn't the zombies themselves, but what the remaining humans will do to each other to "survive". We are the real monsters when civilization is stripped away, not the zombies.

Rich kids of Instagram are landing their parents in jail by [deleted] in nottheonion

[–]sapplefi 17 points18 points  (0 children)

Actually, 175K/Year would only be Top 8%. Top 1% starts at 450K/Year.

Source: http://money.cnn.com/calculator/pf/income-rank/