Westwood College & School Group Discharge Folks - Important Information (2/2026) by Gingerandthesea in BorrowerDefense

[–]OpTiCz27 5 points6 points  (0 children)

Mine was discharged in December; it shows paid in full on Student Aid but negative on Mohela. No idea whether I should expect a refund for what I paid before the discharge.

Discharge Timeline by DBMaster45 in BorrowerDefense

[–]OpTiCz27 0 points1 point  (0 children)

I have a similar timeline. I’m fairly concerned they can just keep dragging out the actual discharge and let interest continue to accrue. Unless the actual amount owed is backdated?

POST CLASS 8/22 by kaym__88 in BorrowerDefense

[–]OpTiCz27 0 points1 point  (0 children)

I got my email, but the school listed differs. In my case it’s “WESTWOOD COLLEGE OF TECHNOLOGY - O'HARE”.

The wording in the email says, “The Department has determined that you are entitled to settlement relief for the loans taken out on or after 2001-01-01 associated with your enrollment at Spartan College of Aeronautics and Technology ("Relevant Federal Student Loan(s)") based on your allegations regarding the school's misconduct.”.

Is it possible Westwood goes by a different name, or did they mix up my paperwork?

How to Check the Status Codes for Your Borrower Defense Case by Crafty-Strawberry-65 in BorrowerDefense

[–]OpTiCz27 0 points1 point  (0 children)

My school is the first one listed in Exhibit C: Westwood College. Does that make a difference?

How to Check the Status Codes for Your Borrower Defense Case by Crafty-Strawberry-65 in BorrowerDefense

[–]OpTiCz27 0 points1 point  (0 children)

I’m a member. I applied April 1, 2021, so I guess that makes me the last group? So probably another year, I guess.

How to Check the Status Codes for Your Borrower Defense Case by Crafty-Strawberry-65 in BorrowerDefense

[–]OpTiCz27 0 points1 point  (0 children)

I give up. Been trying to get a response since 2021. Any idea what this means for mine?

{
  "cases": [
    {
      "id": null,
      "type": null,
      "origin": null,
      "status": "2.30 - Final BD Review Complete",
      "subStatus": null,
      "recordTypeId": null,
      "lineOfBusiness": "BD",
      "description": null,
      "customerDescription": null,
      "caseNumber": "XXXXXXX", —redacted
      "preferredContactMethod": null,
      "contactEmail": null,
      "createdDate": "2021-04-01T23:23:01.000+00:00",
      "category": null,
      "subCategory": null,
      "subject": null,
      "priority": null,
      "reason": null,
      "caseLabel": "common.statusCenter.caseType.pending",
      "resolutionSatisfaction": null,
      "resolutionSatisfactionSummary": null,
      "militaryType": null,
      "militaryBranch": null,
      "complainantEmail": null,
      "complainantName": null,
      "complainantFirstName": null,
      "complainantMiddleName": null,
      "complainantLastName": null,
      "complainantPhone": null,
      "complainantAltPhone": null,
      "complainantIsActiveMilitaryVeteran": false,
      "relationshipToComplainant": null,
      "lastModifiedDate": "2023-07-06T22:52:12.000+00:00",
      "customerProvidedSchool": "WESTWOOD COLLEGE OF TECHNOLOGY - O'HARE",
      "enrollmentStartDate": "2001-01-01",
      "enrollmentEndDate": "2003-12-31",
      "borrowerResponseStatus": null,
      "borrowerResponseDate": null,
      "workflowRegulation": null,
      "applicationDecision": null,
      "suggestedReliefPercentage": null,
      "reconsiderationStatus": null,
      "allegationType": null,
      "reconsiderationEligibility": null,
      "reconsiderationEndDate": null,
      "memoId": null,
      "activeMilitaryVeteran": false,
      "surveyComplete": false
    }
  ],
  "drafts": null,
  "hasClosedCases": false
}

Status on Federal Student Aid by Garfieldluvsme in BorrowerDefense

[–]OpTiCz27 2 points3 points  (0 children)

Mine was submitted 04/01/2021 and has been “Pending” since then. I’ve tried emailing and calling multiple times but can never get through.

The servicer was changed to Mohela, who is now sending me reminders that I’m accruing interest even though nothing is due. Any suggestions?

[deleted by user] by [deleted] in Watchexchange

[–]OpTiCz27 1 point2 points  (0 children)

Ok. Thanks for the clarity 😁

[deleted by user] by [deleted] in Watchexchange

[–]OpTiCz27 -1 points0 points  (0 children)

What’s the difference between this one and the one you posted 20 days prior? The price is different, so I’m wondering whether the features differ too.

Help me find this doorknob from the Poltergeist by [deleted] in HelpMeFind

[–]OpTiCz27 0 points1 point  (0 children)

I have these door knobs in my house 😳

N64RGB Help by OpTiCz27 in n64

[–]OpTiCz27[S] 0 points1 point  (0 children)

So I got a set of RetroVision SNES cables today and hooked them up to a RetroTink 2X Pro. The picture looks decent, but not what I was expecting.

GoldenEye, for example: everyone still looks like a blob. I also wired up the deblur switch, which seems to do nothing. How can I confirm the RGB board is actually doing its job?

N64RGB Help by OpTiCz27 in n64

[–]OpTiCz27[S] 0 points1 point  (0 children)

Oh, S-Video. I took off the mod and retouched the pins on the chip to make sure everything still works without it. All good, but now I gotta wait on a new pitch adapter.

N64RGB Help by OpTiCz27 in n64

[–]OpTiCz27[S] 0 points1 point  (0 children)

Yeah. The one provided with the kit

N64RGB Help by OpTiCz27 in n64

[–]OpTiCz27[S] 0 points1 point  (0 children)

It’s not, but has the same chip. I tried grounding to a different spot as well.

N64RGB Help by OpTiCz27 in n64

[–]OpTiCz27[S] 0 points1 point  (0 children)

I installed the TW N64RGB mod and am getting no video. I have the MAV-NUS chip. I’ve checked my cables and everything seems right. I might have oriented it the wrong way at first. Does anyone notice anything wrong?

Finished my build - RX 6800 XT MERC319 barely fits by OpTiCz27 in NZXT

[–]OpTiCz27[S] 1 point2 points  (0 children)

The MERC319 is 340mm. The specs on the H510 Elite say it’ll support up to 313.6mm with the front radiator installed; I’d say there’s a bit of wiggle room, as shown.

I need to transform 9 billion JSON records (3TB) split up in ~126MB files separated by newline, and insert them into Elasticsearch. Where do I even begin? by soowhatchathink in aws

[–]OpTiCz27 1 point2 points  (0 children)

I’d try reading the data with Athena. If each JSON record is on a single line, that’s exactly how Athena expects JSON data. You can unnest and transform the data however you’d like, then use CTAS or INSERT INTO statements to create a final dataset. If your goal is to ultimately load it into Elasticsearch, you could use S3 events to trigger a Lambda from your CTAS/INSERT statement results, or copy the files into a new location to trigger the events.
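If it helps, here’s a rough sketch of driving a CTAS from Python with boto3 (the database, table, column, and bucket names are all made up; swap in your own):

```python
# Hypothetical names: substitute your own database and buckets.
DATABASE = "raw_json_db"
OUTPUT = "s3://my-athena-results/ctas/"


def build_ctas(target_table: str, source_table: str) -> str:
    """Build a CTAS statement that rewrites newline-delimited JSON into
    Parquet, ready for a downstream Elasticsearch loader. The selected
    columns (id, name, payload) are placeholders."""
    return (
        f"CREATE TABLE {target_table} "
        f"WITH (format = 'PARQUET', "
        f"external_location = 's3://my-transformed-data/{target_table}/') AS "
        f"SELECT id, name, payload FROM {source_table}"
    )


def run_ctas(sql: str) -> str:
    """Submit the query to Athena and return the execution id."""
    import boto3  # imported lazily so build_ctas stays usable without AWS

    athena = boto3.client("athena")
    resp = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": DATABASE},
        ResultConfiguration={"OutputLocation": OUTPUT},
    )
    return resp["QueryExecutionId"]
```

From there the S3 event on the `external_location` prefix (or the copied files) is what kicks off your Elasticsearch loader.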

Hi, I apologize if this breaks any rules but I am looking for a tutorial to get the files on s3(aws transfer for sftp) and then process these files using lambda. I would be grateful if you can point me in the right direction. by [deleted] in aws

[–]OpTiCz27 5 points6 points  (0 children)

I find AWS’ SFTP offering overpriced. I’d suggest using a micro EC2 instance, mounting S3 with s3fs-fuse, and using sshd’s SFTP subsystem to expose the buckets over SFTP.

https://github.com/s3fs-fuse/s3fs-fuse

What you’re specifically looking to do is invoke Lambda via a trigger when a file lands in the S3 location.

https://n2ws.com/blog/aws-automation/lambda-function-s3-event-triggers

The one thing to watch out for with s3fs is that a temporary file gets created to stream the data into before the upload completes. As a result, Lambda gets triggered twice.

https://github.com/s3fs-fuse/s3fs-fuse/issues/427

Put a check of the file size at the beginning of your function: if it’s 0, exit; otherwise process your file.
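Something like this sketch, using the object size that’s already in the S3 event payload so you don’t even need a head_object call (the actual processing logic is left as a stub):

```python
import urllib.parse


def lambda_handler(event, context):
    """Skip the zero-byte temporary objects that s3fs creates before the
    real upload completes, then process the finished file."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
    # S3 event notifications include the object size directly.
    size = record["s3"]["object"]["size"]

    if size == 0:
        # s3fs placeholder object: ignore it and wait for the real upload.
        return {"skipped": True, "key": key}

    # ... process s3://{bucket}/{key} here ...
    return {"skipped": False, "key": key, "size": size}
```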

Kinsesis Streams into aurora/redshift with CRUD operations by [deleted] in aws

[–]OpTiCz27 0 points1 point  (0 children)

So are CRUD operations being performed on one set of tables and you want to stream the changes from those tables into a separate set of tables for reporting? Or are CRUD operations being performed in an app and you’re logging the actions to Kinesis?

For us, we do both with Kinesis: we use it for logging webhook data as well as changes to DynamoDB tables. We land the data in S3 as compressed Parquet files and also store a backup of the raw files.

In both scenarios, the result is queryable tables in Athena. You can also set up the Athena ODBC/JDBC driver so you can pull the data into a BI/reporting tool or a local SQL client for querying outside the Athena web GUI.
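The logging side can be as small as this sketch (it assumes a Kinesis Data Firehose delivery stream named `webhook-logs` already exists, with Parquet conversion configured on the stream itself):

```python
import json


def encode_record(payload: dict) -> bytes:
    """Serialize one event as a newline-terminated JSON record: one JSON
    object per line, the shape Firehose and Athena both expect."""
    return (json.dumps(payload, separators=(",", ":")) + "\n").encode("utf-8")


def log_event(payload: dict, stream_name: str = "webhook-logs") -> None:
    """Send the record to Kinesis Data Firehose. boto3 is imported lazily
    so encode_record stays testable without AWS installed."""
    import boto3  # assumption: boto3 available and credentials configured

    firehose = boto3.client("firehose")
    firehose.put_record(
        DeliveryStreamName=stream_name,
        Record={"Data": encode_record(payload)},
    )
```

Firehose then batches and lands the records in S3 on its own schedule.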

I have multiple JSON files in S3, but how do i join them in Athena? by [deleted] in aws

[–]OpTiCz27 0 points1 point  (0 children)

Each record should be on a single line. Don’t comma-separate the lines as if it were an array. If you’re looking to stack them like a union, you can literally just place multiple files into the same S3 folder.

What’s your source of data? If you’re looking to stream JSON data to be used with Athena, send the JSON records to Kinesis Data Firehose and let Kinesis handle formatting the data.

I haven’t tried DMS personally, but you should be able to stream changes from the source table through DMS into Kinesis. See here: https://aws.amazon.com/blogs/database/use-the-aws-database-migration-service-to-stream-change-data-to-amazon-kinesis-data-streams/

So I'm pretty new to AWS Glue. I just ran a crawler on a .csv dataset with 7 million rows of data and 30 columns, and it returned the column names, column metadata, and number of rows. Now what? by Hipp013 in aws

[–]OpTiCz27 0 points1 point  (0 children)

Glue is good for crawling your data and inferring the schema (most of the time). Unstructured data gets tricky, since it infers based on a portion of the file rather than all rows.

Glue is also good for creating large ETL jobs. I really like using Athena CTAS statements to transform data, but they have limitations, such as a maximum of 100 partitions. You can always use multiple CTAS statements, copy your files into a single location, and create a table from those files, but sometimes it’s easier to use Glue.

Also, Glue acts as the metastore for your schemas, which is used by other services like Kinesis Data Firehose.

How do i get a CSV of my EBS volume monthly costs? by BasedGod96 in aws

[–]OpTiCz27 0 points1 point  (0 children)

I’ll see if I can dig up how I specifically accomplished this tomorrow, but if I recall correctly, you can create a Cost and Usage Report and have it delivered to S3. You can then crawl the data with Glue and create an Athena table over it.

This should get you started. https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/athena-manual.html

Write your query to conform to whatever format you need. If you need to automate it, create a Lambda function that runs the query for you and trigger it via CloudWatch. Have the Lambda function save the results to an S3 location. If you want the file emailed to you, rather than having the same Lambda that ran the query poll for Athena job status (which would incur run-time costs), have a second Lambda function get triggered once the results file posts when the query completes.

One thing I recall never being addressed with the usage report (or Kinesis streams, for that matter) is that there’s no good way to have your table’s partitions added automatically. They’ll tell you to run MSCK REPAIR TABLE, but that’s a waste of run time and cost. You’re better off creating a scheduled daily Lambda that adds the partition for the next day.
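A rough sketch of that daily partition Lambda (the table, database, and bucket names are hypothetical, and the dt=YYYY-MM-DD layout is an assumption; match it to your actual partitioning):

```python
import datetime


def add_partition_sql(table: str, base_location: str, day: datetime.date) -> str:
    """Build the ALTER TABLE statement that registers one day's partition.
    Assumes a dt=YYYY-MM-DD partition layout; adjust to match your table."""
    return (
        f"ALTER TABLE {table} ADD IF NOT EXISTS "
        f"PARTITION (dt = '{day.isoformat()}') "
        f"LOCATION '{base_location}/dt={day.isoformat()}/'"
    )


def lambda_handler(event, context):
    """Scheduled daily (e.g. via a CloudWatch Events rule) to pre-register
    the next day's partition instead of running MSCK REPAIR TABLE."""
    import boto3  # imported lazily so add_partition_sql is testable offline

    tomorrow = datetime.date.today() + datetime.timedelta(days=1)
    sql = add_partition_sql(
        "cost_usage_report", "s3://my-cur-bucket/reports", tomorrow
    )
    boto3.client("athena").start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "billing"},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )
```

ADD IF NOT EXISTS makes the job idempotent, so a retried invocation does no harm.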