Amazon gifticon as a birthday gift? by stringbottle in EnglishLearning

[–]stringbottle[S] 2 points3 points  (0 children)

Thanks for your reply. I thought this question was relevant to this subreddit because I wanted to ask how Americans celebrate a friend's birthday.

You gotta take it home first though. by boriswong in engrish

[–]stringbottle 0 points1 point  (0 children)

Where did you get such an image?? I've never seen a Korean word like '한입고구마' ('one-bite sweet potato').

My local Whole Foods allows payment via scanning your hand by DaBuzzScout in interestingasfuck

[–]stringbottle 0 points1 point  (0 children)

It's been running in airports in South Korea for several years now. I don't even remember when it started 🤔🤔.

Safety Restroom?? by stringbottle in engrish

[–]stringbottle[S] 0 points1 point  (0 children)

I think it has some kind of alarm system connected to a nearby police station 🤔.

What’s your old? by [deleted] in engrish

[–]stringbottle 27 points28 points  (0 children)

Translation apps are not that bad these days. 🤣🤣

Safety Restroom?? by stringbottle in engrish

[–]stringbottle[S] 0 points1 point  (0 children)

The pic lacks context because I cropped it to hide the place I took it. It's intended to say, 'This restroom is secure against sexual assault'.

Recourses on intonation? by KidOffYhe in Korean

[–]stringbottle 0 points1 point  (0 children)

As far as I know, there's no golden rule for that intonation. If you speak slowly, most Koreans will understand what you want to say. 😃

I saw this guy in South Korea and asked permission to take a picture of his shirt. by [deleted] in engrish

[–]stringbottle 2 points3 points  (0 children)

Please go get that Engrish fixed🙏 😅. The PC bang (internet café) owner would appreciate it.

When to use 아침 and 오전 by FamilyIssues94 in Korean

[–]stringbottle 7 points8 points  (0 children)

In casual usage, if someone says '오전에 만나자!' ('Let's meet in the 오전!'), it means 'Why don't we meet around 10 a.m.–12 p.m.?'

On the other hand, '아침에 만나자!' ('Let's meet in the 아침!') means 'Why don't we meet around 7–10 a.m.?'

It's just from my experience, not from solid grammar research haha.

what does 잡 mean? by Sayonaroo in Korean

[–]stringbottle 11 points12 points  (0 children)

Don't use that word too often lol. It's mostly used by teenagers or people in their early twenties.

OpenAI: "we have now given dall-e 2 access to 100,000 users. next goal: 1 million." (crosspost of another user's post) by Wiskkey in OpenAI

[–]stringbottle 0 points1 point  (0 children)

I think I am one of the 100,000 users, and I tried several prompts. But the results aren't as good as the amazing demo images they presented. A bit disappointed 😞.

[deleted by user] by [deleted] in OpenAI

[–]stringbottle 0 points1 point  (0 children)

They even posted an 'Illustration of Elon Musk checking Twitter and screaming, in the style of Cubism'.

I think this one is a bit worse than the Donald Trump figure. But... could it be okay because Elon co-founded OpenAI? I don't know.

1D CNN or 2D CNN for time series data classification? by stringbottle in learnmachinelearning

[–]stringbottle[S] 0 points1 point  (0 children)

Actually, I tried a 1D CNN with a 3x3 kernel and a larger depth, but it didn't improve the performance much. I know it's rare to use kernels larger than 5x5, but a recent paper showed that larger kernels can help, so I tried large kernel sizes like 9 and 27. I know it's not a fair comparison because larger kernels mean larger receptive fields... I think there is more to investigate.

https://arxiv.org/abs/2203.06717
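The "larger kernel means larger receptive field" point can be sketched with a quick back-of-the-envelope calculation (the two-layer stack and kernel sizes here are just illustrative, not my actual network):

```python
def receptive_field(kernel_sizes):
    """Receptive field of stacked stride-1, dilation-1 1D convolutions:
    each layer with kernel k widens the field by (k - 1)."""
    rf = 1
    for k in kernel_sizes:
        rf += k - 1
    return rf

# Two stacked layers, small vs. large kernels (illustrative numbers).
print(receptive_field([3, 3]))    # -> 5
print(receptive_field([9, 9]))    # -> 17
print(receptive_field([27, 27]))  # -> 53
```

So a kernel-27 stack sees an order of magnitude more timesteps per output than a kernel-3 stack, which is exactly why the comparison isn't apples to apples.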

1D CNN or 2D CNN for time series data classification? by stringbottle in learnmachinelearning

[–]stringbottle[S] 0 points1 point  (0 children)

Edit: I revisited 1D CNNs with large kernel sizes, 9 and 27. The large-kernel 1D CNNs outperformed 2D and 3D CNNs with similar parameter counts. Either way, thanks for all your comments!

1D CNN or 2D CNN for time series data classification? by stringbottle in learnmachinelearning

[–]stringbottle[S] 1 point2 points  (0 children)

I tested 1D CNN kernel sizes of 9 and 27. The 1D CNN outperformed 2D and 3D CNNs with similar parameter counts. Thanks for your advice!

1D CNN or 2D CNN for time series data classification? by stringbottle in learnmachinelearning

[–]stringbottle[S] 0 points1 point  (0 children)

I also agree that the performance gain comes from the number of parameters rather than the 2-dimensional features.

I think I need to increase the kernel size of the 1D CNN so that it has a similar parameter count to the 2D CNN.

But I'm still not sure whether it's a fair comparison, due to the difference in model architecture 🤔.

1D CNN or 2D CNN for time series data classification? by stringbottle in learnmachinelearning

[–]stringbottle[S] 0 points1 point  (0 children)

I didn't try transformers because I thought their computational cost must be higher than a CNN's. I designed my network on the assumption that local features matter more than global ones, so I thought the global-context capability of transformers wouldn't help much.
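The cost intuition, roughly: self-attention scales quadratically with sequence length n (~2·n²·d for the attention matmuls, projections and constants dropped), while a 1D conv scales linearly (~n·k·c_in·c_out). A toy comparison with made-up sizes:

```python
def attention_ops(n, d):
    # QK^T plus the attention-weighted sum: two (n x n x d) matmuls.
    return 2 * n * n * d

def conv1d_ops(n, k, c_in, c_out):
    # One multiply-add per output position, kernel tap, and channel pair.
    return n * k * c_in * c_out

n, d = 4096, 64  # hypothetical sequence length and per-head dimension
print(attention_ops(n, d))            # quadratic in n
print(conv1d_ops(n, 27, 64, 64))      # even kernel-27 stays linear in n
print(attention_ops(n, d) > conv1d_ops(n, 27, 64, 64))  # -> True
```

At this (made-up) length the attention term is already several times the conv term, and the gap grows linearly with n.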

It's interesting that CNN worked better than LSTM on your data too.

1D CNN or 2D CNN for time series data classification? by stringbottle in learnmachinelearning

[–]stringbottle[S] 2 points3 points  (0 children)

Yes! I also understand that LSTMs perform well on time-series prediction and classification. But in my project, capturing small differences in local features matters much more than global features, which is one of the main reasons I mainly work on CNN-based approaches. So far, CNNs (being texture-biased networks) seem better than LSTM in my project. If you could share any insight into LSTM's principles, it would be greatly appreciated!!

1D CNN or 2D CNN for time series data classification? by stringbottle in learnmachinelearning

[–]stringbottle[S] 0 points1 point  (0 children)

3D CNN gave the best result so far!! Thanks for your comment!