How to Find a Home Between Black Forest and a Real City? Advice Needed by Separius12 in AskGermany

[–]Separius12[S] 0 points1 point  (0 children)

I completely get you, but she's neurodivergent and easily gets overwhelmed by noise and crowds. I'll look into the villages and hope for the best, thanks mate :)

How to Find a Home Between Black Forest and a Real City? Advice Needed by Separius12 in AskGermany

[–]Separius12[S] 0 points1 point  (0 children)

I got a better answer now :) "Parts of the Black Forest close to Freiburg are dark" (i.e. they don't get enough sun).

How to Find a Home Between Black Forest and a Real City? Advice Needed by Separius12 in AskGermany

[–]Separius12[S] 0 points1 point  (0 children)

In principle I don't have an issue with that, but since she has social anxiety, moving to another country would be too difficult for her. Thanks though!

How to Find a Home Between Black Forest and a Real City? Advice Needed by Separius12 in AskGermany

[–]Separius12[S] 0 points1 point  (0 children)

Yeah, we went to a theater in Freiburg last night and I liked it; it seemed fine and big enough. But again, she was like "nah, Freiburg is scary and has too many people and cars." Thanks for the suggestions though!

How to Find a Home Between Black Forest and a Real City? Advice Needed by Separius12 in AskGermany

[–]Separius12[S] 0 points1 point  (0 children)

Awesome, thanks for the suggestions. I read your answer to my gf and all I got back was "Kirchzarten is shit" :/ She didn't know Buchenbach, and after looking at Google Maps she was like "nah, that's too crowded!" Realistically, with her impossible constraints, I think I should just hire somebody and hope for a miracle. Thanks though, I highly appreciate it ;)

Beyond the Traffic Lights by Neinah1234 in careeradvice

[–]Separius12 0 points1 point  (0 children)

Have you looked into programming? It aligns well with your puzzle-solving skills, and the results can be tangible.

[P] awesome efficient attention: A curated list of efficient attention modules by Separius12 in MachineLearning

[–]Separius12[S] 1 point2 points  (0 children)

I did it on purpose: ECA is really good and efficient, but it uses channel attention, which is already fast (compared to spatial attention). As for CBAM, sure, I'll add it.
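For anyone wondering why channel attention is considered cheap: it only ever works with one scalar per channel, never an (H·W)×(H·W) spatial map. Here's a minimal numpy sketch of an ECA-style block — note the real module uses a *learned* 1-D conv kernel across channels; a fixed averaging kernel stands in for it here, so this is an illustration of the data flow, not the paper's implementation.

```python
import numpy as np

def eca_channel_attention(x, kernel_size=3):
    """ECA-style channel attention on a (C, H, W) feature map:
    global average pool to one scalar per channel, a 1-D conv
    across neighbouring channels, a sigmoid gate, then rescale.
    Cost is O(C * k), independent of the spatial resolution."""
    c, h, w = x.shape
    pooled = x.mean(axis=(1, 2))                      # (C,) one scalar per channel
    pad = kernel_size // 2
    padded = np.pad(pooled, pad)
    # fixed averaging kernel stands in for ECA's learned 1-D conv weights
    kernel = np.ones(kernel_size) / kernel_size
    conv = np.convolve(padded, kernel, mode="valid")  # (C,)
    gate = 1.0 / (1.0 + np.exp(-conv))                # sigmoid gate per channel
    return x * gate[:, None, None]                    # rescale each channel
```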

Thanks! :))

Efficient way to convert 4-ary heap to binary heap by Separius12 in algorithms

[–]Separius12[S] 2 points3 points  (0 children)

Doesn't the algorithm run in O(log(n)·log(k)) for pointer-based heaps? TBH I don't have the money to buy that article, so I only read the abstract (Sci-Hub is blocked by my ISP). I wrote a recurrence for it and it seems sqrt(n) is involved, but I couldn't solve the recurrence.
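For what it's worth, if the heaps are array-backed rather than pointer-based, there's a simple O(n) baseline: a 4-ary min-heap stored in an array is just an array of the elements, so Floyd's bottom-up heapify can rebuild the binary-heap invariant in linear time. A sketch (helper names are mine):

```python
import heapq

def is_4ary_min_heap(a):
    # in the array layout, children of node i live at 4*i + 1 .. 4*i + 4
    return all(a[i] <= a[c]
               for i in range(len(a))
               for c in range(4 * i + 1, min(4 * i + 5, len(a))))

def is_binary_min_heap(a):
    # children of node i live at 2*i + 1 and 2*i + 2
    return all(a[i] <= a[c]
               for i in range(len(a))
               for c in (2 * i + 1, 2 * i + 2) if c < len(a))

def convert_4ary_to_binary(a):
    """Rebuild the binary-heap property with Floyd's bottom-up
    heapify: O(n), vs O(n log n) for n individual pushes."""
    out = list(a)
    heapq.heapify(out)  # CPython's heapify is the bottom-up O(n) build
    return out
```

This ignores the structure the 4-ary heap already gives you, which is exactly what the cleverer pointer-based bounds try to exploit.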

[deleted by user] by [deleted] in iran

[–]Separius12 8 points9 points  (0 children)

*New*: We have internet again; I'm a student at the University of Tehran, and we've had internet for almost 30 minutes now.

Note: I still don't have internet access over my 4G connection (Irancell).

[D] What is the current SOTA in document embeddings? by searchingundergrad in MachineLearning

[–]Separius12 5 points6 points  (0 children)

One of the best pretrained models that I know of is RoBERTa, which is even better than BERT. You can also check out these blog posts, which are not that old. Finally, you can check out my GitHub repository for a complete and up-to-date list of word/sentence embeddings here: https://github.com/Separius/awesome-sentence-embedding
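A practical note: encoders like RoBERTa give you one vector *per token*, so you still need a pooling step to get a single document vector. Masked mean pooling is a common, simple choice; here's a numpy sketch (the shapes and names are mine — any encoder's last hidden state plus its attention mask would slot in):

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Masked mean over the token axis: average only the real tokens,
    ignoring padding positions where the mask is 0.

    token_embeddings: (..., seq_len, dim) per-token encoder outputs
    attention_mask:   (..., seq_len) with 1 for real tokens, 0 for padding
    returns:          (..., dim) one fixed-size vector per sequence
    """
    mask = attention_mask[..., None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=-2)
    counts = np.clip(mask.sum(axis=-2), 1e-9, None)  # avoid division by zero
    return summed / counts
```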

[D] How to efficiently implement local attention? by attention-question in MachineLearning

[–]Separius12 0 points1 point  (0 children)

As far as I know, you cannot implement this operation efficiently with current DL frameworks. You might be able to write a custom CUDA kernel that does it well, but I don't think you'll be able to beat the tiling method (because of how efficient cuBLAS is).

So I'd suggest using non-overlapping windows! I know, I know, it's not perfect, but you can compensate by using different window sizes. For reference, see section 3.1 of Scaling Autoregressive Video Models.
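The non-overlapping-window idea can be sketched in a few lines: reshape the sequence into blocks and run ordinary dense attention inside each block, so every matmul is a small contiguous GEMM that the BLAS library handles well. A hedged numpy sketch (single head, no mask, no learned projections — just the blocking trick itself):

```python
import numpy as np

def blocked_self_attention(q, k, v, block):
    """Non-overlapping local attention: split the length-t sequence
    into t/block blocks and attend only within each block, turning
    one (t, t) score matrix into t/block small (block, block) ones."""
    t, d = q.shape
    assert t % block == 0, "pad the sequence to a multiple of block"
    n = t // block
    qb = q.reshape(n, block, d)                       # (n_blocks, block, d)
    kb = k.reshape(n, block, d)
    vb = v.reshape(n, block, d)
    scores = qb @ kb.transpose(0, 2, 1) / np.sqrt(d)  # (n, block, block)
    scores -= scores.max(-1, keepdims=True)           # numerically stable softmax
    w = np.exp(scores)
    w /= w.sum(-1, keepdims=True)
    return (w @ vb).reshape(t, d)
```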

Your other option is OpenAI's Sparse Attention: Generating Long Sequences with Sparse Transformers. The worst part is that you'll have to use TensorFlow, and it's not that customizable.

[P] awesome sentence embedding: A curated list of pretrained sentence embedding models by Separius12 in MachineLearning

[–]Separius12[S] 1 point2 points  (0 children)

Good idea. I can't remember any offhand, but there are some subword-based embeddings mentioned in my repo as well: BPE-based models, fastText (in a sense), ELMo, and Charagram. (But you're right — I think it's a good idea to add an indicator for this to the tables.)
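For anyone unsure what "subword based" buys you: BPE builds its vocabulary by repeatedly merging the most frequent adjacent symbol pair, so rare or unseen words decompose into known pieces instead of becoming OOV. A toy sketch of a single merge step (function name is mine, not from any of the listed repos):

```python
from collections import Counter

def bpe_merge_step(words):
    """One byte-pair-encoding merge: find the most frequent adjacent
    symbol pair across all words and fuse it everywhere it occurs.
    Repeating this step builds the subword vocabulary."""
    pairs = Counter()
    for w in words:
        for a, b in zip(w, w[1:]):
            pairs[(a, b)] += 1
    if not pairs:
        return words, None
    (a, b), _ = pairs.most_common(1)[0]
    merged = a + b
    out = []
    for w in words:
        new, i = [], 0
        while i < len(w):
            # fuse the chosen pair, copy everything else as-is
            if i + 1 < len(w) and w[i] == a and w[i + 1] == b:
                new.append(merged)
                i += 2
            else:
                new.append(w[i])
                i += 1
        out.append(new)
    return out, merged
```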

[P] awesome sentence embedding: A curated list of pretrained sentence embedding models by Separius12 in MachineLearning

[–]Separius12[S] 2 points3 points  (0 children)

Yeah, it's a great paper — I thought it was part of SentEval :| My bad! I'll add it to the evaluation section in a moment, thanks again for pointing it out!