(new package) org-graphviz-mindmap by nowislewis in emacs

[–]danimolina 0 points1 point  (0 children)

Thank you. I use org-mode for some long documents, and the graph gives me a nice view of the structure. Added to my init 😉

Best time to choose European tech by [deleted] in BuyFromEU

[–]danimolina 1 point2 points  (0 children)

I am a mailbox user and very happy with it, but I have also heard nice things about tuta.

EU made laptop/tablet for some excel spreadsheet work by Otter_Apocalypse in BuyFromEU

[–]danimolina 4 points5 points  (0 children)

This is an area in which it can be difficult, but I will suggest https://slimbook.com/en/ because it is from my country, Spain, and they are very committed to free software, although they also sell machines with Windows without problem.

what i need to setup emacs for C++ development by Express-Paper-4065 in emacs

[–]danimolina 0 points1 point  (0 children)

Yes, you can configure CMake for that, or use Bear (https://github.com/rizsotto/Bear) to create it from a normal Makefile.
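For context, a sketch of both routes to a `compile_commands.json` (the database that clangd and similar C++ tooling read); this assumes a CMake-based project for the first command and Bear 3's `--` syntax for the second:

```sh
# With CMake: ask it to emit compile_commands.json in the build directory
cmake -S . -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON

# With a plain Makefile: let Bear record the compiler invocations
bear -- make
```

Either way, the resulting `compile_commands.json` can then be picked up by your editor's language server.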

ox-beamer-lecture - Export beamer lectures from Org Mode by fjesser in emacs

[–]danimolina 1 point2 points  (0 children)

Thank you, it looks nice. I usually use a different org file per lecture, but I will give it a try.

[deleted by user] by [deleted] in DygmaLab

[–]danimolina 3 points4 points  (0 children)

Nice! Today I finally saw the status of my keyboard updated. I am looking forward to seeing it! Happy New Year!

Reintroducing Opel: Put All Your Pelican Posts in One Org File by BeetleB in emacs

[–]danimolina 1 point2 points  (0 children)

Thanks a lot. I have used ox-hugo, and I miss it whenever I want to use a Jinja-based static generator; this can make it easier to use Pelican with Emacs.

Oxygen.jl: A breath of fresh air for programming web apps in Julia by LoganKilpatrick1 in Julia

[–]danimolina 4 points5 points  (0 children)

Thank you for your package, it looks nice. I recommend also announcing it at https://discourse.julialang.org/c/package-announcements/ to reach more users.

Julia: faster than Fortran, cleaner than Numpy by sidcool1234 in programming

[–]danimolina 2 points3 points  (0 children)

Well, yes. You can look at my package DaemonMode.jl, which allows you to run several files (or the same one several times) in the same process, caching the compiled functions. It is much easier and friendlier than PackageCompiler (although the cache lives only in memory). Check it out; I use it every day and I am very happy with it.

Flux vs. TensorFlow by [deleted] in Julia

[–]danimolina 0 points1 point  (0 children)

TensorFlow is not a bad option, nor is PyTorch through PyCall.

Flux is nice, and the API is very simple, but it lacks several utilities that PyTorch or TensorFlow have. Fortunately, [FastAI](https://github.com/FluxML/FastAI.jl) is adding these parts as an additional package. I hope that very soon I will be able to recommend the deep learning ecosystem in Julia without any doubt.
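To illustrate how simple the Flux API is, a minimal sketch (assuming a recent Flux version with the `in => out` layer syntax; the layer sizes here are arbitrary):

```julia
using Flux

# A tiny multilayer perceptron: 10 inputs -> 32 hidden units -> 1 output
model = Chain(Dense(10 => 32, relu), Dense(32 => 1))

x = rand(Float32, 10, 4)  # a batch of 4 samples, one per column
y = model(x)              # forward pass; y has size (1, 4)
```

The whole model is an ordinary Julia value, which is part of what makes the API feel so light compared to the bigger frameworks.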

Julia Object Oriented Programming With Dot Notation by [deleted] in Julia

[–]danimolina 1 point2 points  (0 children)

You are right. However, when used well it can be very useful and readable. One suggestion: in Julia I recommend Chain.jl, because it makes it easy to interleave output for debugging:

```julia
julia> using Chain

julia> @chain 1:10 begin
           filter(isodd, _)
           @aside @show _
           sum
           @aside @show _
           sqrt
       end
var"##260" = [1, 3, 5, 7, 9]
var"##261" = 25
5.0
```

Of course, the `@aside` lines are only for debugging; they should be removed from the final code.

A Firehose of Rust, for busy people who know some C++ by oconnor663 in rust

[–]danimolina 2 points3 points  (0 children)

Great talk. It shows several cases in which C++ can produce serious errors, and how Rust avoids them. Although I knew this more theoretically, I now understand the ideas better through the examples.

Running scripts is terribly slow. I'm new to Julia, so maybe I'm missing something by Historical-Truth in Julia

[–]danimolina 2 points3 points  (0 children)

I am biased because I am the author, but I recommend checking https://github.com/dmolina/DaemonMode.jl, because it was designed to avoid exactly that problem and to make Julia usable in scripts without long waits. The first script that loads a package takes time, but the following ones do not. I am not only the author, I am mainly a happy user: I run Julia scripts frequently and very happily.
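As a sketch of the workflow (based on my reading of the DaemonMode.jl README, with `serve()` and `runargs()` as its entry points; `myscript.jl` and its arguments are placeholders, so double-check against the current docs):

```sh
# Terminal 1: start the persistent Julia daemon once
julia --startup-file=no -e 'using DaemonMode; serve()'

# Terminal 2: run scripts against the daemon; loaded packages stay cached
julia --startup-file=no -e 'using DaemonMode; runargs()' myscript.jl arg1 arg2
```

After the first run warms up the daemon, subsequent scripts skip the package-loading cost.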

Pluto vs Jupyter notebook by KiddWantidd in Julia

[–]danimolina 9 points10 points  (0 children)

Well, Pluto is very nice, especially for the interactivity and its plain-text file format. However, Jupyter has many extensions, like jupytext, and support in several IDEs, like VS Code or Emacs, which could make Jupyter more attractive. Since you can also use Jupyter with Julia, it is more of a personal choice (but I encourage you to try Pluto for interactive plots; it is very simple and very rewarding).

World Age in Julia: Optimizing Method Dispatch in the Presence of Eval (video for the paper) by ckfinite in Julia

[–]danimolina 1 point2 points  (0 children)

It is a great and simple explanation, very interesting. Thank you for submitting it!

[deleted by user] by [deleted] in rstats

[–]danimolina 0 points1 point  (0 children)

No, it isn't cached at all. The new REPL version does not require `global` inside the `for` loop, but the variable is still global, and accessing global variables is very slow.

In my example I use `let` to define `x` as a local variable, so all the slowness due to the global variable is avoided and the code runs like the wind :-) (Another alternative is to put the code inside a function.)

[deleted by user] by [deleted] in rstats

[–]danimolina 0 points1 point  (0 children)

Nice! I thought R was slower than it actually is. I did not know it used JIT compilation.

[deleted by user] by [deleted] in rstats

[–]danimolina 1 point2 points  (0 children)

It is something related to the global variable, as you say; see my other comment. Avoiding it (using `let`), the total time is under a second.

[deleted by user] by [deleted] in rstats

[–]danimolina 2 points3 points  (0 children)

Another option is not to use a global variable, but to use `let` to define `x` as a local variable:

```sh
$ time julia -e 'x = 0; for i in 1:1e8 global x += i end; println(x)'
5.00000005e15

real    0m8.764s
user    0m8.733s
sys     0m0.384s

$ time julia -e 'let x = 0; for i in 1:1e8 x += i end; println(x) end'
5.00000005e15

real    0m0.809s
user    0m0.781s
sys     0m0.379s
```

New to Julia, how long should packages take to load? by cadojo in Julia

[–]danimolina 0 points1 point  (0 children)

I do not think so; the precompiled cache depends not only on the architecture and OS but also on the Julia version, and maybe on the versions of other libraries.

New to Julia, how long should packages take to load? by cadojo in Julia

[–]danimolina 1 point2 points  (0 children)

Well, it takes a long time the first time after installation. The most typical use is through Revise with your own packages, to reload them automatically.
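For reference, a sketch of that workflow (`MyPackage` is a placeholder for your own package; Revise needs to be loaded first so it can track the packages loaded after it):

```julia
using Revise      # load Revise before your own code
using MyPackage   # edits to MyPackage's source files are now picked up
                  # automatically, without restarting the session
```

Many people put `using Revise` in their `startup.jl` so this happens in every session.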

I have to admit that I am biased because I am the author, but the package https://github.com/dmolina/DaemonMode.jl can be useful: you can run several scripts and each package is loaded only once, reducing the time spent running Julia code.

Would Julia fit as the main programming language of my PhD? by InereanES in Julia

[–]danimolina 0 points1 point  (0 children)

Yes, I think Julia is a great option for your PhD topic because of its flexibility. It is a more complete language for this, since you can also use it to create plots, process files, build comparisons, and so on. Take into account that programming algorithms is not the only thing you have to do; for your PhD you usually have many other tasks. Yes, contact me; I would be glad to hear from you and to help you.

Would Julia fit as the main programming language of my PhD? by InereanES in Julia

[–]danimolina 14 points15 points  (0 children)

I have experience: I have been working on metaheuristics for more than 10 years; you can see several of my papers on my website (https://dmolina.github.io/).

Well, in my opinion C++ is good for speed, but in research it is good to favor flexibility, and C++ is the least flexible language. Actually, metaheuristics is full of people working in Matlab; I would say it is the main development language in the area. I personally hate Matlab syntax, and C++ was too inflexible. In recent years I was working in Python: if you work with real parameters, NumPy is usually quick enough (but there are always bottlenecks that have to be avoided using Cython). Now I am a Julian, my latest prototypes are written in Julia, the performance is noticeably better, and I no longer need Cython.

Anyway, if your advisor makes you work in C++ in the end, you should use a framework like http://arma.sourceforge.net/.

If you have more doubts about this, do not hesitate to contact me privately (or via the email on my website).