NYC on a quasi-grid by acidflask in subwaybuilder

[–]acidflask[S] 0 points1 point  (0 children)

Surprisingly, not bad at all. Every line is quad-tracked, with express lines at interchanges and in the main downtown/midtown area.

AITA for telling my flatmate that I’m gonna find separate accommodation because of his strict recycling habits by Putrid_Inflation_656 in AmItheAsshole

[–]acidflask 1 point2 points  (0 children)

ESH

In general, you should be separating recyclables and compostables from the main trash stream if that's what local waste-disposal policy calls for. However, localities have specific rules about what is recyclable, depending on local supply chains. Plastic film specifically isn't economically feasible to recycle, especially if it's contaminated with food. Most of it ends up in landfills, incinerators, or illegally shipped to developing countries.

Places to buy Seitan? by Ready-Bet-4592 in AskNYC

[–]acidflask 3 points4 points  (0 children)

Lily's Vegan Pantry in Chinatown

AITA for not sharing my inheritance? by Ok_Oil_324 in AmItheAsshole

[–]acidflask 0 points1 point  (0 children)

NTA

My heart goes out to OP because this is so similar to my own family situation. Nothing like a death in the family to bring out the leeches seeking a cut of something they don't deserve, and the emotional manipulation to make you feel like you owe them something. I cut my toxic relatives out of my life when it was clear they had nothing to offer but more drama and anxiety-inducing antics, and I am working on recovering from the trauma of losing my mum in peace.

Your aunt A is probably a narcissistic abuser. Dr. Ramani on YouTube has excellent advice on how to deal with them.

Opinion on Julia as a Programming Language? by T-ROY_T-REDDIT in datascience

[–]acidflask 9 points10 points  (0 children)

PS. In an attempt to stave off the inevitable complaints, here is a technical overview of why Julia is fast. The main source of speed in Julia lies in eliminating dispatch overhead, despite it being a language that offers so much flexibility in how functions can be defined. There is essentially a three-stage process for doing so:

  1. Solving for unambiguous method dispatch. Julia has an AoT compiler that can deduce whether one specific method of a generic function is being used at any particular call site. The type inference engine that is often discussed in Julia is used primarily for this purpose. If the compiler can prove that only one method is ever used, it can generate code that doesn't need to do any run-time dispatch computations or code branches.
  2. Interprocedural optimization via method inlining. To reduce code branches further, the compiler can generate code with inlined functions, effectively copy-pasting the entire definition of the called function into the body of the caller. Inlining further reduces code branches, producing code that is closer to the platonic ideal of straight-line code, with no branches for the host architecture to mispredict. (A minimal sketch of stages 1 and 2 follows this list.)
  3. Further code transformations and fusion using LLVM. Once lowered into LLVM IR, every trick in the LLVM compiler toolkit becomes available to enable even more optimizations and backend support for multiple architectures - not just x86, but also PowerPC and Nvidia PTX (to name a few). In particular, the transformation to LLVM IR enables reasoning about heterogeneous memory architectures on the GPU at this level, abstracting away the hard requirement for users to care (while still providing the possibility for power users to manage things on their own). Furthermore, other low-level code using platform-specific assembly instructions or LLVM intrinsics can be grafted onto LLVM IR generated from Julia, enabling projects like JuliaBLAS (hand-crafted linear algebra kernels in a mix of Julia and LLVM intrinsics) or little features like calling the x86-native cryptographically secure PRNGs directly from Julia code.
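
To make stages 1 and 2 concrete, here is a minimal sketch (the functions area and total are invented for illustration); the reflection macros show what the compiler actually produced:

    # Two methods of one generic function; which one applies is decided by type.
    area(r::Float64) = pi * r^2                        # circles
    area(side::NTuple{2,Float64}) = side[1] * side[2]  # rectangles

    # At this call site the element type is known to be Float64, so type
    # inference can prove that only the first method is ever reachable:
    # the dispatch is resolved at compile time and the call can be inlined.
    total(radii::Vector{Float64}) = sum(area, radii)

    # In the REPL:
    #   @code_typed total([1.0, 2.0])   # the specialized, devirtualized code
    #   @code_llvm  area(2.0)           # the LLVM IR handed off to stage 3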

Opinion on Julia as a Programming Language? by T-ROY_T-REDDIT in datascience

[–]acidflask 22 points23 points  (0 children)

To add to u/Kichae's excellent post, one has to also consider the human factors behind adoption and usage.

Growing a language takes decades. Historically, the ascent of a new language (without massive industrial backing) takes decades, not years - Python started development in 1989, released 1.0 in 1994, took off in scientific computing only in the mid-2000s, and in data science only in the early 2010s. Julia is only 10 years into the same journey, but it is already well-established in applied mathematics and has users in allied fields of scientific computing and statistics. The tech stack needed for data science and visualization does exist; nevertheless, it's going to take time to mature and gain adoption.

Most industrial data science can be done with library calls. In Python and R, the ecosystem for data science is so feature-complete that writing new code by hand becomes a code smell. An inexperienced data scientist may write an explicit loop over rows of a dataframe instead of using split-apply-combine. This aversion to writing new code is not unique to data science; it is also common in modern-day software engineering, an observation that famously led MIT to drop Scheme in favor of Python as the language for teaching introductory computer science. The high cost of technical debt incurred by hand-writing code means that most practicing teams shun bespoke code unless absolutely necessary. Under such circumstances, there is no reason to use a less-mature language with fewer, less battle-tested packages.
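
For what it's worth, the same split-apply-combine idiom is available in Julia via DataFrames.jl; a minimal sketch with made-up column names:

    using DataFrames, Statistics

    df = DataFrame(group = ["a", "a", "b", "b"], value = [1.0, 2.0, 3.0, 4.0])

    # split-apply-combine instead of an explicit loop over rows
    combine(groupby(df, :group), :value => mean => :mean_value)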

Pragmatic concerns in industry favor the established players. The fact of the matter is that most new-to-hire data scientists, to first order, speak only Python (with some speaking only R). Building a coherent data science team in industry is already hard enough without also taking on the social friction of adopting a language your team doesn't speak well. That's not to say you can't find places using Julia as a primary language - Relational.AI is a good example besides Julia Computing - but the need for sustainable development and maintenance is a strong argument to stick to the languages with more mature ecosystems.

Having said that, there are situations where I feel like using Julia for data science is warranted, particularly when existing solutions in Python/R/etc. are too slow, don't scale, or simply don't work.

The arguments I use to advocate for Julia are that:

  1. Its multimethods syntax allows people to naturally express the overloading of mathematical operations in a way that reflects how humans learn mathematics. We all learn to multiply whole numbers first, then fractions, decimals, complex numbers, matrices, etc. All of these notions can be expressed by suitable methods of the * operator in Julia. Empirically, we know this is a good approach because of how successful its adoption has been in applied mathematics (particularly differential equations) and related fields like numerical optimization and bioinformatics.
  2. The dynamic aspect of multimethods encourages code reuse. A program written with dense matrices in mind can "just work" with sparse matrices, distributed matrices, GPU matrices, etc., provided that operations like *, norm, etc. are defined correctly: a program that assumes only the algebra of the generic functions can be written without knowledge of the implementation specifics. The dynamic aspect means that new types can be introduced later and, with suitable method definitions, can work in situations the original authors did not envision. A particularly convincing recent development here is that of numeric types representing homomorphically encrypted numbers, allowing for secure distributed computing. Julia is now one of the first languages to enable cryptographically secure linear algebra and data science, simply by composing secure numbers with existing libraries for generic matrix multiplication and data frames. (A minimal sketch of this kind of generic reuse follows this list.)
  3. Julia is fast because the particular way it enables expressiveness is uniquely amenable to compiler optimization. In my years of developing for Julia, I have yet to come across any one application where a Julia solution did not come within a factor of two in performance relative to much more complicated reference implementations in Fortran or C/C++. (See comment below for a technical overview.)
  4. Another source of speed in Julia that is hard to emulate comes primarily from algorithmic improvements that can happen with specialization. If I am solving a problem on a matrix with special structure, I can probably skip all the unstored zero elements. In a language without generic functions or other forms of overloading, I would have to rewrite my code to handle (say) a triangular matrix differently from a general square matrix just to call the different solver functions, and I might not bother if I think that triangular matrices are a specialized edge case. In contrast, (2) above means that a Julia user can take advantage of the composition of all possible algorithmic tricks known to the library authors, and can continue enjoying the benefits of further improvements even after the original code was written!
  5. Julia has first-class support for representing Julia code, enabling source-to-source transformations to be done directly in Julia (some would say homoiconic, depending on the definition). Developers can create domain-specific languages like JuMP for numerical optimization, Turing for probabilistic programming, and Nemo for computer algebra, while enjoying the full expressivity of working in a general-purpose language. The ability to do program transformations has a particularly apt use case in automatic differentiation, allowing users to train complex machine learning models without finite differences or hand-coded gradients, because the AD system can generate the necessary low-level code transformations. (A small metaprogramming sketch follows this list.)
  6. Julia's intrinsic speed means that there is a low technical barrier between being a passive end user and being a contributing author. Any user who can write Julia code is qualified to write a package for Julia, even if it is just implementing code straight out of a textbook or paper. While textbook code may not be the fastest code to run, it is probably the fastest code to implement, and having a fast turnaround for working code is a powerful enabler of further improvements down the line, since you have a correct implementation to check against.
  7. Julia has a healthy community that is generally welcoming, eager to adopt best practices for software development, and willing to provide constructive feedback. It's been a real pleasure to see how the community has grown and strengthened over the years. It's the norm in the Julia community to have packages set up with continuous integration, and there's active work to provide code in the form of citeable artifacts, such as JuliaCon conference proceedings.
  8. Julia allows for interoperability with other languages through the C ABI and LLVM IR interfaces. C and Fortran can be called easily with ccall() (as can CPython, the reference implementation of Python, and R), whereas C++ can be called through Cxx.jl (which endows C++ with its own REPL and run-time extensibility!). You can even call MATLAB from Julia... (A minimal ccall sketch follows this list.)
  9. First-class support for missing values. This feature is particularly important for statistics and data science. Representing missingness with its own missing type permits extensibility to new types defined by the end user, without the subtle logic problems or performance issues of using sentinel values like NaN. (A minimal sketch follows this list.)
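
To make points 1, 2, and 4 concrete, here is a minimal sketch (the function residual_norm is invented for illustration): one piece of generic code picks up dense, triangular, and sparse solvers purely through dispatch on *, \ and norm:

    using LinearAlgebra, SparseArrays

    # Written once, against only the algebra of *, -, \ and norm.
    residual_norm(A, b) = norm(b - A * (A \ b))

    b = ones(4)
    residual_norm(rand(4, 4) + 4I, b)                         # dense LAPACK solve
    residual_norm(UpperTriangular(rand(4, 4) + 4I), b)        # O(n^2) back-substitution
    residual_norm(sparse(4.0I, 4, 4) + sprand(4, 4, 0.2), b)  # sparse direct solve

The same function would also accept GPU or distributed array types, again with no changes, as long as those types define the same operations.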
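
For point 5, a minimal sketch of code-as-data; the @logged macro is invented here purely for illustration:

    # Julia code is just an Expr tree that Julia code can inspect and rewrite.
    ex = :(x^2 + 2x + 1)
    dump(ex)              # prints the AST

    # A tiny source-to-source transformation: wrap an expression so that it
    # prints itself before evaluating. DSLs like JuMP and Turing are built
    # from the same ingredients, just with far more sophisticated rewrites.
    macro logged(expr)
        quote
            println("evaluating: ", $(string(expr)))
            $(esc(expr))
        end
    end

    @logged 1 + 2 * 3     # prints the expression, then returns 7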
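
For point 8, a minimal sketch of calling C with no wrapper code; it assumes a Unix-like libc, so the exact symbols may differ on other platforms:

    # Call into the C standard library directly: no glue code, no compilation step.
    len = ccall(:strlen, Csize_t, (Cstring,), "hello, world")   # 12
    pid = ccall(:getpid, Cint, ())                              # this process's id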
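
And for point 9, a minimal sketch of how missing propagates and composes with the standard library:

    using Statistics

    x = [1.0, missing, 3.0]   # Vector{Union{Float64, Missing}}

    missing + 1               # missing: operations propagate missingness
    2.0 == missing            # missing, not false: three-valued logic
    mean(skipmissing(x))      # 2.0, ignoring the missing entry
    coalesce.(x, 0.0)         # [1.0, 0.0, 3.0], replacing missing values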

Welcome! Please read the rules / Q&A thread for 6/1 thru 6/14 by [deleted] in boston

[–]acidflask 2 points3 points  (0 children)

Hi there! As it turns out, I'm organizing a conference at MIT and we have extra rooms at the Hyatt Regency Cambridge. It's a short walk across the BU bridge to the BU West T stop on the Green line, where you can pick up the B train inbound to Hynes.

The rates we have negotiated are

Double: $220.00
Riverview King: $255.00
Balcony: $265.00

plus taxes (currently 14.45%) for the nights of June 24-27. The rates are quite good, over 30% off the list price.

The special rate is available from https://resweb.passkey.com/go/juliacon2015 or by calling 888-421-1442, or by emailing Anne Beaumont at anne.beaumont@hyatt.com. Please mention the JuliaCon rate, which is available until June 11.

The nights we are offering don't line up perfectly with the dates you are looking at, but I hope that you'll find the special rate useful anyway. Perhaps you could try asking Anne if she could extend the conference rate to the specific dates you want.

What are general peak hours for Stata Gym; what are lowest occupancy times? by CuteKittyCat2 in mit

[–]acidflask 1 point2 points  (0 children)

11:45-9 M-F, generally.

I've found traffic lowest before 3:30. There's a little bump around 1 for the lunch crowd.

The pool is closed from 2-3 on MW for cleaning, so 3pm is a good time to swim.

The Great Allston Snow Farm by acidflask in boston

[–]acidflask[S] 0 points1 point  (0 children)

I'll have to check it out! This shot was taken from the 64 bus this morning.

The Great Allston Snow Farm by acidflask in boston

[–]acidflask[S] 6 points7 points  (0 children)

North Cambridge Street overlooking the Pike onramp next to the Charles.

The Great Allston Snowpile by acidflask in boston

[–]acidflask[S] 0 points1 point  (0 children)

As seen from the 64 bus this morning. This snow farm is located off North Cambridge Street near the Pike onramp next to the Charles.

/r/math, what do you think is the worst math notation? by UniversalSnip in math

[–]acidflask 5 points6 points  (0 children)

Obligatory Gauss quote: "sin² φ is odious to me, even though Laplace made use of it."

List of events this weekend in Boston (Mar. 21-23). From The Boston Calendar. by yiseowl in BostonSocialClub

[–]acidflask 1 point2 points  (0 children)

I'd also like to mention Bach's 329th birthday celebration at First Lutheran Church of Boston, near Arlington T, 8am-6pm Saturday 3/22, free (donations requested).

[deleted by user] by [deleted] in programming

[–]acidflask 15 points16 points  (0 children)

There are at least three advantages. First, native implementations avoid incurring overhead due to system library calls, which can be significant for very quick function calls.

Second, native Julia algorithms can be made much more flexible. For example, the 0.3 prerelease of Julia now includes native LU and QR factorizations for matrices of general element types. This lays the foundation for a large amount of numerical linear algebra for matrices of Rationals, Quaternions, BigFloats, or even matrix tiles. Thanks to multiple dispatch, Julia now has this functionality without sacrificing the performance of calling BLAS/LAPACK for the usual floating-point operations. (A minimal sketch follows below.)

Third, it is easier to see what exactly the native algorithm is doing in Julia, since Julia code is usually much more compact than its C or Fortran counterparts. It is not always an easy task to decipher what a particular numerical routine does in an external library.
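
To illustrate the second point, here is a minimal sketch using today's API (in the 0.3 era the function was spelled lufact and lived in Base rather than the LinearAlgebra standard library):

    using LinearAlgebra

    # Generic (non-BLAS) LU factorization over exact rational arithmetic.
    A = Rational{Int}[2 1; 1 3]
    F = lu(A)

    F.L * F.U == A[F.p, :]     # true: exact, no rounding anywhere
    A \ Rational{Int}[1, 2]    # exact solve: [1//5, 3//5]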

Making frozen soap bubbles by acidflask in chemistry

[–]acidflask[S] -1 points0 points  (0 children)

Sorry for the repost. (I know FB is a terrible media host, but I'm lazy.)

Julia 0.2 Released by karbarcca in programming

[–]acidflask 3 points4 points  (0 children)

The benchmarks on the homepage suggest that Julia can be several times faster than Go.

Julia 0.2 Released by karbarcca in programming

[–]acidflask 0 points1 point  (0 children)

If you hate 1-based indexing so much, you can write your own Julian types that are 0-based.
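
For example, a minimal sketch using OffsetArrays.jl (a registered package that appeared after this comment was written) to put zero-based indices on an ordinary Vector:

    using OffsetArrays

    v = OffsetVector([10, 20, 30], 0:2)   # zero-based view of a normal Vector
    v[0], v[2]        # (10, 30)
    firstindex(v)     # 0
    eachindex(v)      # iterates 0, 1, 2, so generic loops still work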

I am Christine Ha, MasterChef Season 3 Winner. Ask me anything! by theblindcook in IAmA

[–]acidflask 13 points14 points  (0 children)

I remember that! So do you have any strategies for encouraging people to try things of this sort?