
Developers weigh in: Julia or Python for machine learning, which one is better?

WBOY
2023-04-11 12:28:02

Which programming language will you choose in 2022?

For the past few years there has been talk that Julia will replace Python as one of the most popular programming languages. Whether that happens remains to be seen, but Julia's strengths as a tool for scientific computing are already clear, which means programmers now have another real choice.

In fields such as data science and artificial intelligence, a careful comparison of Julia and Python shows that Julia can handle the same tasks Python can, often more efficiently and with concise, elegant syntax, yet it remains far less widely adopted than Python.

A recent hot post on Reddit drew plenty of discussion: several Julia package developers talked about the current state of ML in Julia and compared it with the Python ML ecosystem.

Original post address:

https://www.reddit.com/r/MachineLearning/comments/s1zj44/r_julia_developers_discuss_the_current_state_of/

Jordi Bolibar from Utrecht University believes that "Julia does have great potential for machine learning, but its current state is a bit mixed. More specifically, the main reason I stick with Julia for SciML is that the DifferentialEquations.jl library works very well, and I haven't found anything comparable in Python. The real pain point for my research, however, is the AD part. Since I started using Julia, I have hit two bugs in Zygote that slowed my work down for several months. I still think Julia is the best choice for SciML, but these libraries (and their documentation) should be optimized to be more user-friendly."
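For readers unfamiliar with the library Bolibar mentions, here is a minimal sketch of the kind of workflow DifferentialEquations.jl covers: defining and solving a simple ODE. The equation, parameter values, and solver choice are arbitrary illustrative picks, not taken from his research.

```julia
using DifferentialEquations

# Exponential decay du/dt = -1.01u; rate and initial condition are arbitrary example values.
f(u, p, t) = -1.01 * u
u0 = 0.5
tspan = (0.0, 1.0)

prob = ODEProblem(f, u0, tspan)
sol = solve(prob, Tsit5())      # Tsit5 is a standard non-stiff Runge-Kutta solver

println(sol(0.5))               # the solution object can be interpolated at any time point
```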

Reddit user @jgreener64 said: "ML in Julia is very powerful in certain areas. Anything is possible in Julia, but the problem it faces is that doing ML there requires a lot of prior knowledge or a lot of time spent searching and on trial and error. On a personal level, I'm currently developing novel differentiable algorithms in Julia."

Beyond the general discussion, Julia package developer Christopher Rackauckas answered the seven questions readers were most concerned about. Rackauckas is a mathematician and pharmacologist at MIT and the University of Maryland who programs primarily in Julia. He runs a blog dedicated to Julia, mathematics, and stochastic biology, and has developed a number of Julia libraries, including (but not limited to) DifferentialEquations.jl and Pumas.

Christopher Rackauckas

Questions include:

  1. Where is ML in Julia really shining today? In what ways will this ecosystem outperform other popular ML frameworks (e.g. PyTorch, Flax, etc.) in the near future and why?

  2. What are the functional or performance shortcomings of Julia’s current ML ecosystem? When will Julia become competitive in these areas?

  3. How do Julia's standard ML packages (e.g. deep learning) compare to popular alternatives in terms of performance (faster, slower, same order of magnitude)?

  4. Are there any important experiments benchmarking Julia against popular ML alternatives?

  5. If a company or institution is considering creating a position to contribute to Julia’s ML ecosystem, are there any best practices? Why should they do this? Which contributions have been the most impactful?

  6. Why should independent developers working with other frameworks consider contributing to Julia's ML ecosystem?

  7. What packages do Julia developers tend to use for certain specific tasks? What do Julia developers hope to add that doesn't currently exist?

Below we highlight a few of the questions that drew the most interest:

Question 3: How does Julia perform in "Standard ML"?

Julia's kernel speed is fine: on the CPU we do very well, and on the GPU everyone just calls the same cudnn kernels anyway. Julia's AD speed is also fine. Zygote may have some overhead, but compared with Jax/PyTorch/TensorFlow it is fast in most cases; PyTorch's overhead in particular is much higher, though it is not even measurable in standard ML workflows anyway, since a large enough matrix multiplication swamps allocation and other O(n) costs. Where Julia falls short is kernel fusion: in most benchmarks, if you look closely, you will see that Julia is not fusing the conv or RNN cudnn calls.
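To make the overhead discussion concrete, here is a minimal sketch of a reverse-mode gradient call in Zygote; the function and array sizes are arbitrary examples, not drawn from the thread.

```julia
using Zygote

# A tiny scalar-valued function of a weight matrix and an input vector.
loss(W, x) = sum(tanh.(W * x))

W = randn(3, 3)
x = randn(3)

# Reverse-mode gradients with respect to both arguments.
gW, gx = Zygote.gradient(loss, W, x)
```

For matrices this small, per-call framework overhead dominates; once the matrix multiplication is large enough, as the answer notes, that overhead disappears into the compute itself.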

Question 4: What important experiments and benchmarks should we track?

XLA's distributed scheduler is very good. When we think about scaling, we should ignore PyTorch and look at DaggerFlux versus TensorFlow/Jax. XLA has more flexibility to change operations, so I think XLA is the winner here, and we will need to use e-graph tricks to match it. Another thing worth noting is the "missing middle" of automatic differentiation, which still needs to be solved.

Question 7: What are the recommended software packages?

I tend to use Flux when needed, but everyone should try DiffEqFlux. In terms of existing kernels, Flux is the most complete, but I find its style tiresome. I'd like a Flux that uses explicit parameters rather than implicit ones, with those parameters represented as ComponentArrays.
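As a rough sketch of what "explicit parameters stored in a ComponentArray" might look like (an illustrative pattern, not an existing Flux API), consider a small hand-written dense layer:

```julia
using ComponentArrays, Zygote

# A small dense layer written with explicit parameters instead of implicit, globally tracked ones.
dense(p, x) = tanh.(p.W * x .+ p.b)

# All trainable parameters live in one named ComponentArray.
p = ComponentArray(W = randn(4, 3), b = zeros(4))
x = randn(3)

loss(p, x) = sum(abs2, dense(p, x))

# The gradient is taken with respect to the whole parameter vector at once.
gp = Zygote.gradient(p -> loss(p, x), p)[1]
```

Passing parameters explicitly like this makes it easy to hand a single flat parameter vector to ODE solvers and optimizers, which is one reason the style fits SciML workflows well.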


Statement:
This article is reproduced from 51cto.com. In case of infringement, please contact admin@php.cn for removal.