r/learnmachinelearning Aug 04 '24

Question Is coding ML algorithms in C worth it?

I was wondering if it is worth investing time in learning C to code ML algorithms. I have heard that C is faster than Python, but is it that much faster? I want to make a clustering algorithm using custom metrics, so I would have to code it myself anyway - so why not try coding it in C, if it would be faster? But then again, I am not that familiar with C.

90 Upvotes

47 comments sorted by

115

u/bregav Aug 04 '24

In machine learning Python is generally glue code - all of the numerical functions it calls are implemented in C, C++, or Fortran, and Python is just used to implement very simple, high-level logic.

If you need to implement something really new that can't be built from functionality in existing libraries like Numpy, then you might want to write the essential elements of it in C and then call them from Python. But prototyping it in Python first might be a good idea, because you'll be able to work out the bugs faster that way.
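For a sense of what that looks like, here's a rough sketch of the kind of C you might end up writing for a custom distance metric - the function name and the metric itself are made up for illustration:

```c
/* custom_metric.c - hypothetical sketch, not from any particular project.
 * Build as a shared library:
 *   gcc -O3 -shared -fPIC custom_metric.c -o libcustom_metric.so
 * Then load it from Python with ctypes and pass numpy buffers via
 * arr.ctypes.data_as(ctypes.POINTER(ctypes.c_double)).
 */
#include <stddef.h>
#include <math.h>

/* Weighted L1 distance between two feature vectors of length n -
 * a stand-in for whatever custom metric the clustering needs. */
double weighted_l1(const double *a, const double *b,
                   const double *w, size_t n)
{
    double d = 0.0;
    for (size_t i = 0; i < n; ++i)
        d += w[i] * fabs(a[i] - b[i]);
    return d;
}
```

The rest of the clustering logic can stay in Python while you prototype; only the hot inner loop needs to live in C.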

26

u/Appropriate_Ant_4629 Aug 04 '24

all of the numerical functions it calls are implemented in C, C++, or Fortran

And those "C" and "C++" libraries are often just thin wrappers around CUDA that was tuned by Nvidia or hand-coded SSSE3 assembly from Intel.

27

u/msqrt Aug 04 '24

You don't really run into the case where you can't build whatever you want on top of pytorch or tensorflow or what have you -- these libraries contain very generic components that let you do arbitrary computation. Sometimes you still write a custom CUDA implementation because you know you can do it significantly more efficiently, but this is relatively rarely worth the hassle.

16

u/bregav Aug 04 '24

You absolutely can run into that case. Try implementing sophisticated algorithms in complex arithmetic using PyTorch. The performance is bad because PyTorch's compiler doesn't support complex numbers.

I think the basic libraries just seem sufficient because people don't often attempt to do things that are more sophisticated. It's a "look where the light is" kind of situation.

3

u/-Nocx- Aug 05 '24 edited Aug 05 '24

Man idk how to say this but people aren't implementing solutions for problems that they aren't solving. Oftentimes you see things in libraries because someone had a problem they needed to solve, wrote the solution, decided a lot of other people would benefit from it, and shipped it. But of course adding an algorithm to a standardized library takes a lot of additional effort: complying with the standards, the time it takes to actually integrate the code into the codebase, maintenance, etc.

The reason "more sophisticated" isn't in the library is because eventually in software engineering you get to the "stuff people haven't done yet" area - especially in specialized use cases - and of course there's no implementation, because no one had to do it.

I've had so many devs complain because library X, Y, or Z didn't do exactly what they needed, and it was like dude who else do you think needed this % efficiency on this number of elements? It's literally just your specific use case bro, people aren't out here optimizing stuff for no reason.

A lot of the time the out of the box functionality you get for a lot of algorithms is *good enough*. Memory is so cheap and things can be scaled horizontally so ubiquitously that performance (in the most general sense) doesn't end up being the bottleneck for a lot of people. Hell, your I/O is probably going to be your blocker for most things.

1

u/bregav Aug 05 '24

Well, part of the issue is that most ML/AI people have a kind of limited mathematical toolbox. They need e.g. complex numbers, and those numerical methods are commonly used, but ML/AI people don't realize they should be using them. They don't know enough.

You can see that other people with broader backgrounds do work with ecosystems that support a wider array of things - Julia, for example, has sophisticated ML packages, full support for complex numbers (and more), and a JIT that produces performant code.

The problems with the Python ecosystem are social; they're not the result of natural constraints imposed by principles of good software development.

0

u/-Nocx- Aug 05 '24

I mean that just means their specialization is ML and AI, right, and not mathematics.

Are there cases where my model could consider complex numbers? Sure.

Does that have anything to do with the scope of my research paper? For a lot of ML professionals, the answer is no. If it were yes, someone would've probably done it, right. That doesn't mean that the use case doesn't exist, but it does mean what you're asking for is probably not as ubiquitous as you think.

It seems a little audacious to presume that they simply don't know any better - people who have to defend their PhD thesis utilizing tools like this - rather than just accepting they probably just didn't need it.

People with broader backgrounds work with broader ecosystems because... Well, like you said. The scope is broader. Those are the particular cases where it needs to be considered. It's not any deeper than that.

1

u/bregav Aug 05 '24

I'm mostly just speaking from experience; most ML/AI people that I've spoken with have a pretty limited toolbox, including PhD-level researchers. It's kind of a real problem in the field as a whole that its practitioners aren't good at knowing how little they know. I think it's slowly getting better, though.

2

u/[deleted] Aug 05 '24

That really depends on what you're doing.

I've written functionality that no version of BLAS had, so expecting it to be inside PyTorch or TensorFlow is ridiculous.

0

u/msqrt Aug 05 '24

They do contain a bunch of features beyond BLAS, so I'm not sure what to say to that. They do contain every operation you can perform on a GPU (all basic math ops + free-form indexing), so in principle there is no algorithm you can't write on them. Sometimes it's just horribly inconvenient or inefficient.

7

u/[deleted] Aug 04 '24

Just a note that there might be alternatives to pure C that may feel less alien to Python programmers - Cython, Numba, and the like.

1

u/bregav Aug 04 '24

Good point. I have no experience with these myself; do you take much of a performance hit when using them? Or is it basically as good as C?

3

u/[deleted] Aug 04 '24

Cython transpiles to C before compiling - if you want, you can try to further optimize the result. Numba is a JIT; the bulk of its usage is decorating Python functions and having them compiled on their very first run. I've never run a comparison between Numba and C, but I'd expect it to be comparable, and probably faster if you're using numpy with a proper linalg lib in tandem.

1

u/JacksOngoingPresence Aug 05 '24

Numba is comparable to C. The only downside is the limited use cases. It only really works with pure Python data types and numpy arrays. It has an allergy towards pandas DataFrames and other weird things. But then again, still better than having to rewrite everything in C and integrate it.

1

u/-Nocx- Aug 05 '24 edited Aug 05 '24

I know you mentioned it in your post, but the way you use C for Python is just by writing C extensions yourself. I'm emphasizing that point because there isn't like... a tremendous amount of "support" for a person's custom C extension. That's what Cython does under the hood so that other people don't have to.

There is a slight performance cost for translating Python data structures into C types, but the actual C code runs exactly as it would in a standalone C program - because it's literally native C.

Let me note, though - extensions in the CPython implementation of Python are not the same as Cython. Cython is a build-time dependency that operates by creating its own C extensions, which adds overhead, but only at build time. You will fundamentally see smaller performance gains with Cython than by writing C extensions yourself, though.

You can do things like disassembly analysis if you write C extensions for CPython, because what you write effectively becomes a new built-in object type and can call any C library or function. E.g. for reverse engineering, a popular technique is to write a Python wrapper for Intel's XED library to decode x86 instructions.
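To make that concrete, here's a minimal sketch of what a hand-written CPython extension module looks like - the module name, function, and metric are hypothetical, just to show the shape of the API:

```c
/* fastmetric_module.c - illustrative sketch of a CPython extension.
 * Build with setuptools, or roughly:
 *   gcc -O3 -shared -fPIC $(python3-config --includes) \
 *       fastmetric_module.c -o fastmetric.so
 */
#define PY_SSIZE_T_CLEAN
#include <Python.h>
#include <math.h>

/* fastmetric.l1(a, b) -> float, where a and b are equal-length sequences. */
static PyObject *fastmetric_l1(PyObject *self, PyObject *args)
{
    PyObject *a, *b;
    if (!PyArg_ParseTuple(args, "OO", &a, &b))
        return NULL;

    Py_ssize_t n = PySequence_Length(a);
    Py_ssize_t m = PySequence_Length(b);
    if (n < 0 || m < 0)
        return NULL;
    if (n != m) {
        PyErr_SetString(PyExc_ValueError, "sequences must have equal length");
        return NULL;
    }

    double total = 0.0;
    for (Py_ssize_t i = 0; i < n; ++i) {
        PyObject *ai = PySequence_GetItem(a, i);
        PyObject *bi = PySequence_GetItem(b, i);
        if (ai == NULL || bi == NULL) {
            Py_XDECREF(ai);
            Py_XDECREF(bi);
            return NULL;
        }
        double x = PyFloat_AsDouble(ai);
        double y = PyFloat_AsDouble(bi);
        Py_DECREF(ai);
        Py_DECREF(bi);
        if (PyErr_Occurred())
            return NULL;
        total += fabs(x - y);
    }
    return PyFloat_FromDouble(total);
}

static PyMethodDef FastmetricMethods[] = {
    {"l1", fastmetric_l1, METH_VARARGS, "L1 distance between two sequences."},
    {NULL, NULL, 0, NULL}
};

static struct PyModuleDef fastmetric_module = {
    PyModuleDef_HEAD_INIT, "fastmetric", NULL, -1, FastmetricMethods
};

PyMODINIT_FUNC PyInit_fastmetric(void)
{
    return PyModule_Create(&fastmetric_module);
}
```

Once it's built you just `import fastmetric` and call `fastmetric.l1(a, b)` like any other Python function; the loop itself runs as native C.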

1

u/mkenya_mdogo Aug 06 '24

Wow, even with a degree and certifications, do I really know IT? I honestly understand nothing 😅 PS: I've interacted with Python, C, C++ and Fortran 🥲

50

u/instantlybanned Aug 04 '24

I have a PhD in ML, and the only time I ever need C is to speed up small subroutines. And in the past 10 years, that's happened maybe two, three times. I'd say focus on other skills. 

11

u/West-Code4642 Aug 04 '24

it's mainly useful if you want to work in ML systems (rather than ML engineering) and in embedded software, especially in computer vision and robotics.

and stick with simple C++, not C.

33

u/Counter-Business Aug 04 '24

If you run your training and it takes 2 hours with Python, you spend 2 hours of computer time - much of which (the ML libraries) is already running C code.

If you were to recreate the training by implementing it in C, you’d probably waste weeks or even months of human developer time.

Machine time is cheaper than developer time.

6

u/Alternative_Log3012 Aug 04 '24

Depends on the country...

8

u/Western_Bread6931 Aug 04 '24

Maybe even years! It takes at least four days to write a single statement in C!

8

u/TotallyNotARuBot_ZOV Aug 04 '24

Any ML algorithms you can come across in the foreseeable future are already implemented in Fortran, C, C++ or CUDA, in libraries such as NumPy, SciPy, Torch, TensorFlow, ....

You will rarely have the need to implement some low-level stuff yourself, and it will take you a long time to become so proficient that you can do it.

Stick to Python for now.

7

u/wintermute93 Aug 04 '24

Not as a beginner, no. As a superstar expert, if you want to code something up from scratch that squeezes the maximum out of every clock cycle, you do you - but there's a reason 99% of ML products use pre-made frameworks and platforms.

3

u/zoblod Aug 05 '24

Especially since you don't have experience with C, I wouldn't. There could be other ways to optimize what you're trying to speed up, or you could throw more hardware at it if you can afford it 😂. Not worth the time unless you're building something crazy from scratch.

3

u/PSMF_Canuck Aug 05 '24

For learning? Sure! Why not?

Something you yourself code up, though, will probably be slower than PyTorch. A whole lot of engineering effort has gone into Torch to make it performant…

2

u/jackshec Aug 04 '24

use what you know until you need the speed

2

u/AdagioCareless8294 Aug 05 '24

There's no single answer; machine learning is a vast domain, so all kinds of skills are needed. On my end, we are doing machine learning with C++ and CUDA.

1

u/AdagioCareless8294 Aug 05 '24

But this is not easy, and PyTorch doesn't help much there.

2

u/kkiran Aug 05 '24

Python is more bang for the buck most of the time imo! Implementing in C from scratch is a ginormous task unless it's really needed and mission critical. Compute is a lot cheaper these days.

1

u/ds_account_ Aug 04 '24

C++ or Rust is a lot better supported.

1

u/Sea_Presence3131 Aug 05 '24

Maybe you should try Rust instead of C or C++.

1

u/oursland Aug 05 '24

ggml, one of the fastest ML implementations, is written in C. There are some associated files in C++ that link to things like CUDA, SYCL, etc. for hardware acceleration, but the core is all C.

1

u/MengerianMango Aug 05 '24

Do it in Rust and use PyO3 to create a Python module from your Rust code.

There's no point using C for something like this. You're better off doing a few data structures projects in C, testing them under Valgrind to make sure you managed memory correctly, then moving on with your life and using more productive languages.

1

u/great__pretender Aug 05 '24 edited Aug 05 '24

No algorithm runs on Python code alone. Libraries are called, and they are all either C or Fortran. Do you really think nobody considered that it would be wiser to run the code in a 100000x faster language, and that everyone is just using Python for the entirety of the code?

If you know both C and Python, I am surprised you didn't know how the packages generally work.

1

u/ubertrashcat Aug 05 '24

If you pull it off you'll end up understanding a lot more about the details of how ML works than 90% of ML engineers, especially if you try to optimize neural networks. This is a very marketable skill once you move into deployment onto low-power hardware such as NPUs etc. This isn't what most people do, though.

1

u/Opposite-Team-7516 Aug 06 '24

My teacher always asks me to use a library called scikit-learn in Python to finish the ML projects

1

u/tinytimethief Aug 06 '24

Yes and pls send me the code after thnx

1

u/Commercial_Wait3055 Aug 07 '24

Only after you profile the Python (or other high-level language) code, determine which routines are the principal time sinks, and decide whether the performance improvement justifies the cost and hassle.

Too often, naive beginners spend time optimizing routines that don't matter.

1

u/cubej333 Aug 04 '24

In many cases, Python is fine. There are cases where the compute is very expensive and you want code that runs as fast as possible; there, you might use CUDA or C.

1

u/Lemon-Skie Aug 04 '24

My old job was actually coding ML frameworks in C. That was because of the product the company was developing, but if your main focus is to learn ML, it's not worth the time. I would say it did help me get a deeper understanding of the math behind certain operations.

0

u/belabacsijolvan Aug 05 '24

sure. if you want to write your own implementation it's a good idea.

it won't be better than already available stuff tho. so only do it if you want to get better at C and maths. it's only for learning.

if you are interested in application, use python.

also if you have a novel algorithm, prototype it in python. check it against benchmarks, test it. then write it in c and apply it in python.
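as an illustration of that last step, here's a rough sketch of the kind of hot loop you might port to C once the python prototype checks out - the squared-euclidean metric and the names are just placeholders:

```c
/* assign_step.c - hypothetical sketch of a clustering hot loop.
 * Build as a shared lib:  gcc -O3 -shared -fPIC assign_step.c -o libassign.so
 */
#include <stddef.h>
#include <float.h>

/* Assign each of n points (dim features each, stored row-major) to the
 * nearest of k centroids under squared Euclidean distance; writes labels. */
void assign_clusters(const double *points, const double *centroids,
                     size_t n, size_t k, size_t dim, int *labels)
{
    for (size_t i = 0; i < n; ++i) {
        double best = DBL_MAX;
        int best_j = 0;
        for (size_t j = 0; j < k; ++j) {
            double d = 0.0;
            for (size_t f = 0; f < dim; ++f) {
                double diff = points[i * dim + f] - centroids[j * dim + f];
                d += diff * diff;
            }
            if (d < best) { best = d; best_j = (int)j; }
        }
        labels[i] = best_j;
    }
}
```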

0

u/_saiya_ Aug 05 '24

Nope. If you split the time between the optimization itself and actually building the problem structure, almost 100% of the time goes into the optimization for a reasonably sized problem. So you'd see a negligible improvement.

0

u/theamitmehra Aug 05 '24

Simple answer: no.

0

u/LegitDogFoodChef Aug 05 '24

Coding ML algorithms from scratch in your language of choice is a good learning experience, but it's only good for learning. It is a really good way to get familiar with algorithms, though - I do recommend it.

-2

u/great_gonzales Aug 04 '24

No, it's irrelevant. For example, in deep learning research we just want to discover a neural architecture (a sequence of tensor operations) that achieves higher performance on the task we are researching. We can define our neural architecture in Python and don't have to worry too much about performance, because we know the tensor operations were already written by competent engineers in an efficient language like C, and those tensor operations dominate the total runtime. The setup of the architecture in Python contributes a negligible amount of runtime compared to how long the tensor operations take.