
Commit d7d8f81

update core2quad post
1 parent 2e23906 commit d7d8f81

File tree: 2 files changed (+20 lines, -1 line)
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+---
+layout: category-page
+category: Electronics
+description: "Model aircraft sustained with fixed wings"
+---

collections/blog/Research/_posts/2024-09-23-tensorflow-old-pc.md

Lines changed: 15 additions & 1 deletion
@@ -8,4 +8,18 @@ slider1:
 
 ---
 
-The core2quad processor line is an legendary series on processors released in 2010.
+While learning about neural networks and the basics of artificial intelligence, I tried to set up my work environment using some of the computers I had lying around. One of those computers was an old desktop with a Core 2 Quad Q9550 processor from 2008. In its time it was a very good processor, but performing modern computations on it in 2024 was not easy. This wasn't necessarily because the processor is underpowered compared to more recent ones, but rather because I had trouble installing the latest version of TensorFlow: the old Core 2 Quad lacks the instructions the software needs to run.
+
+As it turns out, each processor supports a particular set of instructions it can execute at a very basic level, and more modern processors can perform more complicated operations. Perhaps the most important of these are the vector operations in the AVX instruction set, which run vector math directly in hardware and greatly speed up certain computations. Since the old Core 2 Quad doesn't have AVX, I had to find a workaround to get TensorFlow to work. As it stood, I could install the library with pip, but that didn't mean it could run: it would throw an error whenever I tried to import it in a Python script.
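
As a minimal sketch (my own illustration, not part of the commit), the snippet below checks on Linux whether the CPU advertises the AVX flag and then attempts the import; the exact failure mode on a non-AVX CPU is an assumption about typical stock builds:

```python
# Minimal sketch, Linux only: read the CPU flags from /proc/cpuinfo and see
# whether AVX is advertised before trying to import TensorFlow.
flags = set()
with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())
            break

print("AVX supported: ", "avx" in flags)
print("AVX2 supported:", "avx2" in flags)

try:
    # Stock pip builds of TensorFlow expect AVX; on a Core 2 Quad the import
    # typically fails, and it may even abort the interpreter with an
    # illegal-instruction error rather than raise a catchable exception.
    import tensorflow as tf
    print("TensorFlow", tf.__version__, "imported fine")
except Exception as exc:
    print("TensorFlow failed to import:", exc)
```
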
+
+Since that didn't work, I had to find another way to get the library to at least load. According to some Google searches, one solution is to compile TensorFlow from source, optimized for the instructions your processor actually supports. If you are successful, you get a library that only works on your computer but is tuned to its characteristics. I thought to myself, "Hey, cool! If I can get this to work, I'll have a faster library than anything I could download through the terminal." Unfortunately, just gathering the dependencies needed to compile TensorFlow turned out to be complicated, and once I finally got the build started, it failed halfway through. After seeing the stream of errors, I thought, "Oh, crap, now what?"
+
+Well, it turns out this isn't the only way to get TensorFlow working. An alternative is to download the library from someone who has already compiled it successfully. These packages come in WHL (wheel) format, which pip can install from the terminal like any other package, except the file comes from your local disk instead of being downloaded. That sounds good in theory, but in practice it's somewhat difficult to find the particular file that works on your hardware. The problem is two-fold: first, the build has to be compatible with your processor, and at the same time, if you want to use your graphics card, it also has to support your specific device. Unfortunately, the pre-compiled builds that matched my processor often didn't include support for the graphics card I was using.
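
For illustration only (the wheel filename below is a placeholder, not a real release, and you would normally just run pip straight from the terminal), installing one of these local wheels and checking what it can see looks roughly like this:

```python
# Rough sketch: install a locally downloaded TensorFlow wheel with pip, then
# verify the import and whether the build can see a GPU.
import subprocess
import sys

wheel_path = "tensorflow-2.10.0-cp39-cp39-linux_x86_64.whl"  # hypothetical local file

# Equivalent to running "pip install <file>.whl" in the terminal.
subprocess.check_call([sys.executable, "-m", "pip", "install", wheel_path])

import tensorflow as tf
print("Installed TensorFlow", tf.__version__)
# An empty list means this build (or the driver/CUDA setup) cannot use the GPU.
print("Visible GPUs:", tf.config.list_physical_devices("GPU"))
```
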
+
+Eventually, after browsing around the internet, I found some GitHub repositories containing these WHL files, which I could install to get TensorFlow working. They weren't the most recent versions of TensorFlow, but at least they worked. Unfortunately, the hardware compatibility problem ran deep: while I could get a version of TensorFlow that worked with the processor, finding one that also worked with my graphics card, a GTX 750, was another challenge entirely. Granted, this graphics card is old and not particularly noteworthy, but it is still worthwhile for neural network computations. Being an NVIDIA card, it supports CUDA, which matters because TensorFlow is largely built around CUDA, and GPU acceleration greatly speeds up the training of any neural network.
+
+Unfortunately, after a while of experimenting with different installation files, I wasn't able to get the graphics card working with TensorFlow. Still, I managed to get relatively modern versions running on the CPU. It was a bit disappointing, because it would have been nice to use TensorFlow with all of my hardware, but it was better than not being able to use it at all. In any case, I wasn't too worried about getting TensorFlow to work because I had already installed PyTorch successfully. PyTorch worked just fine out of the box: I didn't have to fiddle with any specially compiled files, and my graphics card worked without any issues. In fact, I was even able to use Keras with the PyTorch backend, and that all worked seamlessly. So for this kind of old hardware, especially for neural network training, I think it's better to use PyTorch. You can get TensorFlow to work, but it seems much more opinionated about the hardware it will use and is generally messier to deal with.
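
For reference, here is a minimal sketch of the Keras-on-PyTorch setup mentioned above (not the exact code from the post; it assumes Keras 3 and a recent PyTorch are installed, and the toy model is arbitrary):

```python
# Minimal sketch: run Keras on the PyTorch backend and check GPU availability.
import os
os.environ["KERAS_BACKEND"] = "torch"  # must be set before importing keras

import torch
import keras

print("Keras backend in use:", keras.backend.backend())
print("CUDA available to PyTorch:", torch.cuda.is_available())

# A tiny model just to confirm the backend runs end to end.
model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```
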
+
+So, would I recommend using very old hardware, 15 years or so past its release, to train neural networks? If the projects are simple, then yes, why not? If the hardware works, it's a fine use of the equipment and better than letting it sit there and rot. Obviously, if you are dealing with very large and complex networks that need training, it's not the right tool. But for small pet projects, it's a good way to make use of equipment that still has some useful life left. It works! Granted, this is a bit of a strange use case, using really old equipment for modern processing tasks, but it's fairly similar to trying to use a modern low-powered laptop for these kinds of computations.
+
+Many laptop processors, especially the low-end variants, do not offer the same instruction set as desktop processors of their generation, so they suffer from the same problem of not being able to run stock TensorFlow because of compatibility issues. In those cases, installing TensorFlow from a pre-compiled WHL file is a valid option. In fact, the CPU-only builds that don't use the graphics card are well suited to laptops, because many laptops have no dedicated graphics card at all; they rely on integrated graphics, which are often incompatible with TensorFlow. So a GPU-enabled build doesn't buy you anything if you don't have a compatible graphics card in the first place.
