- Mastering Concurrency in Python
- Quan Nguyen
The present
In the present day, where explosive growth of the internet and data sharing happens every second, concurrency is more important than ever. The current practice of concurrent programming emphasizes correctness, performance, and robustness.
Some concurrent systems, such as operating systems or database management systems, are designed to operate indefinitely, recover automatically from failure, and never terminate unexpectedly. As mentioned previously, concurrent systems use shared resources, and thus require some form of semaphore in their implementation to control and coordinate access to those resources.
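As a minimal sketch of that idea (not code from this chapter), Python's `threading.Semaphore` can limit how many threads touch a shared resource at once; the worker function, the limit of three slots, and the simulated workload below are placeholders chosen for illustration:

```python
import threading
import time

# Allow at most three threads to use the shared resource at a time
# (the limit of 3 is an arbitrary choice for this sketch).
resource_semaphore = threading.Semaphore(3)

def use_shared_resource(worker_id):
    with resource_semaphore:       # blocks until a slot is free
        print(f"Worker {worker_id} acquired the resource")
        time.sleep(0.1)            # simulate some work on the shared resource
    print(f"Worker {worker_id} released the resource")

threads = [threading.Thread(target=use_shared_resource, args=(i,)) for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```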
Concurrent programming is quite ubiquitous in the field of software development. Following are a few examples where concurrency is present:
- Concurrency plays an important role in most common programming languages: C++, C#, Erlang, Go, Java, Julia, JavaScript, Perl, Python, Ruby, Scala, and so on.
- Again, since almost every computer today has more than one core in its CPU, desktop applications need to be able to take advantage of that computing power, in order to provide truly well-designed software.
- The iPhone 4S, which was released in 2011, already had a dual-core CPU, so mobile development also has to keep concurrent applications in mind.
- As for video games, two of the biggest players on the current market are the Xbox 360, which is a multi-CPU system, and Sony's PS3, which is essentially a multicore system.
- Even the current iteration of the $35 Raspberry Pi is built around a quad-core system.
- It is estimated that on average, Google processes over 40,000 search queries every second, which equates to over 3.5 billion searches per day, and 1.2 trillion searches per year, worldwide. Beyond relying on massive machines with incredible processing power, concurrency is the best way to handle that volume of requests.
A large percentage of today's data and applications are stored in the cloud. Since computing instances on the cloud are relatively small in size, almost every web application is therefore forced to be concurrent, processing different small jobs simultaneously. As it gains more customers and has to process more requests, a well-designed web application can simply utilize more servers while keeping the same logic; this corresponds to the property of robustness that we mentioned earlier.
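As a hedged sketch of what "processing different small jobs simultaneously" can look like on a single small instance (the `handle_request` coroutine, the delay, and the number of jobs are made up for illustration), an `asyncio` event loop overlaps many I/O-bound requests, and the same logic can simply be run on more servers as traffic grows:

```python
import asyncio

async def handle_request(request_id: int) -> str:
    # Simulate a small I/O-bound job, e.g. a database or API call.
    await asyncio.sleep(0.1)
    return f"request {request_id} done"

async def main():
    # Overlap many small jobs on one instance; additional instances
    # can run the same logic to scale horizontally.
    results = await asyncio.gather(*(handle_request(i) for i in range(20)))
    print(results)

asyncio.run(main())
```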
Even in the increasingly popular fields of artificial intelligence and data science, major advances have been made, in part due to the availability of high-end graphics cards (GPUs), which are used as parallel computing engines. In every notable competition on the biggest data science website (https://www.kaggle.com/), almost all prize-winning solutions feature some form of GPU usage during the training process. With the sheer amount of data that big data models have to comb through, concurrency provides an effective solution. Some AI algorithms are even designed to break their input data down into smaller portions and process them independently, which is a perfect opportunity to apply concurrency in order to achieve better model-training time.
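As a minimal sketch of that divide-and-process pattern (the `process_chunk` function, the chunk size, and the toy data are stand-ins, not an algorithm from this book), Python's `multiprocessing.Pool` can process independent portions of the input in parallel:

```python
from multiprocessing import Pool

def process_chunk(chunk):
    # Stand-in for per-chunk work, e.g. feature extraction on one
    # portion of a training set.
    return [x * x for x in chunk]

if __name__ == "__main__":
    data = list(range(1_000))
    # Break the input down into smaller, independent portions.
    chunk_size = 100
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with Pool() as pool:
        # Each chunk is processed independently in a worker process.
        processed = pool.map(process_chunk, chunks)
    print(sum(len(c) for c in processed))
```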