Learning Python requires the following preparation: 1. become proficient with Python's development environment and core programming knowledge; 2. use Python's object-oriented features skillfully for program development; 3. gain an in-depth understanding of Python's core libraries and components.
Preparation required to learn Python:
Stage 1: Professional core foundations
Stage goals:
1. Be proficient with the Python development environment and core programming knowledge
2. Proficient in using Python object-oriented knowledge for program development
3. Have an in-depth understanding of Python’s core libraries and components
4. Proficient in using SQL statements to perform common database operations
5. Proficient in using Linux operating system commands and environment configuration
6. Proficient in using MySQL and mastering advanced database operations
7. Be able to comprehensively apply the knowledge learned to complete a project
Knowledge points:
Python programming basics, Python object-oriented programming, advanced Python, the MySQL database, and the Linux operating system.
1. Python programming basics: syntax rules, functions and parameters, data types, modules and packages, and file I/O. Build a solid foundation in Python programming and become proficient with Python's core objects and standard library.
2. Python object-oriented programming: core objects, exception handling, multithreading, and network programming. Gain a deep understanding of object-oriented programming, the exception handling mechanism, multithreading principles, and network protocols, and apply them skillfully in projects.
3. Advanced Python: how classes work, metaclasses, double-underscore special (magic) methods, recursion, reflection, iterators, decorators, unittest, and Mock. Understand the underlying principles of object orientation, master advanced Python development techniques, and learn unit testing (see the first sketch after this list).
4. Database knowledge: normal forms, MySQL configuration and commands, creating databases and tables, inserting, deleting, updating, and querying data, constraints, views, stored procedures, functions, triggers, transactions, cursors, and PDBC. Gain a thorough understanding of relational database management systems and of using and administering MySQL, laying a solid foundation for Python back-end development (see the database sketch after this list).
5. Linux: installation and configuration, file and directory operations, vi commands, system management, users and permissions, environment configuration, Docker, and shell programming. Linux is the mainstream server operating system, a key technology that every development engineer must master and use skillfully.
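As a quick illustration of the Python topics in items 1 to 3, here is a minimal, self-contained sketch; the Account class and log_calls decorator are invented for illustration and show a magic method, exception handling, and a decorator working together.

```python
import functools


def log_calls(func):
    """A minimal decorator that prints each call before delegating."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__} with {args} {kwargs}")
        return func(*args, **kwargs)
    return wrapper


class Account:
    """A small class showing __init__, __repr__ and exception handling."""

    def __init__(self, owner, balance=0):
        self.owner = owner
        self.balance = balance

    def __repr__(self):          # magic method: unambiguous text form
        return f"Account({self.owner!r}, balance={self.balance})"

    @log_calls
    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance


if __name__ == "__main__":
    acct = Account("alice", 100)
    print(acct)                  # uses __repr__
    acct.withdraw(30)            # the decorator logs the call
    try:
        acct.withdraw(500)
    except ValueError as exc:    # exception handling from item 2
        print("handled:", exc)
```

Running the script prints the object's repr, the decorated call log, and then the handled exception message.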
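For the PDBC topic in item 4, the sketch below shows a parameterized query and a transactional update from Python. It assumes the third-party PyMySQL driver is installed; the connection settings and the products table are placeholders, not part of the original curriculum.

```python
import pymysql  # third-party driver: pip install pymysql

# Connection parameters are placeholders; adjust to your own MySQL instance.
conn = pymysql.connect(host="localhost", user="dev",
                       password="secret", database="shop")
try:
    with conn.cursor() as cur:
        # Parameterized query: the driver escapes values, avoiding SQL injection.
        cur.execute("SELECT id, name, price FROM products WHERE price < %s", (100,))
        for row in cur.fetchall():
            print(row)

        # Updates run inside a transaction until commit() is called.
        cur.execute("UPDATE products SET price = price * 0.9 WHERE id = %s", (1,))
    conn.commit()
finally:
    conn.close()
```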
Stage 2: Python Web development
Stage goals:
1. Proficient in Web front-end development technology, HTML, CSS, JavaScript and front-end frameworks
2. In-depth understanding of the front-end and back-end interaction processes and communication protocols in the Web system
3. Proficient in using the Web front-end and mainstream frameworks such as Django and Flask to complete Web system development
4. Have an in-depth understanding of network protocols, distributed systems, PDBC, AJAX, JSON, and related topics
5. Be able to use the knowledge learned to develop a MiniWeb framework and master the principles of framework implementation
6. Use a Web development framework to implement the project that runs through this stage
Knowledge points:
Web front-end programming, advanced Web front-end topics, the Django framework, the Flask framework, and a practical Web development project.
1. Web front-end development: page elements and layout, CSS styles, the box model, JavaScript, jQuery, and Bootstrap. Master the jQuery and Bootstrap front-end frameworks and complete page layout and styling.
2. Front-end frameworks and communication: Vue, JSON data, network communication protocols, and interaction between the Web server and the front end. Become proficient with the Vue framework, gain a deep understanding of the HTTP protocol, and use Swagger and AJAX to implement front-end/back-end interaction.
3. Django: building a custom Web framework, basic use of Django, Model attributes and back-end configuration, Cookies and Sessions, templates, the ORM data model, Redis second-level caching, RESTful APIs, and the MVC pattern. Master Django's commonly used APIs, integrate front-end technology, and develop a complete Web system and framework (a minimal Django sketch follows this list).
4. Flask: installation and configuration, app object initialization and configuration, view-function routing, the Request object, the abort function, custom errors, view-function return values, the Flask context and request hooks, templates, the Flask-SQLAlchemy database extension, the Flask-Migrate migration extension, and the Flask-Mail email extension. Master Flask's common APIs and its similarities to and differences from Django, and independently develop a complete Web system (a minimal Flask sketch also follows this list).
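A hedged sketch of the Django pieces named in item 3: an ORM model, a JSON view, and a URL route. It assumes an already configured Django project, and the Article model is invented for illustration.

```python
# models.py -- a minimal ORM model (assumes a configured Django project)
from django.db import models


class Article(models.Model):
    title = models.CharField(max_length=200)
    created = models.DateTimeField(auto_now_add=True)


# views.py -- a view that reads the ORM and returns JSON
from django.http import JsonResponse


def article_list(request):
    data = list(Article.objects.values("id", "title"))
    return JsonResponse({"articles": data})


# urls.py -- route the view
from django.urls import path

urlpatterns = [
    path("articles/", article_list),
]
```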
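And a comparable sketch of the Flask APIs named in item 4: app initialization, view-function routing, the Request object, and abort. The in-memory BOOKS dictionary is a stand-in for a real table that Flask-SQLAlchemy would manage.

```python
from flask import Flask, jsonify, request, abort

app = Flask(__name__)

# In-memory stand-in for a database table; a real app would use Flask-SQLAlchemy.
BOOKS = {1: {"id": 1, "title": "Fluent Python"}}


@app.route("/books/<int:book_id>")
def get_book(book_id):
    book = BOOKS.get(book_id)
    if book is None:
        abort(404)                    # the abort function mentioned above
    return jsonify(book)


@app.route("/books", methods=["POST"])
def add_book():
    payload = request.get_json()      # the Request object mentioned above
    new_id = max(BOOKS) + 1
    BOOKS[new_id] = {"id": new_id, "title": payload["title"]}
    return jsonify(BOOKS[new_id]), 201


if __name__ == "__main__":
    app.run(debug=True)
```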
Stage 3: Web crawling and data analysis
Stage goals:
1. Be proficient in how crawlers work and in common network packet capture tools, and be able to capture and analyze HTTP and HTTPS traffic
2. Be proficient with common web page parsing libraries for parsing and extracting crawled content
3. Be proficient with common anti-crawling mechanisms and countermeasures, and be able to handle typical anti-crawling measures
4. Be proficient in using the commercial-grade crawler framework Scrapy to write large-scale web crawlers for distributed content crawling
5. Master the concepts and workflow of data analysis
6. Be proficient with the mainstream data analysis tools Numpy, Pandas, and Matplotlib
7. Be proficient in data cleaning, organization, format conversion, and writing data analysis reports
8. Be able to combine these skills to crawl Douban movie review data and complete an end-to-end data analysis project
Knowledge points:
Web crawler development, Numpy for data analysis, Pandas for data analysis, and Matplotlib for visualization.
1. Web crawler development: how page crawling works, the crawling workflow, the page parsing tools lxml and BeautifulSoup, regular expressions, writing and architecting a proxy pool, common anti-crawling measures and how to counter them, crawler framework structure, and the commercial-grade crawler framework Scrapy. Building on an understanding of crawling principles, website data flows, and network protocols, master the web page parsing tools, respond flexibly to the anti-crawling strategies of most websites, independently build a crawler framework, and write distributed crawlers with a large-scale crawler framework (a minimal fetch-and-parse sketch follows this list).
2. Data analysis with Numpy: characteristics of the ndarray data structure, the data types Numpy supports, built-in array creation methods, arithmetic operators, matrix products, increment and decrement operations, universal and aggregate functions, slicing and indexing, and the vectorization and broadcasting mechanisms of ndarray. Become familiar with Numpy, one of the three core data analysis tools, with the characteristics and common operations of ndarray, and with slicing, indexing, and matrix operations on arrays of different dimensions.
3. Data analysis with Pandas: the three main data structures, including the basic concepts and uses of DataFrame, Series, and Index objects, replacing and deleting index entries, arithmetic and data alignment, data cleaning and regularization, and structure conversion. Become familiar with Pandas, one of the three core data analysis tools, use its three main data objects fluently, carry out the data cleaning, format conversion, and regularization work at the heart of data analysis, and read and write files with Pandas.
4. Data visualization with Matplotlib: the three-layer structure of Matplotlib; drawing common chart types such as line charts, bar charts, stacked bar charts, and pie charts; adding legends, text, and annotations; and saving visualizations to files. Become familiar with Matplotlib, one of the three core data analysis tools, and draw common data analysis charts proficiently. Combine the data analysis and visualization tools taught in the course to complete full-scale practical projects such as stock market data analysis and prediction, analysis of bike-sharing user groups, and analysis of the global happiness index (a small Numpy/Pandas/Matplotlib sketch follows this list).
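A minimal fetch-and-parse sketch for item 1, using requests and BeautifulSoup. The URL and selector are placeholders; a real crawler would also respect robots.txt, throttle requests, and handle the anti-crawling measures described above, which Scrapy helps automate at scale.

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

# Placeholder URL; a real crawler should check robots.txt and throttle requests.
url = "https://example.com/"
headers = {"User-Agent": "Mozilla/5.0 (learning-crawler)"}  # basic anti-anti-crawl step

response = requests.get(url, headers=headers, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")
for link in soup.select("a"):                 # CSS-selector based parsing
    print(link.get_text(strip=True), link.get("href"))
```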
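A small end-to-end sketch tying together items 2 to 4: Numpy broadcasting, Pandas cleaning and aggregation, and a Matplotlib chart saved to a file. The score data is made up purely for illustration.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Numpy: vectorized arithmetic and broadcasting on an ndarray.
scores = np.array([[80, 90, 70], [60, 85, 95]], dtype=float)
curved = scores * 1.05                      # broadcasting a scalar

# Pandas: wrap the array in a DataFrame, clean it, and aggregate.
df = pd.DataFrame(curved, columns=["math", "python", "sql"],
                  index=["alice", "bob"])
df.loc["bob", "sql"] = np.nan               # simulate a missing value
df = df.fillna(df.mean())                   # simple data cleaning
print(df.describe())

# Matplotlib: a bar chart of the per-subject averages.
df.mean().plot(kind="bar", title="Average score per subject")
plt.ylabel("score")
plt.tight_layout()
plt.savefig("averages.png")                 # save the visualization to a file
```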
Stage 4: Machine learning and artificial intelligence
Stage goals:
1. Understand the basic concepts of machine learning and the overall workflow of a machine learning system
2. Skillfully apply common machine learning models to supervised and unsupervised learning, to training and testing, and to regression and classification problems
3. Be proficient with common classification and regression algorithm models such as KNN, decision trees, random forests, and K-Means
4. Master how convolutional neural networks handle image recognition and natural language problems, and be familiar with tensors, sessions, and gradient-based optimization in the deep learning framework TensorFlow
5. Master how convolutional neural networks operate, and be able to customize convolution, pooling, and fully connected layers to complete common practical deep learning projects such as image recognition, handwriting recognition, and verification code recognition
Knowledge points:
1. Machine learning: common machine learning algorithms, using the sklearn datasets, dictionary and text feature extraction, normalization, standardization, principal component analysis (PCA), the KNN algorithm, decision trees, random forests, and linear and logistic regression models. Become familiar with the basic concepts and workflow of machine learning and with feature engineering, and use common algorithm models to solve classification, regression, clustering, and other problems (a short sklearn sketch follows this list).
2. Deep learning: basic concepts of Tensorflow, the TF data flow graph, sessions, tensors, TensorBoard visualization, tensor manipulation, reading files in TF, using TensorFlow Playground, neural network structure, convolution computation, activation functions, and pooling layer design. Master the differences and connections between machine learning and deep learning, the deep learning workflow, the structure and characteristics of neural networks, and the use of tensors, graph structures, and OP objects. Design input, convolution, pooling, and fully connected layers, and complete the whole process of common deep learning projects such as verification code recognition, image recognition, and handwriting recognition (a small CNN sketch, using the current Keras API, follows this list).
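A compact sketch of the scikit-learn workflow from item 1, using the library's built-in iris dataset: train/test split, standardization, and a KNN classifier.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

# Load a built-in sklearn dataset and split it into train/test sets.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# Standardization (a feature engineering step mentioned above).
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# KNN classifier: fit on the training set, score on the test set.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print("test accuracy:", knn.score(X_test, y_test))
```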
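Item 2 describes TF1-style graphs and sessions; as a hedged sketch, the same convolution/pooling/fully connected structure looks like this in the current TensorFlow 2 Keras API (the layer sizes are illustrative, sized for 28x28 grayscale handwriting-style input).

```python
from tensorflow.keras import layers, models

# A small CNN for 28x28 grayscale images (e.g. handwriting or verification code digits).
model = models.Sequential([
    layers.Conv2D(32, kernel_size=3, activation="relu",
                  input_shape=(28, 28, 1)),               # convolution layer
    layers.MaxPooling2D(pool_size=2),                     # pooling layer
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),                 # fully connected layer
    layers.Dense(10, activation="softmax"),               # 10 output classes
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

# Training on real data would look like:
# model.fit(x_train, y_train, epochs=5, validation_split=0.1)
```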