Harness the Power of Web Scraping with Python and Beautiful Soup: A MIDI Music Example
The internet is a treasure trove of information, but accessing it programmatically can be challenging without dedicated APIs. Python's Beautiful Soup library offers a powerful solution, enabling you to scrape and parse data directly from web pages.
Let's explore this by scraping MIDI data to train a Magenta neural network for generating classic Nintendo-style music. We'll source MIDI files from the Video Game Music Archive (VGM).
Setting Up Your Environment
Ensure you have Python 3 and pip installed. Before installing dependencies, create and activate a virtual environment, then install the two libraries:

python3 -m venv venv
source venv/bin/activate
pip install requests==2.22.0 beautifulsoup4==4.8.1
We use Beautiful Soup 4 (Beautiful Soup 3 is no longer maintained).
Scraping and Parsing with Requests and Beautiful Soup
First, let's fetch the HTML and create a BeautifulSoup object:
import requests
from bs4 import BeautifulSoup

vgm_url = 'https://www.vgmusic.com/music/console/nintendo/nes/'
html_text = requests.get(vgm_url).text
soup = BeautifulSoup(html_text, 'html.parser')
The soup object allows navigation of the HTML. soup.title gives the page title, and print(soup.get_text()) displays all of the page's text.
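As a quick illustration, here is the same navigation against a tiny, made-up HTML snippet (not the VGM page):

```python
from bs4 import BeautifulSoup

# A small, hypothetical HTML document used only for illustration
html = '<html><head><title>NES Music</title></head><body><p>Hello</p></body></html>'
soup = BeautifulSoup(html, 'html.parser')

print(soup.title)         # <title>NES Music</title>
print(soup.title.string)  # NES Music
print(soup.get_text())
```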
Mastering Beautiful Soup's Power
The find() and find_all() methods are essential. soup.find() targets a single element (e.g., soup.find(id='banner_ad').text gets the text of the banner ad). soup.find_all() returns every matching element. For instance, this prints all hyperlink URLs:
for link in soup.find_all('a'):
    print(link.get('href'))
find_all() also accepts arguments such as regular expressions or tag attributes for precise filtering. Refer to the Beautiful Soup documentation for advanced features.
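For example, find_all() can take a compiled regular expression for the href attribute. A sketch with made-up links standing in for the archive's track list:

```python
import re
from bs4 import BeautifulSoup

# Hypothetical links used only for illustration
html = ('<a href="mario.mid">Mario</a>'
        '<a href="zelda.txt">Zelda</a>'
        '<a href="kirby.mid">Kirby</a>')
soup = BeautifulSoup(html, 'html.parser')

# Keep only anchors whose href ends in .mid
mid_links = soup.find_all('a', href=re.compile(r'\.mid$'))
print([a['href'] for a in mid_links])  # ['mario.mid', 'kirby.mid']
```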
Navigating and Parsing HTML
Before writing parsing code, examine the browser-rendered HTML. Each webpage is unique; data extraction often requires creativity and experimentation.
Our goal is to download unique MIDI files, excluding duplicates and remixes. Browser developer tools (right-click, "Inspect") help identify HTML elements for programmatic access.
Let's use find_all() with regular expressions to filter for links to MIDI files whose names contain no parentheses, since the site uses parentheses to mark duplicates and remixes. Create nes_midi_scraper.py:
import re
import requests
from bs4 import BeautifulSoup

vgm_url = 'https://www.vgmusic.com/music/console/nintendo/nes/'
html_text = requests.get(vgm_url).text
soup = BeautifulSoup(html_text, 'html.parser')

if __name__ == '__main__':
    attrs = {'href': re.compile(r'\.mid$')}
    tracks = soup.find_all('a', attrs=attrs, string=re.compile(r'^((?!\().)*$'))

    count = 0
    for track in tracks:
        print(track)
        count += 1
    print(len(tracks))
This filters the MIDI files, prints their link tags, and displays the total count. Run it with python nes_midi_scraper.py.
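The second regular expression, r'^((?!\().)*$', uses a negative lookahead so it matches only strings containing no opening parenthesis, which is how the duplicate and remix entries get filtered out. A quick check:

```python
import re

# Matches only strings that contain no '(' anywhere
no_parens = re.compile(r'^((?!\().)*$')

print(bool(no_parens.match('Super Mario Bros. Theme')))  # True
print(bool(no_parens.match('Theme (remix)')))            # False
```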
Downloading the MIDI Files
Now, let's download the filtered MIDI files by adding a download_track function to nes_midi_scraper.py. A minimal version, which prefixes each filename with its track number so every saved file is unique (the exact naming scheme here is an illustrative choice), might look like:

def download_track(count, track_element):
    # Prefix with the track count so every saved file gets a unique name
    track_name = str(count) + '_' + track_element['href'].replace('/', '')
    download_url = '{}{}'.format(vgm_url, track_element['href'])
    file_response = requests.get(download_url)
    with open(track_name, 'wb') as f:
        f.write(file_response.content)

Call download_track(count, track) inside the for loop in the __main__ block.
This function downloads each track and saves it with a unique filename. Run the script from your desired save directory. You should download approximately 2230 MIDI files (depending on the website's current content).
Exploring the Web's Potential
Web scraping opens doors to vast datasets. Remember that webpage changes can break your code; keep your scripts updated. Use libraries like Mido (for MIDI data processing) and Magenta (for neural network training) to build upon this foundation.
The above is the detailed content of Web Scraping and Parsing HTML in Python with Beautiful Soup. For more information, please follow other related articles on the PHP Chinese website!

