This article gives a brief introduction to sharing resources between threads and to the locking mechanisms commonly used in multithreaded programming.
In multithreaded programming, threads frequently need to share resources. The usual ways to share them are:
global variables (global)
queues (from queue import Queue)
The locking mechanisms commonly used to protect shared resources are Lock, RLock, Condition, and Semaphore; each is shown below.
The first example shares a global variable between two threads and protects every update with a Lock:

from threading import Thread, Lock

lock = Lock()
total = 0
'''Without the lock the final value is not guaranteed to be 0. Note that a
plain Lock cannot be acquired twice in a row by the same thread -- doing so
causes a deadlock!'''

def add():
    global total
    global lock
    for i in range(1000000):
        lock.acquire()
        total += 1
        lock.release()

def sub():
    global total
    global lock
    for i in range(1000000):
        lock.acquire()
        total -= 1
        lock.release()

thread1 = Thread(target=add)
thread2 = Thread(target=sub)
# To make thread1 and thread2 daemon threads (killed when the main thread
# exits), uncomment the two lines below:
# thread1.daemon = True
# thread2.daemon = True
# start the threads
thread1.start()
thread2.start()
# block until both threads have finished, so that total is only read afterwards
thread1.join()
thread2.join()
print(total)
The same kind of data exchange can go through a queue instead; Queue is itself thread-safe, so no explicit Lock is required:

from threading import Thread
from queue import Queue

def add(q):
    q.put(1)            # put() blocks on its own if the queue is full

def sub(q):
    recv = q.get()      # get() blocks on its own until an item is available
    print(recv)
    q.task_done()

if __name__ == '__main__':
    # the queue buffers at most 3 items; Queue is thread-safe, so no Lock is needed
    qu = Queue(3)
    thread1 = Thread(target=add, args=(qu,))
    thread2 = Thread(target=sub, args=(qu,))
    thread1.start()
    thread2.start()
    # block until every item that was put() has been marked task_done()
    qu.join()
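Because put() blocks when the queue is full and get() blocks when it is empty, a Queue can also drive a longer-running producer/consumer pair without any lock. The following is only a minimal sketch of that idea; the producer/consumer names, the item count, and the None sentinel are my own illustration, not part of the original article:

from queue import Queue
from threading import Thread

def producer(q):
    for i in range(10):
        q.put(i)          # put() blocks by itself when the queue is full
    q.put(None)           # sentinel value telling the consumer to stop

def consumer(q):
    while True:
        item = q.get()    # get() blocks by itself when the queue is empty
        q.task_done()
        if item is None:
            break
        print("got", item)

if __name__ == '__main__':
    q = Queue(3)          # at most 3 items buffered at any moment
    Thread(target=producer, args=(q,)).start()
    Thread(target=consumer, args=(q,)).start()
    q.join()              # returns once every put() item has been task_done()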
Lock: acquiring and releasing a lock takes time, so using one reduces performance. Its basic use is already shown in the global-variable example above.
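A rough way to see that cost is to time the same loop with and without the lock. This is only an illustrative sketch; the loop size and the use of time.perf_counter are my own choices, not from the article:

import time
from threading import Lock

lock = Lock()

# plain loop
start = time.perf_counter()
counter = 0
for _ in range(1000000):
    counter += 1
plain = time.perf_counter() - start

# same loop, acquiring and releasing the lock around every update
start = time.perf_counter()
counter = 0
for _ in range(1000000):
    lock.acquire()
    counter += 1
    lock.release()
locked = time.perf_counter() - start

print("without lock: {:.3f}s, with lock: {:.3f}s".format(plain, locked))

On a typical machine the locked loop is noticeably slower, which is the trade-off for correctness when several threads update the same data.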
RLock is a re-entrant lock: the same thread can acquire it several times in a row, which makes it possible for a function that already holds the lock to call another function that also acquires it (a sketch of that nested pattern follows the example below).
from threading import Thread, RLock

lock = RLock()
total = 0

def add():
    global lock
    global total
    # RLock lets the same thread acquire the lock repeatedly, but it must be
    # released the same number of times
    for i in range(1000000):
        lock.acquire()
        lock.acquire()
        total += 1
        lock.release()
        lock.release()

def sub():
    global lock
    global total
    for i in range(1000000):
        lock.acquire()
        total -= 1
        lock.release()

thread1 = Thread(target=add)
thread2 = Thread(target=sub)
thread1.start()
thread2.start()
thread1.join()
thread2.join()
print(total)
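The nested-call pattern mentioned above might look like the following minimal sketch; the balance/withdraw names are purely illustrative and not from the original article:

from threading import RLock

rlock = RLock()
balance = 100

def can_withdraw(amount):
    with rlock:                  # second acquisition by the same thread: fine for RLock
        return balance >= amount

def withdraw(amount):
    global balance
    with rlock:                  # first acquisition
        if can_withdraw(amount): # calls another function that takes the same lock
            balance -= amount

withdraw(30)
print(balance)   # prints 70; with a plain Lock the nested acquire would deadlock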
Condition: the wait() method releases the lock and then blocks until another thread wakes it up by calling notify() or notify_all(); once woken, wait() re-acquires the lock and returns. A timeout can also be passed to wait().
Start the function that calls wait() first, so that it is already blocked waiting for the signal, and only then start the function that calls notify().
from threading import Thread, Condition

'''The two threads take turns, producing this dialogue:
Peaple1 : How are you?
Peaple2 : I`m fine, thank you!
Peaple1 : What`s your job?
Peaple2 : My job is teacher.
'''

def Peaple1(condition):
    with condition:
        print('Peaple1 : ', 'How are you?')
        condition.notify()
        condition.wait()
        print('Peaple1 : ', 'What`s your job?')
        condition.notify()
        condition.wait()

def Peaple2(condition):
    with condition:
        condition.wait()
        print('Peaple2 : ', 'I`m fine, thank you!')
        condition.notify()
        condition.wait()
        print('Peaple2 : ', 'My job is teacher.')
        condition.notify()

if __name__ == '__main__':
    cond = Condition()
    thread1 = Thread(target=Peaple1, args=(cond,))
    thread2 = Thread(target=Peaple2, args=(cond,))
    # thread2 must start before thread1: a notify() is only useful if some
    # thread is already waiting; if thread1 ran first, its notify() would be
    # lost and both threads would block forever in wait().
    thread2.start()
    thread1.start()
    # thread1.join()
    # thread2.join()
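As noted above, wait() also accepts a timeout and returns False if no notify() arrives in time. A minimal sketch of just that return value, single-threaded on purpose and my own illustration rather than code from the article:

from threading import Condition

cond = Condition()
with cond:
    # no other thread will ever call notify(), so wait() gives up after
    # one second and returns False
    signalled = cond.wait(timeout=1.0)
print("signalled:", signalled)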
# Semaphore is a lock that limits how many threads may enter at the same time.
# Typical file example: only one thread is allowed to write, while several
# threads may be allowed to read.
import threading
import time

class HtmlSpider(threading.Thread):
    def __init__(self, url, sem):
        super().__init__()
        self.url = url
        self.sem = sem

    def run(self):
        time.sleep(2)
        print("Download {html} success\n".format(html=self.url))
        self.sem.release()       # free one slot once this download is done

class UrlProducer(threading.Thread):
    def __init__(self, sem):
        super().__init__()
        self.sem = sem

    def run(self):
        for i in range(20):
            self.sem.acquire()   # blocks while 3 downloads are already running
            html_thread = HtmlSpider("https://www.baidu.com/{}".format(i), self.sem)
            html_thread.start()

if __name__ == "__main__":
    # the semaphore lets at most 3 threads hold it at once, so the downloads
    # finish in batches of 3
    sem = threading.Semaphore(3)
    url_producer = UrlProducer(sem)
    url_producer.start()
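In the example above the semaphore is acquired in UrlProducer and released in HtmlSpider. A Semaphore also supports the with statement, so acquire and release can live in the same function. The following is a minimal sketch of that variant (the download function is my own illustration); unlike the original it creates all worker threads up front and throttles only the work itself:

import threading
import time

sem = threading.Semaphore(3)     # at most 3 downloads in flight at once

def download(url):
    with sem:                    # acquire on entry, release automatically on exit
        time.sleep(2)
        print("Download {} success".format(url))

if __name__ == "__main__":
    for i in range(20):
        threading.Thread(target=download,
                         args=("https://www.baidu.com/{}".format(i),)).start()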