Web Scraping with Python: RuntimeError: maximum recursion depth exceeded
I'm trying to scrape data from a website that updates (here I'm using the AAPL stock price off of Yahoo Finance). The code doesn't run: I've tested every part individually, but I still get an error saying "RuntimeError: maximum recursion depth exceeded".
Thanks in advance for any troubleshooting help!
    import time
    import lxml
    import requests
    from bs4 import BeautifulSoup

    url = "http://finance.yahoo.com/q?uhb=uh3_finance_vert&fr=&type=2button&s=aapl"

    def printPriceAAPL():
        r = requests.get(url)
        soup = BeautifulSoup(r.content, "lxml")
        print(soup.find(id="yfs_l84_aapl").string)  # id of current stock price
        time.sleep(60)

    while True:
        printPriceAAPL()
EDIT: Full error:

    Traceback (most recent call last):
      File "C:/Users/sjung/Documents/printPriceAAPL.py", line 15, in <module>
        printPriceAAPL()
      File "C:/Users/sjung/Documents/printPriceAAPL.py", line 11, in printPriceAAPL
        print(soup.find_all(id="yfs_l84_aapl")[0].string)  # id of current stock price
      File "C:\Python27\lib\idlelib\rpc.py", line 595, in __call__
        value = self.sockio.remotecall(self.oid, self.name, args, kwargs)
      File "C:\Python27\lib\idlelib\rpc.py", line 210, in remotecall
        seq = self.asynccall(oid, methodname, args, kwargs)
      File "C:\Python27\lib\idlelib\rpc.py", line 225, in asynccall
        self.putmessage((seq, request))
      File "C:\Python27\lib\idlelib\rpc.py", line 324, in putmessage
        s = pickle.dumps(message)
      File "C:\Python27\lib\copy_reg.py", line 74, in _reduce_ex
        getstate = self.__getstate__
    RuntimeError: maximum recursion depth exceeded
find() searches for children recursively. So find() looks at all of soup's descendants: children, children's children, and so on. Each time the next level of children is examined, the "recursion depth" increases on the run-time stack. If you are only interested in immediate children, find() has a recursive=False argument that examines only one level down.
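For example, here is a minimal sketch of recursive=False using simplified stand-in markup (the real Yahoo page nests the price tag much more deeply, so a non-recursive search only works when started from the tag's direct parent):

    from bs4 import BeautifulSoup

    # Simplified stand-in markup; the real page is far more deeply nested.
    html = '<html><body><span id="yfs_l84_aapl">123.45</span></body></html>'
    soup = BeautifulSoup(html, "lxml")

    # recursive=False examines only body's immediate children, so the
    # search never descends further than one level of the tree.
    tag = soup.body.find(id="yfs_l84_aapl", recursive=False)
    print(tag.string)  # 123.45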
Alternatively, you can increase the recursion depth, but be careful with it. Reference this post: What is the maximum recursion depth in Python, and how to increase it?
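A minimal sketch of doing that with sys.setrecursionlimit (the value 5000 is just an illustrative choice; raising the limit too far can crash the interpreter outright instead of raising a clean exception):

    import sys

    print(sys.getrecursionlimit())  # default is typically 1000
    sys.setrecursionlimit(5000)     # arbitrary higher cap; use with care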
For more information on what the run-time stack is, look at: https://chortle.ccsu.edu/assemblytutorial/chapter-25/ass25_9.html