PYTHONPATH is an environment variable that the Python interpreter uses to search for modules. Most of the time we need to reuse code, and for that a dedicated function with a well-defined set of inputs and outputs is required.
If you need to send different types of mail (from cron tasks, from Celery, after login, after invocation of a reports API, etc.), you can simply import a send_email function and reuse it without rewriting the email script every time (importing the email module, attaching files, filling in sender and recipient addresses). And you don't have to keep an always-running API service with every function active just for this.
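A minimal sketch of such a reusable send_email helper, built only on the standard library. The SMTP host, port, and addresses here are placeholders, not the post's actual setup; adapt them (and add authentication/TLS) to your environment:

```python
# send_email.py — hypothetical reusable helper (host/port are placeholders).
import smtplib
from email.message import EmailMessage

def build_email(sender, recipient, subject, body, attachment=None):
    """Assemble an EmailMessage; attachment is an optional (bytes, filename) pair."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    if attachment is not None:
        data, filename = attachment
        # Attach raw bytes as a generic binary file.
        msg.add_attachment(data, maintype="application",
                           subtype="octet-stream", filename=filename)
    return msg

def send_email(sender, recipient, subject, body,
               attachment=None, host="localhost", port=25):
    """Build the message and hand it to an SMTP server."""
    msg = build_email(sender, recipient, subject, body, attachment)
    with smtplib.SMTP(host, port) as smtp:
        smtp.send_message(msg)
```

Any script on PYTHONPATH can then do `from send_email import send_email` instead of repeating the boilerplate.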
Store all such scripts in a separate folder and add that folder to PYTHONPATH. Next time, Python can find and import them.
Functions can be rearranged as needed. Now to the main point.
Edit your .bashrc file and add this line: export PYTHONPATH=$PYTHONPATH:/mnt/sdb1/mylibs
In this example, my personal library lives on another disk, in the mylibs folder.
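At startup the interpreter prepends every PYTHONPATH entry to sys.path, which is the list Python actually searches for imports. A short sketch of the same effect done at runtime (the directory is the example path from the export line; adjust to yours):

```python
import sys

LIB_DIR = "/mnt/sdb1/mylibs"  # example path from the export line above

# Adding the directory to sys.path at runtime has the same effect for
# this process as setting PYTHONPATH before startup.
if LIB_DIR not in sys.path:
    sys.path.insert(0, LIB_DIR)

# Any .py file in LIB_DIR is now importable by name.
```

You can also verify the variable took effect with `python -c "import sys; print(sys.path)"` after opening a new shell.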
If you are facing problems like the one above, the timeout may be a problem with your network (domain name resolution). A possible solution is to change your network's DNS to a fast resolver such as Google DNS or Cloudflare DNS.
Google DNS – 8.8.8.8, 8.8.4.4; Cloudflare DNS – 1.1.1.1
With a low-latency network and a responsive DNS resolver, this problem will be resolved, as shown in the image above.
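You can measure how much of the delay is name resolution with a short stdlib sketch (the hostname is just an example):

```python
import socket
import time

def resolve_ms(hostname):
    """Return DNS resolution latency for hostname, in milliseconds."""
    start = time.perf_counter()
    socket.gethostbyname(hostname)  # raises socket.gaierror on failure
    return (time.perf_counter() - start) * 1000.0

# Example: compare the same lookup before and after switching resolvers.
# print(resolve_ms("example.com"))
```

If this number drops sharply after switching to 8.8.8.8 or 1.1.1.1, the resolver was the bottleneck.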
This post also answers how to install IRkernel for Jupyter Notebook.
Problem: FSADeprecationWarning: SQLALCHEMY_TRACK_MODIFICATIONS adds significant overhead and will be disabled by default in the future. Set it to True or False to suppress this warning.
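The warning goes away once the option is set explicitly in the Flask config. A minimal sketch (the sqlite URI is a placeholder; False is the usual choice, since the modification-tracking feature is rarely needed and costs extra memory):

```python
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///app.db"  # placeholder URI
# Setting this explicitly (True or False) suppresses FSADeprecationWarning.
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False

db = SQLAlchemy(app)
```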
A webpage is not 100% data, unlike data-only formats (Excel sheets, JSON, XML, CSV, etc.). It has a different goal, presenting information visually, so the data is mixed with a lot of unwanted content (tags).
Every webpage has its own unique design for presenting information and carrying data. Crawlers are not smart enough to fetch all the information from a webpage, and beyond fetching the data, it is not easy to categorize and index it correctly every time. Therefore, there must be some uniform way to carry data without worrying about the HTML of the webpage.
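The contrast can be shown with a stdlib-only sketch: the same two fields arrive once as JSON and once buried in HTML markup (the markup and field names here are invented for illustration):

```python
import json
from html.parser import HTMLParser

# Data-only format: one call recovers the payload.
record = json.loads('{"title": "Report", "price": 42}')

# HTML: a parser must walk the tags and guess which text nodes are data.
class TitleExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.title = None

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_h1:
            self.title = data.strip()

parser = TitleExtractor()
parser.feed("<html><body><h1>Report</h1>"
            "<span class='price'>42</span></body></html>")
```

The JSON line works for any site that serves it; the HTML extractor breaks as soon as the page's design changes, which is exactly why a page-independent data channel (an API) is needed.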