nagooglesearch

v8.0

Not another Google searching library. Just kidding, it is.

Made for educational purposes. I hope it helps!

Future plans:

Install:

```
pip3 install nagooglesearch
```

To upgrade:

```
pip3 install --upgrade nagooglesearch
```

To build and install from source, run the following commands:

```
git clone https://github.com/ivan-sincek/nagooglesearch && cd nagooglesearch
python3 -m pip install --upgrade build
python3 -m build
python3 -m pip install dist/nagooglesearch-8.0-py3-none-any.whl
```

Defaults:
```python
nagooglesearch.GoogleClient(
    tld = "com",
    homepage_parameters = {
        "btnK": "Google+Search",
        "source": "hp"
    },
    search_parameters = {},
    user_agent = "",
    proxy = "",
    max_results = 100,
    min_sleep = 8,
    max_sleep = 18,
    debug = False
)
```

Only domains that do not contain the keyword "google" and do not end with "goo.gl" are accepted as valid results. The final output is a unique and sorted list of URLs.
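The validation rules above can be sketched as a standalone filter. This is a minimal illustration of the described behavior, not the library's actual implementation, and the helper name `filter_results` is made up:

```python
from urllib.parse import urlparse

def filter_results(urls):
    # illustrative sketch (not the library's code): drop URLs whose domain
    # contains the keyword "google" or ends with "goo.gl", then return a
    # unique, sorted list of the remaining URLs
    valid = set()
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if "google" in domain or domain.endswith("goo.gl"):
            continue
        valid.add(url)
    return sorted(valid)
```

For example, `filter_results(["https://www.google.com/x", "https://goo.gl/y", "https://sub.example.com/a"])` keeps only the `sub.example.com` entry.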
Standard example:

```python
import nagooglesearch

# the following query string parameters are set only if the 'start' query string parameter is not set or is equal to zero
# simulate a homepage search
homepage_parameters = {
    "btnK": "Google+Search",
    "source": "hp"
}

# search the internet for additional query string parameters
search_parameters = {
    "q": "site:*.example.com intext:password", # search query
    "tbs": "li:1", # specify 'li:1' for verbatim search (no alternate spellings, etc.)
    "hl": "en",
    "lr": "lang_en",
    "cr": "countryUS",
    "filter": "0", # specify '0' to display hidden results
    "safe": "images", # specify 'images' to turn off safe search, or 'active' to turn it on
    "num": "80" # number of results per page
}

client = nagooglesearch.GoogleClient(
    tld = "com", # top-level domain, e.g., www.google.com or www.google.hr
    homepage_parameters = homepage_parameters, # 'search_parameters' will override 'homepage_parameters'
    search_parameters = search_parameters,
    user_agent = "curl/3.30.1", # a random user agent is set if none is provided
    proxy = "socks5://127.0.0.1:9050", # supported URL schemes are 'http[s]', 'socks4[h]', and 'socks5[h]'
    max_results = 200, # maximum number of unique URLs to return
    min_sleep = 15, # minimum sleep time between page requests
    max_sleep = 30, # maximum sleep time between page requests
    debug = True # enable debug output
)

urls = client.search()

if client.get_error() == "REQUESTS_EXCEPTION":
    print("[ Requests Exception ]")
    # do something
elif client.get_error() == "429_TOO_MANY_REQUESTS":
    print("[ HTTP 429 Too Many Requests ]")
    # do something

for url in urls:
    print(url)
    # do something
```

If max_results is set to, e.g., 200 and num is set to 80, the maximum number of unique URLs that can actually be returned may reach 240.
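The 240 figure follows from page-level fetching: results arrive in pages of num URLs, so the client can finish the page that crosses the max_results threshold. A back-of-the-envelope check, under the assumption that whole pages are always collected:

```python
import math

max_results = 200
num = 80 # results per page

# pages needed to reach max_results, assuming whole pages are collected
pages = math.ceil(max_results / num) # 3 pages
upper_bound = pages * num
print(upper_bound) # 240
```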
Check the list of user agents here. For more user agents, check Scrapeops.io.
Shortest possible example:

```python
import nagooglesearch

urls = nagooglesearch.GoogleClient(search_parameters = {"q": "site:*.example.com intext:password"}).search()
# do something
```

Example, do not show results older than six months:
```python
import datetime, nagooglesearch, dateutil.relativedelta as relativedelta

def get_tbs(months):
    today = datetime.datetime.today()
    return nagooglesearch.get_tbs(today, today - relativedelta.relativedelta(months = months))

search_parameters = {
    "tbs": get_tbs(6)
}
# do something
```

Example, get all user agents:
```python
import nagooglesearch

user_agents = nagooglesearch.get_all_user_agents()
print(user_agents)
# do something
```

Example, get a random user agent:
```python
import nagooglesearch

user_agent = nagooglesearch.get_random_user_agent()
print(user_agent)
# do something
```
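The min_sleep and max_sleep parameters shown earlier suggest a randomized delay between page requests. A minimal stdlib sketch of that pattern, assuming a uniform random delay in the given range (the library's actual scheme may differ):

```python
import random

def pick_sleep(min_sleep = 8, max_sleep = 18):
    # pick a random delay in seconds between page requests,
    # mirroring the library's default min_sleep/max_sleep range
    return random.uniform(min_sleep, max_sleep)

delay = pick_sleep()
print(delay)
```

Randomizing the delay, rather than sleeping a fixed interval, makes request timing less predictable and is a common politeness measure when scraping.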