This article introduces ASP code that blocks search engine spiders from accessing a page. It is simple and practical; readers who need it can refer to it.
This code prevents common search engine spiders from reaching an ASP page, and it must be included on every ASP page you want to protect.
<%
' isspider: returns true when the visitor's user-agent matches a known spider.
function isspider()
    dim i, agent, searray
    agent = "agent:" & LCase(request.servervariables("http_user_agent"))
    ' keywords must be lowercase, since the user-agent string is lowercased above
    searray = array("googlebot", "baiduspider", "sogouspider", "yahoo", "sosospider")
    isspider = false
    for i = 0 to ubound(searray)
        if instr(agent, searray(i)) > 0 then isspider = true
    next
end function

' fromse: returns true when the visitor arrived via a search engine result page.
function fromse()
    dim urlrefer, searray, i
    urlrefer = "refer:" & LCase(request.ServerVariables("HTTP_REFERER"))
    fromse = false
    if urlrefer = "refer:" then fromse = false  ' empty referer: not from a search engine
    searray = array("google", "baidu", "sogou", "yahoo", "soso")
    for i = 0 to ubound(searray)
        if instr(urlrefer, searray(i)) > 0 then fromse = true
    next
end function

' Spiders are served a static image instead of the page content.
if isspider() then
    dim myfso, fileurl, filecon, myfile
    fileurl = Server.MapPath("images/bg01.gif")
    Set myfso = Server.CreateObject("Scripting.FileSystemObject")
    if myfso.FileExists(fileurl) then
        ' note: OpenTextFile reads the file as text; for binary images
        ' ADODB.Stream is more robust, but this follows the original approach
        Set myfile = myfso.OpenTextFile(fileurl, 1)
        filecon = myfile.readAll
        response.write(filecon)
        myfile.Close
        Set myfile = Nothing
        Set myfso = Nothing
        response.end
    end if
end if

' Visitors referred by a search engine get an empty response.
if fromse() then
    response.write("<br/>")
    response.end
end if
%>
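To apply this on every page, one approach is to save the snippet as its own file and pull it in with a server-side include at the top of each ASP page. This is a sketch only; the filename blockspider.asp is an assumed example, not part of the original article:

```asp
<%@ Language="VBScript" %>
<!--#include file="blockspider.asp"-->
<%
' Normal page content follows; it is only reached
' when the include did not call response.end.
response.write("Hello, regular visitor")
%>
```

Because the include runs before any page output, a spider or search-referred visitor is turned away before the real content is rendered.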
That is the entire content of this article; I hope you find it helpful.