Virtual-host website building is popular these days, and I am a webmaster with a site of my own. After nearly a year of running it, I have found that every time the site's program is upgraded, I have to visit the official website to read the announcement, download the upgrade package, decompress it locally, and then upload it to the virtual host over FTP. It is all tiring manual work, and I am lazy enough that I started daydreaming: how great it would be if the program could upgrade itself automatically. So I thought it over and wrote this article, hoping it will be helpful to WEB program developers. It only covers ASP, because ASP is all I know :-(
Let's first look at the upgrade process of a traditional Win32 program (antivirus software, for example). It relies on the software's updater connecting to a server over the network, working out what has changed, and downloading the upgrade files to the local machine.
A WEB program is a bit different because it runs on the WEB server: ultimately the files on the upgrade server need to overwrite the files on the WEB server, and the webmaster's computer is merely a relay. If the files on the upgrade server could be copied directly to the WEB server (skipping the webmaster), automatic upgrading would be achieved.
Fortunately, the system ships with the Microsoft.XMLHTTP component for WEB access. It can be called from ASP to connect to the upgrade server and download upgrade files.
The following code is an example of downloading a file using Microsoft.XMLHTTP:
<%
Set xPost = CreateObject("Microsoft.XMLHTTP")
xPost.Open "GET", "http://www.0x54.org/test.exe", False
xPost.Send()
Set sGet = CreateObject("ADODB.Stream")
sGet.Mode = 3   ' read/write
sGet.Type = 1   ' binary
sGet.Open()
sGet.Write(xPost.responseBody)
sGet.SaveToFile Server.MapPath("update.exe"), 2   ' overwrite if the file exists
Set sGet = Nothing
Set xPost = Nothing
Response.Write("Download the file successfully!<br>")
%>
The code above saves http://www.0x54.org/test.exe into the current directory of the WEB server as update.exe. For more on what Microsoft.XMLHTTP can do, see MSDN.
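The example assumes the request succeeds. As a minimal sketch (my addition, not part of the original article), you can check the HTTP status code that Microsoft.XMLHTTP reports before saving, so a failed download does not overwrite update.exe with an error page:

```vbscript
<%
Set xPost = CreateObject("Microsoft.XMLHTTP")
xPost.Open "GET", "http://www.0x54.org/test.exe", False
xPost.Send()

' Only save the response when the server answered 200 OK.
If xPost.Status = 200 Then
    Set sGet = CreateObject("ADODB.Stream")
    sGet.Mode = 3   ' read/write
    sGet.Type = 1   ' binary
    sGet.Open()
    sGet.Write(xPost.responseBody)
    sGet.SaveToFile Server.MapPath("update.exe"), 2   ' overwrite if the file exists
    sGet.Close()
    Set sGet = Nothing
    Response.Write("Download the file successfully!<br>")
Else
    Response.Write("Download failed, HTTP status: " & xPost.Status & "<br>")
End If
Set xPost = Nothing
%>
```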
If there are many files, Microsoft.XMLHTTP has to be called once per file, and any one of those connections may fail, leaving some files un-updated. To avoid this, it is best to pack all the files into a single package, download it to the WEB server in one go, and then unpack it there.
Haha, the "package" here is not a RAR or ZIP archive but a format we define ourselves: for example, splice all the files into one and separate them again at special markers. Actually, it is not even that troublesome, because a ready-made method exists. The approach we use is to put every file (in binary form), together with its path information, into an Access database.
The following VBS script (taken from Ocean Top 2006Plus) packages all the files in the current directory:
Dim i, n, ws, fsoX, thePath
Set ws = CreateObject("WScript.Shell")
Set fsoX = CreateObject("Scripting.FileSystemObject")
' "cmd /c cd" prints the current directory followed by CR/LF;
' cut at the CR, then append the trailing backslash.
thePath = ws.Exec("cmd /c cd").StdOut.ReadAll()
i = InStr(thePath, Chr(13))
thePath = Left(thePath, i - 1) & "\"
n = Len(thePath)
On Error Resume Next
' addToMdb is defined elsewhere in the original script; it stores each
' file (in binary form) and its path in the Access database.
addToMdb thePath
WScript.Echo "The current directory has been packaged; the package root is the current directory"
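The packing routine addToMdb itself is not shown above. As a rough, hypothetical sketch of what such a routine could look like (the database file update.mdb, the table name FileData, and the field names FilePath and FileContent are all my assumptions, not from the original script), it could recurse through the directory and store each file's relative path and binary content via ADO:

```vbscript
' Hypothetical addToMdb-style packer: database name and schema are assumed.
Sub addToMdb(folderPath)
    Dim conn, rs, fso
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=update.mdb"

    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "FileData", conn, 1, 3   ' adOpenKeyset, adLockOptimistic

    PackFolder fso.GetFolder(folderPath), folderPath, rs

    rs.Close
    conn.Close
End Sub

Sub PackFolder(folder, rootPath, rs)
    Dim f, subFolder, stm
    For Each f In folder.Files
        Set stm = CreateObject("ADODB.Stream")
        stm.Type = 1   ' binary
        stm.Open
        stm.LoadFromFile f.Path
        rs.AddNew
        ' Store the path relative to the package root plus the raw bytes.
        rs("FilePath") = Mid(f.Path, Len(rootPath) + 1)
        rs("FileContent").AppendChunk stm.Read
        rs.Update
        stm.Close
        Set stm = Nothing
    Next
    ' Recurse into subdirectories so the whole tree ends up in the package.
    For Each subFolder In folder.SubFolders
        PackFolder subFolder, rootPath, rs
    Next
End Sub
```

On the unpacking side, the WEB server would read each record back, create the directories from FilePath, and use an ADODB.Stream to write FileContent back to disk, which is the reverse of this sketch.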