This article describes the JavaScript events available on touch screens, shared here for your reference. The specific analysis is as follows:
1. Touch events
ontouchstart
ontouchmove
ontouchend
ontouchcancel
Touch-screen browsers currently support these four touch events, including IE. Since touch screens also support mouse events, the firing order needs to be noted: touchstart → mouseover → mousemove → mousedown → mouseup → click
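To observe this firing order, here is a minimal sketch, assuming an element with id "div" like the examples below; it simply logs each event as it fires:

/*** event order check (illustrative sketch) ***/
var div = document.getElementById("div");
var names = ["touchstart", "mouseover", "mousemove", "mousedown", "mouseup", "click"];
for (var i = 0; i < names.length; i++) {
    //Log the event type so the firing order can be seen in the console
    div.addEventListener(names[i], function(e){
        console.log(e.type);
    }, false);
}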
Examples are as follows:
/*** onTouchEvent ***/
var div = document.getElementById("div");
//touchstart is similar to mousedown
div.ontouchstart = function(e){
    //The touches property of the event is an array-like list in which each element represents one touch point at that moment,
    //so you can read every touch point of a multi-touch gesture through touches.
    //Since there is only a single touch point here, we use touches[0] directly
    var touch = e.touches[0];
    //Get the coordinates of the current touch point, equivalent to clientX/clientY
    var x = touch.clientX;
    var y = touch.clientY;
};
//touchmove is similar to mousemove
div.ontouchmove = function(e){
    //Calling preventDefault() in the touchstart and touchmove handlers prevents the browser's
    //default behavior such as zooming and scrolling
    e.preventDefault();
};
//touchend is similar to mouseup
div.ontouchend = function(e){
    //nothing to do
};
2. Gesture Events
Gestures refer to multi-touch operations such as rotating and stretching, for example enlarging or rotating pictures and web pages. A gesture event is triggered when two or more fingers touch the screen at the same time. One thing to pay attention to with scaling is the position coordinates of elements: we usually use offsetX, getBoundingClientRect and similar methods to obtain an element's coordinates, but pages in mobile browsers are often zoomed while in use, so do the coordinates of a zoomed element change? The answer is that there is a difference. A scenario illustrates the problem: after page A loads, JavaScript reads the element's coordinates in the document as (100,100); the user then zooms the page in; when JavaScript outputs the element's coordinates again, they are still (100,100), but the element's response area on the screen is offset according to the zoom ratio. You can open the brick-breaking game demo and zoom in after the page has fully loaded; you will find that you can control the paddle even when your finger touches outside the "touch here" area, because the area has been offset. The offset will always exist unless the page is refreshed or the zoom is restored.
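To see the coordinate behavior described above, here is a minimal sketch, again assuming an element with id "div"; per the description above, the values it logs are not expected to change after the user zooms the page, even though the on-screen response area shifts:

/*** coordinate check (illustrative sketch) ***/
var div = document.getElementById("div");
//getBoundingClientRect gives the element's position relative to the viewport
var rect = div.getBoundingClientRect();
console.log(rect.left, rect.top);
//Logging again after the user zooms the page is expected to print the same values,
//even though the touch-response area on screen has been offset by the zoom ratio.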
/*** onGestureEvent ***/
var div = document.getElementById("div");
div.ongesturechange = function(e){
    //scale represents the scaling ratio produced by the gesture: less than 1 means shrinking,
    //greater than 1 means enlarging; the initial value is 1
    var scale = e.scale;
    //rotation represents the angle of the rotation gesture, with values in [0,360];
    //a positive value means clockwise rotation, a negative value counterclockwise
    var angle = e.rotation;
};
3. Gravity sensing
Gravity sensing is simpler: you only need to add an onorientationchange event handler to the body node. In that handler, the value representing the current phone orientation is read from the window.orientation property. The possible values of window.orientation are as follows:
0: Consistent with the direction when the page was first loaded
-90: Turned clockwise by 90° relative to the original direction
180: Turned 180°
90: Turned 90° counterclockwise
According to my tests, Android 2.1 does not support gravity sensing yet. The above covers the current touch-screen events. These events have not been incorporated into a standard, but they are already widely used. I have only tested on Android 2.1, not in other environments.
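As a minimal illustration of the orientation handling described above, here is a sketch; the handler body is just an assumption for demonstration, and it binds on window, which receives the same orientationchange event as the body attribute mentioned earlier:

/*** onorientationchange (illustrative sketch) ***/
window.onorientationchange = function(){
    //window.orientation holds one of the values listed above: 0, -90, 180 or 90
    var orientation = window.orientation;
    console.log("current orientation: " + orientation);
};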
PS: Here is an online tool for JS events that summarizes the commonly used JS event types and utility functions:
A complete list of javascript events and functions:
http://tools.VeVB.COM/table/javascript_event
I hope this article will be helpful to everyone's JavaScript programming.