Research on Algorithm of 2D and 3D Interactive Virtual City System

The integration of 2D GIS and 3D GIS is an important application of current virtual city systems. This paper constructs a virtual city system that integrates 2D GIS and 3D GIS, and on this basis studies and implements visual interaction between 2D GIS and 3D GIS. An interactive algorithm is proposed based on agreed parameters, so that the viewpoint position of the 3D scene can be located from the 2D scene viewpoint parameters, the viewpoint position of the 2D scene can likewise be located from the 3D scene viewpoint parameters, and position synchronization is maintained in both views through an event-trigger mechanism during interaction. When the user manipulates the viewpoint in either 2D GIS or 3D GIS, the other view moves synchronously to the same area and angle. Three different views are used as contrasts to verify the accuracy of the algorithm. The implementation results show that the synchronization algorithm can adapt to various user operations, with feasibility and accuracy.


1 INTRODUCTION
After more than 20 years of development and practical application, 2D GIS has strong capabilities in data presentation and spatial analysis. On the other hand, with the development of 3D virtual reality simulation software, virtual city systems based on various 3D engines have emerged in recent years, enabling users to observe and deal with problems in 3D space more intuitively. After much practice, such systems are widely used in urban construction, land, surveying, emergency response, public security, electricity, and gas. In practical applications, an integrated system that combines the advantages of 2D GIS and 3D GIS is a mainstream solution in the current context, and it is also a hot topic in GIS research. In existing technical solutions, the 3D scene is mainly used for spatial display, representing the visual effects and colourful appearance of urban features, while the 2D scene displays the planar layout of the city and provides functions such as spatial query, statistics, and analysis.
At present, integrated 2D GIS and 3D GIS systems are widely used in virtual city systems. The papers [1][2][3] realized integration systems in different fields, but did not consider the relationship between the two modules. The papers [4][5][6] proposed requirements and ideas for interaction in integrated systems, but did not analyse the implementation of the interactive algorithm in depth, and the accuracy and positioning of the interaction in different situations were not considered. Consistent interaction between 2D GIS and 3D GIS involves two levels: first, the geographic coordinates of 2D scenes and 3D scenes correspond one to one; second, the results of analyses on 2D scenes and 3D scenes are consistent, and the various spatial feature models can correspond to the layer data of 2D scenes. Accordingly, the content of 2D&3D interaction is divided into the visual level and the data/spatial-analysis level. This paper focuses on the visual level and proposes a 2D&3D interactive algorithm on that basis. When the 2D scene is operated, the geographic coordinates, viewpoint angle, and viewpoint height are transmitted to the 3D scene window, and the geographical coordinates of the centre point of the 2D scene are matched with the spatial coordinate positions of the 3D scene. When the 3D scene is operated, the ground coordinates and the elevation of the viewpoint are transmitted to the 2D scene window. Through the event-triggering mechanism during interaction, position changes in both views are synchronized. The algorithm is applied in a virtual city system based on an open-source 3D virtual scene engine and a 2D GIS engine.

The basis of the virtual city system
With the development of GIS, various kinds of 2D&3D GIS software have emerged at home and abroad, and most of them have secondary development interfaces. Usually there are two approaches to secondary development. One is based on the scripting language provided by the GIS software, such as MapBasic in the MapInfo platform or VBA in the ArcInfo platform. Although this development method is quick and simple, it cannot meet users' further needs in terms of scalability and efficiency. The other is to use a general-purpose development language (VB, C++, C#, JavaScript, etc.) to call the GIS software's interfaces for secondary development. This article uses the latter approach.
Currently common 3D GIS software includes GoogleEarth, Virtual3D, and Skyline; 2D GIS software includes ArcGIS, QGIS, and SuperMap. These software platforms have their own advantages and application fields. Considering development difficulty and open-source licensing, the 2D&3D interactive system studied in this paper builds its own 2D GIS and 3D GIS systems on OpenLayers and OpenSceneGraph (OSG), both of which provide open-source interfaces. The program framework, constructed in C++, calls the relevant engine APIs, and full correspondence between the 2D vector maps and the 3D scenes is implemented.

Scene construction in virtual city system
To achieve an integrated 2D&3D system, we must first ensure that both sides are based on the same GIS data sources. The data of the 2D system consist mainly of vector data such as points, lines, and polygons, plus raster data such as orthophotographs, forming a flat representation of the spatial world. The 3D system is a three-dimensional representation of the real world and has spatial continuity. Currently, most 3D systems overlay 2D data on the digital elevation model (DEM) for display, and also integrate three-dimensional architectural models and other data. When constructing a three-dimensional virtual city, 3D geometric models of the feature elements (buildings, roads, bridges, rivers, etc.) in the target area are established with CAD software. These models are overlaid on DEM data with correct geographic positions; the satellite imagery is then draped over the corresponding DEM, and real textures are applied to the 3D models. The effect is shown in Figure 1.

The principle and approach of 2D&3D interaction
The planar map in a 2D scene is a projection of points on the surface of the earth onto a plane in accordance with certain mathematical and cartographic generalization rules. The geographical distribution and composition of objects (cities, architecture, functional objects, landscapes, etc.) are represented by visual symbols. Since map projection maps spherical coordinates to rectangular coordinates, it produces a certain amount of deformation. For a limited area, because the map uses a strict mathematical model, it can support various mathematical transformations and operations, such as distance measurement, statistics, and relationship analysis. The 3D scene display system uses computer graphics technology to project 3D space onto the screen according to visual principles, with a sense of space, a sense of distance, and the perspective characteristic that near objects appear large and far objects small, thus simulating an effect similar to human vision [7]. Because the mathematical principles of 2D planar maps and 3D spatial scenes differ, the former usually uses a constant scale, where a distance within the screen display range multiplied by a coefficient equals the real distance; a 3D scene with perspective projection, by contrast, has a variable scale. In particular, when the pitch angle of the viewpoint approaches zero, that is, when the line of sight is approximately parallel to the ground, the visible range of the 3D scene extends to infinity, which makes it difficult to express that range in the 2D view.
From the perspective of user visibility, the 2D&3D interactive system means that when the viewpoint in either the 2D or the 3D view is operated, the other view moves synchronously to the same area and angle. In order to make the geographical coordinates in the 2D map correspond to the spatial positions of the 3D virtual city scene, the two scenes need to share the same coordinate space, and through a coordinate conversion mechanism the geographical coordinates of the centre point in the 2D scene and the spatial coordinate position in the 3D scene can be made to correspond. Through the event-triggering mechanism during interaction, position synchronization between the two views is maintained: when the 2D scene is operated, the geographic coordinates, viewpoint angles, and viewpoint heights are transmitted to the 3D view window; when the 3D scene is operated, the geographic coordinates and viewpoint elevations are transmitted to the 2D view window. These are crucial technologies for achieving 2D&3D interaction at the visual level [8]. The approach of the 2D&3D interaction studied in this paper is shown in Figure 2.

The architecture of 2D&3D interactive system
The implementation of the virtual city system is based on a flexible, configurable architecture. In order to reduce coupling between modules, each functional module of the system exists as a plug-in and cooperates with the system MainFrame through a communication protocol. This flexible architecture offers high software reusability, modularity and encapsulation, flexible scalability, and portability. The system uses a defined communication interface as the bridge between the software MainFrame and each sub-function module; each function module acts as a plug-in following the defined communication protocol. The functional modules in this article mainly relate to the 2D and 3D virtual cities. The 2D virtual city system is built with HTML and JavaScript on the open-source OpenLayers engine; a 2D urban system is realized through the organization of data content such as maps, tiles, and image annotations. The 3D city system uses self-built 3D feature models, combined with DEM (Digital Elevation Model) data and satellite imagery, organized in the OSG rendering engine and implemented with C# and OpenGL. The overall system architecture is shown in Figure 3.
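The plug-in relationship described above can be sketched as a minimal C++ contract between the MainFrame and its modules. The interface and method names below are our own illustrative assumptions, not the system's actual API:

```cpp
#include <memory>
#include <string>
#include <utility>
#include <vector>

// Hypothetical contract every function module (2D city, 3D city, ...) implements.
class IModule {
public:
    virtual ~IModule() = default;
    virtual std::string name() const = 0;
    // Entry point for the defined communication protocol
    virtual void onMessage(const std::string& msg) = 0;
};

// The MainFrame knows only the interface, keeping module coupling low.
class MainFrame {
public:
    void registerModule(std::unique_ptr<IModule> m) {
        modules_.push_back(std::move(m));
    }
    void broadcast(const std::string& msg) {  // route a message to every plug-in
        for (auto& m : modules_) m->onMessage(msg);
    }
private:
    std::vector<std::unique_ptr<IModule>> modules_;
};
```

Because the MainFrame depends only on `IModule`, a module can be replaced or added without recompiling the rest of the system, which is the scalability and portability the architecture aims for.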

Viewpoint parameters
The crucial problem to be solved in 2D&3D interaction is to make the display ranges of the 2D scene and the 3D scene correspond precisely, but the difference in imaging principles means that the display ranges cannot be completely consistent, and the positioning parameters of the viewpoints also differ. Based on this, a set of rules and parameter conventions is formulated for the interactive algorithm. In the 3D scene, the viewpoint is above the ground plane; the pitch value ranges from 0 to 90 degrees, where 0 degrees is a level view and 90 degrees is a top-down view; the field of view (FOV) is 60 degrees; and the centre point of the 2D planar map corresponds to the centre point of the 3D scene. The 2D map display and positioning require three parameters (Lon, Lat, H): the longitude and latitude of the map centre, and the view height H. The 3D scene display and positioning need the parameters (x, y, z, yaw, pitch, roll, FOV), which the system fixes as follows: the viewpoint faces true north, that is, yaw = 0; the pitch angle pitch = 45°; the rolling angle along the central axis roll = 0; and the view angle range FOV = 60°.
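The parameter conventions above can be written down as two small structs; the struct and field names are our own illustrative choices, with the agreed constants as defaults:

```cpp
// 2D map positioning parameters (Lon, Lat, H)
struct View2D {
    double lon = 0.0;  // longitude of the map centre (degrees)
    double lat = 0.0;  // latitude of the map centre (degrees)
    double h   = 0.0;  // viewpoint height H above the map plane
};

// 3D scene positioning parameters (x, y, z, yaw, pitch, roll, FOV)
struct View3D {
    double x = 0.0, y = 0.0, z = 0.0;  // viewpoint position in the scene
    double yaw   = 0.0;   // fixed by convention: camera faces true north
    double pitch = 45.0;  // agreed pitch angle (degrees)
    double roll  = 0.0;   // no rolling about the central axis
    double fov   = 60.0;  // agreed field of view (degrees)
};
```

Fixing yaw, pitch, roll, and FOV leaves only (x, y, z) and (Lon, Lat, H) free, which is what makes the two locating problems in the following sections well-posed.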

Coordinate system and coordinate conversion
The commonly used coordinate systems in GIS are the reference-ellipsoid-centric coordinate system, the geocentric coordinate system, and the plane projected coordinate system. With the popularization of GPS technology, the geocentric coordinate system represented by the WGS-84 coordinate system has been widely used in GIS systems [9]. This paper is based on the WGS-84 coordinate system; to facilitate calculation, the geodetic coordinates of WGS-84 need to be converted into a geocentric spatial Cartesian coordinate system. The conversion is shown in formulas (1) and (2). The geodetic coordinates (B, L, H) are converted into spatial Cartesian coordinates (X, Y, Z):

X = (N + H) cos B cos L
Y = (N + H) cos B sin L        (1)
Z = [N(1 − e²) + H] sin B

N = a / √(1 − e² sin²B)        (2)

where e is the first eccentricity of the ellipsoid, a is the semi-major axis, and N is the radius of curvature in the prime vertical.
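A minimal sketch of this conversion with the standard WGS-84 constants; the function and variable names are ours:

```cpp
#include <cmath>

constexpr double kA  = 6378137.0;         // WGS-84 semi-major axis a (metres)
constexpr double kE2 = 0.00669437999014;  // first eccentricity squared e^2

// Geodetic (B = latitude, L = longitude, H = ellipsoidal height) to
// geocentric Cartesian (X, Y, Z), following formulas (1) and (2).
void geodeticToCartesian(double latDeg, double lonDeg, double h,
                         double& x, double& y, double& z) {
    const double kPi = 3.14159265358979323846;
    const double B = latDeg * kPi / 180.0;
    const double L = lonDeg * kPi / 180.0;
    // N: radius of curvature in the prime vertical, formula (2)
    const double N = kA / std::sqrt(1.0 - kE2 * std::sin(B) * std::sin(B));
    x = (N + h) * std::cos(B) * std::cos(L);
    y = (N + h) * std::cos(B) * std::sin(L);
    z = (N * (1.0 - kE2) + h) * std::sin(B);
}
```

As a sanity check, a point on the equator at zero longitude and zero height maps to (a, 0, 0), and a point at the pole maps to Z equal to the semi-minor axis.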
The interactive algorithm consists of two parts: one locates the 2D scene viewpoint from the 3D scene view, and the other locates the 3D scene viewpoint from the 2D scene view.

3D scene locates 2D scene
Locating the 2D scene from the 3D scene means calculating the position of the centre point of the 2D scene and the map display range from the attitude information of the 3D scene, such as viewpoint position, angle, and height. Figure 4 shows the algorithm schematic. P is the viewpoint of the 2D scene, M is the centre point of the map in the 2D scene, and the plane ABEF is the visible area of the map. From the characteristics of 2D GIS, PM is always perpendicular to the plane ABEF. The point O is the viewpoint of the 3D scene; the line OM is the direction of sight; FOV is the view angle of the viewpoint; the lines OC and OD are the upper and lower limits of sight; and pitch is the angle between the line of sight and the horizon. The problem is thus transformed into: given the coordinates of the viewpoint O and the sight line OM in the 3D scene, solve for the x and y coordinates of the point P and the length of the line PM. The length of PM is related to the display ratio of the 2D planar map system, and different 2D GIS platforms have different ratios of viewpoint height to visible range. In this paper, only the length of the line AB is solved, and the length of PM is then obtained through a coefficient δ determined by the specific 2D GIS system used.
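Under the paper's conventions (yaw = 0, camera facing north), the geometry above reduces to intersecting the sight lines with the ground plane. The sketch below is our reading of that construction; names are illustrative, and it assumes pitch > FOV/2 so the far sight line actually hits the ground:

```cpp
#include <cmath>

struct Locate2DResult { double mx, my, ab; };  // map centre M and extent AB

// Given viewpoint O = (ox, oy, oz) with oz the height above the ground,
// pitch and FOV in degrees, find the 2D map centre M and the ground
// extent AB; PM (the 2D view height) is then delta * AB for the
// platform-specific coefficient delta.
Locate2DResult locate2DFrom3D(double ox, double oy, double oz,
                              double pitchDeg, double fovDeg) {
    const double kPi = 3.14159265358979323846;
    const double pitch = pitchDeg * kPi / 180.0;
    const double half  = 0.5 * fovDeg * kPi / 180.0;
    // The sight line OM meets the ground at horizontal distance oz / tan(pitch).
    const double dCentre = oz / std::tan(pitch);
    // OC and OD (upper/lower FOV limits) meet the ground at pitch -/+ FOV/2;
    // this requires pitch > FOV/2, otherwise the far edge is unbounded.
    const double dFar  = oz / std::tan(pitch - half);
    const double dNear = oz / std::tan(pitch + half);
    return { ox, oy + dCentre, dFar - dNear };  // M lies due north of O (yaw = 0)
}
```

With the agreed pitch of 45° and FOV of 60°, a viewpoint at height 100 m sees a ground strip from about 26.8 m to about 373.2 m ahead, so AB is roughly 346.4 m and M sits exactly 100 m north of the viewpoint's ground footprint.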

2D scene locates 3D scene
Locating the 3D scene from the 2D scene means calculating the position and attitude parameters of the 3D scene from the centre point location and the map display range of the 2D scene. According to the rules agreed in this paper, the pitch angle and FOV are both constant. The problem is translated into: in the 2D scene, the coordinates and height of the viewpoint P are known, and the x, y, z space coordinates of the viewpoint O in the 3D scene are to be solved. Let the coordinates of the viewpoint P of the 2D scene be (x_P, y_P) with height H, and the visible range AB = H/δ.
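Since pitch and FOV are fixed, AB determines the viewpoint height z, and the north-facing convention then fixes the horizontal offset of O from the map centre. This sketch inverts the previous construction; names are again illustrative:

```cpp
#include <cmath>

struct View3DPose { double x, y, z; };  // the 3D viewpoint O to be solved

// From the 2D centre P = (px, py) and visible extent ab, recover the 3D
// viewpoint O under fixed pitch and FOV (degrees); assumes pitch > FOV/2.
View3DPose locate3DFrom2D(double px, double py, double ab,
                          double pitchDeg, double fovDeg) {
    const double kPi = 3.14159265358979323846;
    const double pitch = pitchDeg * kPi / 180.0;
    const double half  = 0.5 * fovDeg * kPi / 180.0;
    // AB = z * (cot(pitch - FOV/2) - cot(pitch + FOV/2))  =>  solve for z
    const double span = 1.0 / std::tan(pitch - half)
                      - 1.0 / std::tan(pitch + half);
    const double z = ab / span;
    // O sits south of the map centre by z / tan(pitch) (camera faces north)
    return { px, py - z / std::tan(pitch), z };
}
```

By construction this is the inverse of the 3D-to-2D step: feeding the AB produced by a viewpoint at height z back into this function returns the same height, which is exactly the round-trip consistency the synchronization relies on.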

4 Application and implementation effect
The interactive response mechanism between the 2D and 3D scenes depends mainly on the unique correspondence between their coordinate systems or the names of geographic objects. The implementation is mainly accomplished by sending messages (message-based) and handling user operations in an event processor (event-driven). When the user roams in the 3D virtual scene, the 2D scene shows the corresponding position and viewpoint height; this is achieved by synchronously tracking the position of the viewpoint in the 3D scene. When the position of the observer changes due to a user operation in the 3D scene, the event processor sends a message to the 2D scene, the position of the viewpoint is obtained through the interactive algorithm, and the viewpoint of the 2D scene jumps to the corresponding position. When the position of the observer changes due to a user operation in the 2D scene, the event processor sends a message to the 3D scene to change the position and direction of the observer synchronously; the position, height, pitch angle, view range, roll angle, and orientation in the 3D scene are obtained by the interactive algorithm, and the viewpoint of the 3D virtual scene jumps to the corresponding position.
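The message-based, event-driven loop described above can be sketched as a small publish/subscribe mechanism in which each view registers a handler and the event processor broadcasts viewpoint changes. All names here are our own illustrative assumptions:

```cpp
#include <functional>
#include <utility>
#include <vector>

// A viewpoint-change message; from3D marks which view the user operated.
struct ViewpointEvent { double lon, lat, height; bool from3D; };

class EventProcessor {
public:
    using Handler = std::function<void(const ViewpointEvent&)>;
    void subscribe(Handler h) { handlers_.push_back(std::move(h)); }
    // Broadcast the change so the other view can reposition itself via the
    // interactive algorithm (3D->2D or 2D->3D locating, as appropriate).
    void post(const ViewpointEvent& e) {
        for (auto& h : handlers_) h(e);
    }
private:
    std::vector<Handler> handlers_;
};
```

In use, the 2D view would subscribe a handler that ignores events with `from3D == false` and repositions its map centre otherwise, and symmetrically for the 3D view, so each user operation in one window triggers exactly one synchronized jump in the other.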
Taking a virtual city region as the experimental object, a virtual city including a 2D map and a 3D scene was constructed based on the open-source 2D and 3D graphics engines, and synchronous interactive response between the 2D map and the 3D scene was realized. The results are shown in Figure 5; on the left is the 3D scene view and on the right is the 2D scene view. In order to verify the accuracy of the algorithm, three different views are used as contrasts. It can be seen that the display scopes of the 2D scene and the 3D scene are basically the same at different angles, and the target building is always in the centre of the scene, avoiding scene mismatch.

5 Conclusion
The integration of 2D map scenes and 3D virtual city scenes is an important application of current GIS. It is of great significance to establish consistency of data representation and spatial analysis between the 2D GIS platform and the 3D GIS platform through an interactive mechanism. In order to make the geographical coordinates in the 2D map correspond to the spatial positions of the 3D virtual city, this paper proposed an interactive algorithm on the basis of a virtual city system built from a 2D map and a 3D virtual scene. When the 2D scene is operated, the angle and height of the viewpoint are transmitted to the 3D scene window, and the geographical coordinates in the 2D scene correspond to the spatial coordinates in the 3D scene. When the 3D scene is operated, the geographic coordinates and viewpoint elevations are transmitted to the 2D scene window. Through the event-triggering mechanism during interaction, the positions of the two views change synchronously: when the user manipulates one view in 2D or 3D, the other moves synchronously to the same area and angle. The results of three different situations show that the algorithm can synchronize viewpoints under various user operations, with practicality and broad application prospects.