Article citation information:

Wnorowski, J., Łebkowski, A. Ship information systems using smartglasses technology. Scientific Journal of Silesian University of Technology. Series Transport. 2018, 100,
211-222. ISSN: 0209-3324.



Jakub WNOROWSKI[1], Andrzej ŁEBKOWSKI[2]






Summary. New technology in the maritime sector often forces shipowners to install many new devices on their ships, which can increase the safety of sea travel. With each additional device on the navigation bridge comes an additional set of data that needs to be observed. The following article describes one of the possibilities of using augmented reality technology to support navigational decisions. The research used "smartglasses" technology and AR glasses from Meta Glasses.

Keywords: augmented reality; AR technology; navigation bridge.





Augmented reality (AR) is a technology that is consistently gaining more users. Every day, it is used in mobile games and car head-up displays (HUDs), and as a modern city guide, where a tablet or smartphone with a suitable application can replace a guidebook. AR technology usually employs a computer, which processes the image obtained from a camera and then overlays digitally generated items on the screen. Nowadays, AR glasses are increasingly used as the display, and companies are outdoing one another in the production of smaller and more comfortable glasses. There are currently models that look similar to ordinary prescription glasses:


Fig. 1. Google “Glass” [16]


AR technology is found not only in the entertainment industry, but also in various professional applications:

·         Healthcare

On a daily basis, medical applications using AR technology can be used to improve our lives. There are many applications on the market which, when integrated with additional devices, e.g., smart bands, can monitor the pulse, count steps or act as a personal trainer. AR technology can also be used to indicate the locations of public medical devices, e.g., defibrillators [12].



Fig. 2. Example view from a defibrillator search application [12]


·       Aviation

In the aviation sector, AR technology can be found at every level, from the design stage of aircraft to management and display devices in pilot cockpits. By modelling a device in virtual space, later defects can be eliminated, and more and more engineering companies are implementing this kind of solution. In the aviation sector, one such company is Pratt & Whitney, which uses AR technology not only in the production of engines, but also to train its mechanics [21].



Fig. 3. An engineer visualizes a mechanical hologram [21]


An interesting solution for displaying data in the pilot's cockpit has been presented by Aero Glass, using smartglasses technology, Android software and a special board that recognizes the position of the pilot's head. Aero Glass can visualize terrain, navigation traffic, instrument, weather and airspace information, with access to vital safety procedures and protocols [22].



Fig. 4. View through Aero Glass smartglasses [22]


·       Transit systems

The best example of the use of AR technology in road transport can be found in various smart GPS concepts. For example, Sygic has created its own application, using a mobile phone's GPS module and camera to position digital elements on a real image. As a result, the driver does not have to focus on reading a map; instead, they follow a virtual path in the preview from the smartphone's camera (Figure 5) [23].



Fig. 5. Sygic AR GPS [23]


Another intelligent GPS approach has been proposed by WayRay. Instead of using phones, the company decided to display data, such as the trajectory of the car's movement or its speed, on the car's windshield. There is also an option to display information on the glass about restaurants, pubs, street names, etc. (Figure 6) [24].



Fig. 6. WayRay AR GPS [24]


·         Maritime sector

AR technology has appeared in the maritime sector relatively recently, but it is growing rapidly. One of the companies dealing with this technology is one that every seafarer knows: MarineTraffic. The company has created an application that uses a smartphone's magnetometer to orient the device relative to the Earth's magnetic field. On this basis, information about vessels, ports and light signals near the device is displayed [20].



Fig. 7. MarineTraffic’s mobile application [20]


Another example of an application using AR technology has recently been presented by Japan's Mitsui O.S.K. Lines and Furuno Electric Co. These two companies decided to jointly develop an application that displays all important information based on data from the AIS. Mitsui O.S.K. Lines announced that it would like to combine this application with data obtained from radars and implement collision avoidance algorithms [27].



Fig. 8. Application from Japan's Mitsui O.S.K. Lines and Furuno Electric Co. [27]





Currently, the navigation bridge is built in such a way that all relevant devices are spread out, so that the navigator has to walk from one side to the other to read the necessary information.



Fig. 9. Example of a navigation bridge [9]


Using AR technology, we are able to display all information in one place. Moreover, the navigation officer does not have to be directly on the navigation bridge to read the data.


·         Head-up display

One of the first projects to display navigational data on the window panes of the navigation bridge was proposed by a team of researchers from the Japanese Institute of Navigation, led by Kenjiro Hikida. They presented a transparent screen on which data such as object names, headings, distances between objects and speeds were displayed [3].
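Distances like those shown on such a transparent screen can be derived directly from two geographic positions. As a rough illustration (this is a generic sketch in Python, not the implementation used by the Japanese team), the great-circle distance between two coordinates can be computed with the haversine formula:

```python
import math

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in nautical miles."""
    R_NM = 3440.065  # mean Earth radius expressed in nautical miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R_NM * math.asin(math.sqrt(a))
```

Since one minute of latitude corresponds to roughly one nautical mile, moving one degree of latitude along a meridian yields a distance of about 60 NM, which is a convenient sanity check.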

One of the companies that has presented its own AR-based decision support system is Rolls-Royce. This system displays the necessary information on a specially crafted bridge window, including the parameters of passing objects, data on hydrometeorological conditions, digital map projections and the distance to nearby objects. Some of the information can be displayed three-dimensionally. This future bridge project is shown in Figures 10 and 11:



Fig. 10. The concept of the navigation bridge from Rolls-Royce (1) [9]



Fig. 11. The concept of the navigation bridge from Rolls-Royce (2) [9]


The above solution looks interesting, but modernizing an existing bridge in this way would be extremely costly. A second issue is that the visibility of the data in full sunlight cannot be determined: all the visualizations were made against a dark background, so it seems that even the creators anticipated that this solution would not work in daylight.

In order to reduce the potential costs of bridge reconstruction, glasses from Meta Glasses were used for the research. Thanks to these glasses, the existing devices do not need to be removed from the navigation bridge; this alone introduces device redundancy, which is very important. In addition, the wearer has access to all the data from anywhere on the ship.


·         AR glasses

The first device using AR technology was presented by Ivan Sutherland at Harvard University in the 1960s. He called it the "Sword of Damocles"; it was used to display a simple wireframe grid to the user [15]. In 2000, Daniel Wagner and Dieter Schmalstieg created the first library enabling the creation of AR applications on mobile phones. With this invention, the popularity of AR began to grow [15].

The first AR glasses were released by Google in 2014. While this made AR available to everyone, the project quickly collapsed because the glasses were found to be uncomfortable. Nevertheless, in subsequent years, companies producing AR glasses began to appear. One of these is Meta Glasses, which has created the "Meta 1" and "Meta 2" glasses. Thanks to the developer versions of these glasses, creating highly advanced AR applications has become possible for ordinary people [15].





During the creation of the research system, the following design assumptions were adopted:

·    The application should process real-time data retrieved from the AIS device, then create a graphical representation based on these data.

·    The user should be able to interact with objects using hand movements, hand gestures and voice commands.

·    The user should have a full 360° field of view; however, objects are displayed depending on the current direction in which the operator's head is turned.

·    The application should allow for observation around the ship, even when the user is not on the navigation bridge.
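The head-direction assumption above amounts to a simple visibility test: an object is drawn only if its bearing falls within the horizontal field of view around the current head yaw. A minimal sketch (in Python, with a hypothetical helper name and an assumed 60° field of view, since neither is specified in the article):

```python
def in_field_of_view(head_yaw_deg, bearing_deg, fov_deg=60.0):
    """Return True if an object at bearing_deg lies inside the horizontal
    field of view of a user whose head points at head_yaw_deg.
    Angles are in degrees, measured clockwise from north."""
    # Wrap the relative bearing into [-180, 180) so the test works
    # across the 0/360 boundary.
    rel = (bearing_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(rel) <= fov_deg / 2.0
```

For example, with the head turned to 350° an object bearing 010° is only 20° off-axis and is drawn, while an object abeam at 090° with the head at 000° is not.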


In order to examine the possibility of using AR technology on a ship, a set of AR glasses from Meta Glasses and the "Unity 3D" game engine were used. The AIS device used for communication conformed to the NMEA 0183 standard. Serial communication, with a program written in C#, was used to read the AIS information. Next, the information was decoded and transferred to the main part of the program. The system schematic is presented in Figure 12.



Fig. 12. Simple system schematic
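The AIS data read over the serial link arrive as NMEA 0183 "!AIVDM" sentences, whose payload packs the binary message into 6-bit ASCII characters. A minimal sketch of this decoding step (written in Python rather than the C# used in the project, and limited to the message type and MMSI fields) could look as follows; the sentence used in the example is a standard one from the public AIVDM documentation:

```python
def payload_to_bits(payload):
    """De-armour an AIVDM payload: each ASCII character carries 6 bits."""
    bits = ""
    for ch in payload:
        v = ord(ch) - 48
        if v > 40:          # characters above 'W' skip 8 code points
            v -= 8
        bits += format(v, "06b")
    return bits

def decode_type_and_mmsi(sentence):
    """Extract the message type (bits 0-5) and MMSI (bits 8-37)
    from a single-fragment !AIVDM sentence."""
    payload = sentence.split(",")[5]
    bits = payload_to_bits(payload)
    return int(bits[0:6], 2), int(bits[8:38], 2)
```

For the documented example sentence "!AIVDM,1,1,,B,177KQJ5000G?tO`K>RA1wUbN0TKH,0*5C", this yields message type 1 (a position report) and MMSI 477553000. A full decoder would also extract position, speed and course, and reassemble multi-fragment messages.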


Each geographic coordinate read from the AIS device was transformed from the geographic coordinate system (Lat, Long) into a Cartesian coordinate system (x, y), using the following mathematical functions, where φ is the latitude in degrees, λ is the longitude in degrees and level is the map level of detail:


x = (λ + 180) / 360

y = 0.5 − ln((1 + sin φ) / (1 − sin φ)) / (4π)

pixelX = x · 256 · 2^level

pixelY = y · 256 · 2^level
In order to automate the conversion of the geographic coordinates into Cartesian ones, the following functions were written in C#:


public void LatLongToPixelXY(double latitude, double longitude, double levelOfDetail, out int pixelX, out int pixelY)
{
    latitude = Clip(latitude, -85.05112878, 85.05112878);
    longitude = Clip(longitude, -180, 180);

    double x = (longitude + 180) / 360;
    double sinLatitude = Math.Sin(latitude * Math.PI / 180);
    double y = 0.5 - Math.Log((1 + sinLatitude) / (1 - sinLatitude)) / (4 * Math.PI);

    uint mapSize = MapSize(levelOfDetail);
    pixelX = (int)Clip(x * mapSize + 0.5, 0, mapSize - 1);
    pixelY = (int)Clip(y * mapSize + 0.5, 0, mapSize - 1);
}

private uint MapSize(double levelOfDetail)
{
    return (uint)(256 * Math.Pow(2, levelOfDetail));
}

private double Clip(double n, double minValue, double maxValue)
{
    return Math.Min(Math.Max(n, minValue), maxValue);
}



The above code is used to transform geographic coordinates into Cartesian ones for the Mercator projection, which is used in marine navigation systems. This is open-source code, which can be found on the Microsoft website [11]. After obtaining the Cartesian coordinates, 3D objects imitating ships were placed in the appropriate places. Each object stored information about the vessel's MMSI number, geographic coordinates, speed and course. A general diagram of the software algorithm is shown in Figure 13.



Fig. 13. Software algorithm
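The coordinate conversion at the heart of the algorithm can be checked offline with a direct Python transcription of the C# listing above (same constants, same clipping; only the language differs):

```python
import math

def lat_long_to_pixel_xy(latitude, longitude, level_of_detail):
    """Mercator (lat, lon) in degrees -> integer pixel coordinates,
    following the Bing Maps tile system used by the application."""
    # Mercator is undefined at the poles; clip to the standard limits.
    latitude = max(min(latitude, 85.05112878), -85.05112878)
    longitude = max(min(longitude, 180.0), -180.0)

    x = (longitude + 180.0) / 360.0
    sin_lat = math.sin(latitude * math.pi / 180.0)
    y = 0.5 - math.log((1 + sin_lat) / (1 - sin_lat)) / (4 * math.pi)

    map_size = 256 * 2 ** level_of_detail
    pixel_x = int(min(max(x * map_size + 0.5, 0), map_size - 1))
    pixel_y = int(min(max(y * map_size + 0.5, 0), map_size - 1))
    return pixel_x, pixel_y
```

At level of detail 1 the map is 512 pixels wide, so the point (0°, 0°) lands exactly at the map centre, (256, 256); differences between two such pixel pairs give the on-screen offsets at which the ship objects are placed.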


Example scenes from the application are shown in Figures 14-15.



Fig. 14. An example view of the navigational situation using AR Goggles



Fig. 15. Window with the parameters of the indicated vessel in the AR display system





The use of glasses together with AR technology as a tool for decision support systems seems to be a good starting point when it comes to using AR on ships for the following reasons:

·    The proposed system does not involve significant costs.

·    A lot of information can be displayed in a very small area using gesture and voice commands.

·    Redundancy of navigation devices is provided.

·    Wide prospects for system expansion.


Regarding the latter, the system can easily be expanded with further interfaces to new data sets. Thanks to the use of AR glasses, it is also possible to display 3D cross-sections of objects, e.g., the profile of the seabed, which would greatly help navigators on offshore ships.





1.             Gierusz W. 2015. “Simulation model of the LNG carrier with podded propulsion. Part 1: Forces generated by pods”. Ocean Engineering 108: 105-114. DOI: 10.1016/j.oceaneng.2015.07.031.

2.             Gierusz W. 2006. “Logic thrust allocation applied to multivariable control of the training ship”. Control Engineering Practice 14(5): 511-524. DOI: 10.1016/j.conengprac.2005.03.005.

3.             Hikida K. 2010. Development of a Shipboard Visual Lookout Support System with Head-up Display. Tokyo: Navigation System Research Group, Navigation and Logistics Engineering Department, National Maritime Research Institute.

4.             Lisowski J. 2012. “The optimal and safe ship trajectories for different forms of neural state constraints”. Mechatronic Systems, Mechanics and Materials. Book series: Solid State Phenomena 180: 64-69. DOI: 10.4028/

5.             Lisowski J. 2012. “Game control methods in avoidance of ships collisions”. Polish Maritime Research 19(1): 3-10. DOI: 10.2478/v10012-012-0016-4.

6.             Lisowski J. 2013. “Sensitivity of Computer Support Game Algorithms of Safe Ship Control”. International Journal of Applied Mathematics and Computer Science 23 (2): 439-446. DOI: 10.2478/amcs-2013-0033.

7.             Lisowski J. 2014. “Computational intelligence methods of a safe ship control”. In XVIII Annual Conference KES-2014 “Knowledge-Based and Intelligent Information & Engineering Systems”: 634-643. DOI: 10.1016/j.procs.2014.08.145.

8.             Moseley K. 2017. “Global smart glasses market 2017 - Google Glass, Carl Zeiss, Vuzix and Sony”. Available at:

9.             Wartsila SAM Electronics. “Navigation”. Available at:

10.         The Maritime Executive. “Rolls-Royce, VTT Unveil Vision of Ship Intelligence”. Available at:

11.         Schwartz J. “Bing Maps tile system”. Available at:

12.         The Medical Futurist Institute. “The top 9 augmented reality companies in healthcare”. Available at:

13.         Shukla A. “IAF order worth Rs 250 crores to Indian industry: Samtel cockpit displays cleared for the Su-30 MKI”. Available at:

14.         Sygic. “Sygic incorporates augmented reality into its GPS navigation app”. Available at:

15.         Charara S., L. Prasuethsut. “Everything you need to know about augmented reality: then, now & next”. Available at:

16.         Meta, SDK 2.5.0 Release Notes. Available at:

17.         Proceedings of the 17th DASC. AIAA/IEEE/SAE. Digital Avionics Systems Conference (Cat. no. 98CH36267). Bellevue, WA. 31 October-7 November 1998. IEEE.

18.         Yoon C., K. Kim, S. Baek, S.Y. Park. 2014. “Development of augmented in-vehicle navigation system for head-up display”. In 2014 International Conference on Information and Communication Technology Convergence (ICTC): 601-602. Busan, South Korea. 22-24 October 2014. IEEE. DOI: 10.1109/ICTC.2014.6983221.

19.         Yoon C., K. Kim, H.S. Park, M.W. Park, S.K. Jung. 2014. “Development of augmented forward collision warning system for head-up display”. In 17th International IEEE Conference on Intelligent Transportation Systems (ITSC): 2277-2279. Qingdao, China. 8-11 October 2014. IEEE. DOI: 10.1109/ITSC.2014.6958054.

20.         MarineTraffic. “Use the augmented reality tool”. Available at:

21.         Woodrow B. “9 companies using augmented and virtual reality in aviation”. Available at:

22.         Aero Glass. “Join the augmented reality revolution in aviation!”. Available at:

23.         GPS World Staff. “Sygic incorporates augmented reality into GPS navigation app”. Available at:

24.         GPS World Staff. “WayRay offers holographic navigation system for car”. Available at:

25.         Startupticker. “Alibaba invests in WayRay”. Available at:

26.         ThinkMobiles. “25 best augmented reality games 2017 for Android and iOS”. Available at:

27.         MarineLog. 2017. “MOL and Furuno to develop Augmented Reality enhanced displays”. Available at:

28.         Babu D., I.M.V.L.R. A. Sidhardhan. 2017. “Effects of intra-household interactions on travel behaviour of working people: a study of Calicut city, India”. European Transport/Trasporti Europei 66(4).



Received 21.03.2018; accepted in revised form 30.08.2018



Scientific Journal of Silesian University of Technology. Series Transport is licensed under a Creative Commons Attribution 4.0 International License

[1] Faculty of Electrical Engineering, Gdynia Maritime University, Morska 83 Street, 81-225 Gdynia, Poland. Email:

[2] Faculty of Electrical Engineering, Gdynia Maritime University, Morska 83 Street, 81-225 Gdynia, Poland. Email: