International Conference


Webized 3D experience by HTML5 annotation in 3D Web

posted Jun 16, 2015, 8:10 PM by Byounghyun Yoo   [ updated Jul 7, 2015, 6:19 PM ]

Daeil Seo, Byounghyun Yoo*, and Heedong Ko, ACM International Conference on 3D Web Technology (Web3D), Heraklion, Crete, Greece, pp.73-80, June 18-21, 2015. 
(*Corresponding author) 

Abstract: With the development of 3D Web technologies, 3D objects are now handled as embedded objects on web pages without plug-ins. Although declarative 3D objects are physically integrated into web pages, 3D objects and HTML elements remain separated from the perspective of the 3D layout context, and an annotation method is lacking. Thus, it is scarcely possible to add meaningful annotations related to target 3D objects using existing web resources. In addition, people often lose the relationship between the target and related annotation objects in a 3D context because the content layouts are separated into different 3D contexts. In this paper, we propose a webizing method for annotating user experiences with 3D objects in a 3D Web environment. The relationship between the 3D target object and the annotation object is declared by means of web annotations, and these related objects are rendered within a common 3D layout context and camera perspective. We present typical cases of 3D scenes with web annotations on the 3D Web using a prototype implementation system to verify the usefulness of our approach.

Keywords: 3D Web, annotation, display algorithms, HTML5, user experience, virtual reality, webizing 

DOI: 10.1145/2775292.2775301

Conference website: http://web3d2015.web3d.org

Tour Cloud Mobile: Helping tourists acquire the information effectively using three types of views

posted Oct 7, 2014, 7:52 PM by Byounghyun Yoo   [ updated Aug 29, 2015, 9:50 AM ]

Jungbin Kim, Byounghyun Yoo*, and Heedong Ko, IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, pp.673-674, January 9-12, 2015. 
(*Corresponding author) 

Abstract: This paper presents a mobile application that helps tourists acquire information using three types of views—AR, map, and list view—each of which has different drawbacks and benefits. To deal with the disadvantages of each view, we propose a view-management component to change the layout depending on the context. 

Keywords: content, tour, mobile AR, view management, list, map

Conference website: http://www.icce.org

Collective heterogeneous sensor mashup for enriched personal healthcare activity logging

posted Jul 26, 2014, 6:26 PM by Byounghyun Yoo   [ updated Aug 29, 2015, 9:50 AM ]

Daeil Seo, Byounghyun Yoo*, and Heedong Ko, IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, pp.34-35, January 9-12, 2015. 
(*Corresponding author) 

Abstract: There has been a proliferation of wearable devices and mobile apps for activity monitoring. However, no model has been put forth to integrate the monitored data in a complementary way. We propose an enriched activity model that reorganizes personal activity logs from heterogeneous sensors and visualizes the logs in a harmonized way with a meaningful summary. 

Keywords: healthcare system, healthcare informatics

Conference website: http://www.icce.org

Visual interaction for spatiotemporal content using zoom and pan with level-of-detail

posted Jul 5, 2014, 1:05 AM by Byounghyun Yoo   [ updated Aug 29, 2015, 9:50 AM ]

Daeil Seo, Byounghyun Yoo*, and Heedong Ko, IEEE Visualization Conference (VIS), Paris, France, November 9-14, 2014. 
(*Corresponding author) 

Abstract: It is easy to get lost while browsing and navigating spatiotemporal content, and the problem is exacerbated by the exponential growth of geospatial data with mobile phones and tablets. In this paper, we propose a continuous method of transition visualization with zooming and panning interaction of the maps and timelines on touch-based devices that maintains the spatial and temporal context of spatiotemporal content.

Keywords: spatiotemporal content, level-of-detail, zoom and pan, immersive visual interaction

[DEMO] Insight: Webized mobile AR and real life use cases

posted Jul 5, 2014, 1:04 AM by Byounghyun Yoo   [ updated Jan 18, 2016, 10:34 PM ]

Joohyun Lee, Jungbin Kim, Jinwoo Kim, Sungkuk Chun, Iltae Kim, Junsik Shim, Sangchul Ahn, Byounghyun Yoo*, and Heedong Ko, IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich, Germany, September 10-12, 2014. 
(*Corresponding author) 

Abstract: This demonstration shows a novel approach for webizing mobile augmented reality, which uses HTML as its content structure, along with its real-life use cases. Insight is a mobile AR Web browser that executes HTML5-based AR applications. By extending physical objects and places with Uniform Resource Identifiers (URIs), we can build objects of interest for mobile AR applications as DOM (Document Object Model) elements and control their behavior and user interactions through DOM events in standard HTML documents. A new CSS media type is defined to augment physical objects with virtual objects. In this model, we introduce the concept of PLACE, a model of the physical space in which the user can be located. With this approach, mobile AR applications can be developed seamlessly as common HTML documents within the current Web ecosystem. The advantages of webized mobile AR, which can utilize all kinds of Web resources without rework, and its high productivity arising from the seamless development of AR content in HTML documents, are shown with real-life use cases in various domains such as shopping, entertainment, education, and manufacturing. 

Keywords: authoring environment, content structure, mobile AR, real life, Web architecture, World Wide Web

View management for webized mobile AR contents

posted Jun 17, 2014, 7:31 PM by Byounghyun Yoo   [ updated Apr 5, 2015, 6:10 PM ]

Jungbin Kim, Joohyun Lee, Byounghyun Yoo*, Sangchul Ahn, and Heedong Ko, IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich, Germany, September 10-12, 2014. 
(*Corresponding author) 

Abstract: Information presentation techniques are as important in augmented reality applications as they are in the traditional desktop user interface (WIMP) and the Web user interface. This paper introduces view management for a Web-based mobile augmented reality platform. We use a webized mobile AR browser called Insight that separates the application logic, including the tracking engine, from the AR content, so that the view management logic and content are easy to reuse. In addition, the view management can accommodate the in-situ context of an AR application.

Keywords: content, layout management, mobile AR, view management, Web architecture

Integration of X3D Geospatial in a data driven Web application

posted Jun 2, 2014, 8:38 AM by Byounghyun Yoo   [ updated Jun 16, 2015, 8:31 PM ]

Michael McCann, Byounghyun Yoo, and Don Brutzman, ACM International Conference on 3D Web Technology (Web3D), Vancouver, Canada, pp.145-145, August 8-10, 2014.  

Abstract: The Monterey Bay Aquarium Research Institute designed the Spatial Temporal Oceanographic Query System (STOQS) to create new capabilities for scientists to gain insight from their data. STOQS employs open standards and is a 100% free and open source project. It includes a Web-based graphical user interface where X3D Geospatial has been integrated to enable 3D geospatial data visualization. 

Keywords: geospatial methodology and techniques, oceanography, sensor data, X3D, X3DOM

Conference website: http://web3d2014.web3d.org

A comparative study of 3D Web integration models for the Sensor Web

posted May 24, 2013, 5:10 PM by Byounghyun Yoo   [ updated Jun 16, 2015, 8:28 PM ]

Sangchul Ahn, Byounghyun Yoo*, and Heedong Ko, ACM International Conference on 3D Web Technology (Web3D), San Sebastian, Spain, pp.199-203, June 20-22, 2013. 
(*Corresponding author) 

Abstract: To facilitate the dynamic exploration of the Sensor Web and to allow users to seamlessly focus on a particular sensor system from a set of registered sensor networks deployed across the globe, interactive 3D graphics on the Web is necessary, enabling sensor data exploration with a good level of detail for a multi-scaled Sensor Web. We conducted a comparative study of recent approaches that integrate Sensor Web information with the latest 3D Web technology and the geospatial Web. We implemented prototype systems in three different 3D Web integration models, i.e., a common X3D model, the X3DOM integration model, and a webized AR content model. This paper presents examples of our prototype implementations using these approaches and discusses the lessons learned.

Keywords: 3D Web, augmented reality, geospatial Web, hypermedia, HTML 5, Sensor Web 

DOI: 10.1145/2466533.2466562

Conference website: http://web3d2013.web3d.org

Web-based information exploration of Sensor Web using the HTML5/X3D integration model

posted Nov 20, 2012, 8:22 PM by Byounghyun Yoo   [ updated Apr 5, 2015, 6:09 PM ]

Byounghyun Yoo*, IADIS International Conference WWW/Internet, Madrid, Spain, October 18-21, 2012. 
(*Best paper award winner)

Abstract: We investigate how the visualization of sensor resources on a 3D Web-based globe organized by level-of-detail can enhance search and exploration of information by easing the formulation of geospatial queries against the metadata of sensor systems. Our case study provides an approach inspired by geographical mashups in which freely-available functionality and data are flexibly combined. We use PostgreSQL, PostGIS, PHP, X3D-Earth and X3DOM to allow the Web3D standard and its geospatial component to be used for visual exploration and level-of-detail control of a dynamic scene. The proposed approach facilitates the dynamic exploration of the Sensor Web and allows the user to seamlessly focus in on a particular sensor system from a set of registered sensor networks deployed across the globe. We present a prototype metadata exploration system featuring levels-of-detail for a multi-scaled Sensor Web and use it to visually explore sensor data of weather stations.

Keywords: Web3D, Extensible 3D (X3D) Graphics, Sensor Web, X3DOM, HTML5

Conference website: http://www.internet-conf.org

Visualization and level-of-detail of metadata for interactive exploration of Sensor Web

posted Sep 8, 2011, 2:13 AM by Byounghyun Yoo   [ updated Jun 28, 2014, 12:09 AM ]

Byounghyun Yoo, Workshop on Sensor Web Enablement 2011 (SWE 2011), part of The 2011 Cybera Summit on Data For All, Banff, AB, Canada, October 6-7, 2011.

Abstract: There are several issues with Web-based search interfaces on a Sensor Web data infrastructure. It can be difficult to 1) find the proper keywords for the formulation of queries and 2) explore the information if the user does not have previous knowledge about the particular sensor systems providing the information. We investigate how the visualization of metadata on a 3D Web-based globe organized by level-of-detail can enhance search and exploration of information by easing the formulation of geospatial queries against the metadata of sensor systems. Our case study provides an approach inspired by geographical mashups in which freely-available functionality and data are flexibly combined. We use PostgreSQL, PostGIS, PHP and X3DEarth technologies to allow the Web3D standard and its geospatial component to be used for visual exploration and level-of-detail control of a dynamic scene. Our goal is to facilitate the dynamic exploration of the Sensor Web and allow the user to seamlessly focus in on a particular sensor system from a set of registered sensor networks deployed across the globe. We present a prototype metadata exploration system featuring levels-of-detail for a multi-scaled Sensor Web and use it to visually explore personal and project-based local weather stations.

Keywords: Sensor Web data visualization, Sensor Web data discovery and search, level-of-detail, metadata visualization, Web3D standard, Extensible 3D Graphics, X3D Geospatial Component


Presentation video: YouTube
