We present the design of a multiuser networked wireless system to remotely configure and control the lighting of multiple webcam users at different locations. The system uses a Raspberry Pi and a wireless DMX transmitter as the wireless interface for controlling the DMX webcam lights. The Open Lighting Architecture (OLA) lighting control software runs on the Raspberry Pi, and a web interface issues commands to the OLA API on the Raspberry Pi to control the DMX lights associated with it. Multiple wireless interfaces, one per user at each location, can be configured and managed simultaneously from the web interface. The interactive web interface controls the intensity and color of the DMX lights, follows a model-view-controller design, and makes HTTP calls to the OLA software running on the Raspberry Pi. The proposed system enables an operator to provide optimum and artistic lighting effects for a group of online presenters.
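As a minimal sketch of the control path described above, the snippet below sends a DMX frame to OLA over HTTP. It assumes OLA's built-in web server (olad) is running on the Pi at its default port 9090 and exposes the /set_dmx endpoint; the hostname, universe number, and channel layout are placeholders, not details from the paper.

```python
# Minimal sketch: drive a 3-channel RGB DMX fixture through OLA's HTTP API.
# Assumes olad is running on the Raspberry Pi with its web server on port 9090
# and that the fixture is patched to universe 1; adjust both to your setup.
import requests

OLA_HOST = "raspberrypi.local"   # hostname/IP of the Raspberry Pi (assumption)
UNIVERSE = 1                     # OLA universe patched to the wireless DMX transmitter

def set_rgb(red: int, green: int, blue: int, intensity: int = 255) -> None:
    """Send one DMX frame: channel 1 = intensity, channels 2-4 = RGB (hypothetical patch)."""
    channels = [intensity, red, green, blue]
    requests.post(
        f"http://{OLA_HOST}:9090/set_dmx",
        data={"u": UNIVERSE, "d": ",".join(str(v) for v in channels)},
        timeout=2,
    )

if __name__ == "__main__":
    set_rgb(255, 180, 120)  # warm key light at full intensity
```

A web front end could issue the same kind of call for each presenter's Pi, with one universe per wireless interface.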
Smart Webcam Cover: Exploring the Design of an Intelligent Webcam Cover to Improve Usability and Trust
Laptop webcams can be covertly activated by malware and law enforcement agencies. Consequently, 59% of Americans manually cover their webcams to avoid being surveilled. However, manual covers are prone to human error: in a survey of 200 users, we found that 61.5% occasionally forget to re-attach their cover after using their webcam. To address this problem, we developed Smart Webcam Cover (SWC): a thin film (a PDLC overlay) that covers the webcam by default until a user manually uncovers it, and that automatically re-covers the webcam when it is not in use. Through a two-phase design iteration process, we evaluated SWC with 20 webcam-cover users in a remote study using a video prototype of SWC, compared it to manual operation, and discussed the factors that influence users' trust in the effectiveness of SWC and their perceptions of its utility.
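The abstract does not give implementation details, so the following is a purely illustrative sketch of the auto-cover behavior it describes, assuming a Linux host where webcam activity can be inferred from open handles on /dev/video0 and the PDLC film is switched by some external driver abstracted behind set_cover(). The device path, timeout, and polling loop are all assumptions.

```python
# Illustrative sketch only (not the authors' implementation): re-cover a PDLC
# webcam film automatically when the webcam has been idle. Uncovering stays a
# manual user action, matching the default-covered design described above.
import subprocess
import time

IDLE_TIMEOUT_S = 10  # hypothetical: re-cover after 10 s without webcam use

def webcam_in_use(device: str = "/dev/video0") -> bool:
    """Return True if any process currently holds the webcam device open."""
    result = subprocess.run(["fuser", device], capture_output=True)
    return result.returncode == 0  # fuser exits 0 when the device has users

def set_cover(covered: bool) -> None:
    """Switch the PDLC film; real hardware would drive a relay or film driver here."""
    print("webcam covered" if covered else "webcam uncovered")

def run() -> None:
    last_active = time.time()
    while True:
        if webcam_in_use():
            last_active = time.time()   # user has manually uncovered and is on camera
        elif time.time() - last_active > IDLE_TIMEOUT_S:
            set_cover(True)             # default state: covered when not in use
        time.sleep(1)

if __name__ == "__main__":
    run()
```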
- Award ID(s): 2029519
- PAR ID: 10352044
- Date Published:
- Journal Name: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
- Volume: 5
- Issue: 4
- ISSN: 2474-9567
- Page Range / eLocation ID: 1 to 21
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
The Re-Greening of the West African Sahel has attracted great interdisciplinary interest since it was originally detected in the mid-2000s. Studies have investigated vegetation patterns at regional scales using time series of coarse-resolution remote sensing analyses. Fewer have attempted to explain the processes behind these patterns at local scales. This research investigates bottom-up processes driving Sahelian greening in the northern Central Plateau of Burkina Faso, a region recognized as a greening hot spot. The objective was to understand the relationship between soil and water conservation (SWC) measures and the presence of trees through a comparative case study of three village terroirs, which have been the site of long-term human ecology fieldwork. The research specifically tests the hypothesis that there is a positive relationship between SWC and tree cover. Methods include remote sensing of high-resolution satellite imagery and aerial photos, GIS procedures, and chi-square statistical tests. Results indicate that, across all sites, there is a significant association between SWC and trees (chi-square = 20.144, p ≤ 0.01). Decomposing this by site, however, shows that the association is not uniform. Tree cover is strongly associated with SWC investments in only one village, the one with the most tree cover (chi-square = 39.098, p ≤ 0.01). This pilot study concludes that SWC promotes tree cover, but that this effect is heavily modified by local contexts.
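The chi-square test of association used above can be reproduced in form from a 2x2 contingency table of plots cross-classified by SWC measures and tree cover. The counts below are hypothetical; the study's actual counts are not given in the abstract.

```python
# Hypothetical example of the chi-square test of association described above:
# a 2x2 contingency table of sample plots, cross-classified by presence of
# soil-and-water-conservation (SWC) measures and presence of tree cover.
# The counts are made up for illustration; they are not the study's data.
from scipy.stats import chi2_contingency

table = [
    [120, 45],  # SWC present: [trees present, trees absent]
    [60, 90],   # SWC absent:  [trees present, trees absent]
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.3f}, p = {p:.4f}, dof = {dof}")
```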
We introduce WebGazer, an online eye tracker that uses common webcams already present in laptops and mobile devices to infer the eye-gaze locations of web visitors on a page in real time. The eye tracking model self-calibrates by watching web visitors interact with the web page and trains a mapping between features of the eye and positions on the screen. This approach aims to provide a natural experience to everyday users that is not restricted to laboratories and highly controlled user studies. WebGazer has two key components: a pupil detector that can be combined with any eye detection library, and a gaze estimator using regression analysis informed by user interactions. We perform a large remote online study and a small in-person study to evaluate WebGazer. The findings show that WebGazer can learn from user interactions and that its accuracy is sufficient for approximating the user's gaze. As part of this paper, we release the first eye tracking library that can be easily integrated into any website for real-time gaze interactions, usability studies, or web research.
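WebGazer itself is a JavaScript library; the sketch below only illustrates, in Python with synthetic data, the self-calibration idea the abstract describes: each user interaction pairs the current eye-feature vector with a known screen coordinate, and a regression model learns to map features to gaze positions. Feature dimensions, sample counts, and the ridge regressor are assumptions for illustration.

```python
# Conceptual illustration (not WebGazer's code): self-calibrating gaze estimation.
# Every click pairs an eye-feature vector with the known on-screen click position,
# and a ridge regression maps features to normalized (x, y) gaze coordinates.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical training data gathered during normal browsing:
# 200 interactions, each with a 120-dimensional eye-patch feature vector.
features = rng.normal(size=(200, 120))               # eye features at click time
click_positions = rng.uniform(0, 1, size=(200, 2))   # normalized (x, y) of each click

gaze_model = Ridge(alpha=1.0).fit(features, click_positions)

# At run time, predict where the user is looking from the current eye features.
current_features = rng.normal(size=(1, 120))
x, y = gaze_model.predict(current_features)[0]
print(f"estimated gaze at x={x:.2f}, y={y:.2f} (fraction of screen)")
```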
Full-body tracking in virtual reality improves presence, allows interaction via body postures, and facilitates better social expression among users. However, full-body tracking systems today require a complex setup fixed to the environment (e.g., multiple lighthouses/cameras) and a laborious calibration process, which goes against the desire to make VR systems more portable and integrated. We present HybridTrak, which provides accurate, real-time full-body tracking by augmenting inside-out upper-body VR tracking systems with a single external off-the-shelf RGB web camera. HybridTrak uses a full-neural solution to convert and transform users' 2D full-body poses from the webcam into 3D poses, leveraging the inside-out upper-body tracking data. We showed that HybridTrak is more accurate than RGB- or depth-based tracking methods on the MPI-INF-3DHP dataset. We also tested HybridTrak in the popular VRChat app and showed that body postures presented by HybridTrak are more distinguishable and more natural than a solution using an RGB-D camera.
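The paper's network architecture is not reproduced here; the following is only a schematic sketch, under assumed joint counts and a plain MLP, of the kind of fusion the abstract describes: 2D full-body keypoints from the webcam are combined with 3D upper-body tracking data to regress a full 3D body pose.

```python
# Schematic sketch only (not HybridTrak's actual architecture): fuse 2D webcam
# keypoints with inside-out 3D upper-body tracking to regress a full 3D pose.
# Joint counts and layer sizes are assumptions made for illustration.
import torch
import torch.nn as nn

NUM_2D_JOINTS = 25     # assumed webcam keypoints, (x, y) each
NUM_UPPER_JOINTS = 4   # assumed tracked points (head, hands, hip), (x, y, z) each
NUM_3D_JOINTS = 25     # assumed full-body output joints, (x, y, z) each

class PoseFusionMLP(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        in_dim = NUM_2D_JOINTS * 2 + NUM_UPPER_JOINTS * 3
        self.net = nn.Sequential(
            nn.Linear(in_dim, 512), nn.ReLU(),
            nn.Linear(512, 512), nn.ReLU(),
            nn.Linear(512, NUM_3D_JOINTS * 3),
        )

    def forward(self, pose2d: torch.Tensor, upper3d: torch.Tensor) -> torch.Tensor:
        # Concatenate flattened 2D pose and 3D upper-body data, then regress 3D joints.
        x = torch.cat([pose2d.flatten(1), upper3d.flatten(1)], dim=1)
        return self.net(x).view(-1, NUM_3D_JOINTS, 3)

model = PoseFusionMLP()
pose2d = torch.randn(8, NUM_2D_JOINTS, 2)      # a batch of webcam 2D poses
upper3d = torch.randn(8, NUM_UPPER_JOINTS, 3)  # matching upper-body tracking data
print(model(pose2d, upper3d).shape)            # torch.Size([8, 25, 3])
```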
We introduce SearchGazer, a web-based eye tracker for remote web search studies that uses common webcams already present in laptops and some desktop computers. SearchGazer is a pure JavaScript library that infers the gaze behavior of searchers in real time. The eye tracking model self-calibrates by watching searchers interact with the search pages and trains a mapping from eye features to gaze locations and search page elements on the screen. Unlike typical eye tracking studies in information retrieval, this approach does not require the purchase of any additional specialized equipment and can be done remotely in a user's natural environment, leading to cheaper and easier visual attention studies. While SearchGazer is not intended to be as accurate as specialized eye trackers, it is able to replicate many of the research findings of three seminal information retrieval papers: two that used eye tracking devices, and one that used the mouse cursor as a restricted focus viewer. Charts and heatmaps from those original papers are plotted side-by-side with SearchGazer results. While the main results are similar, there are some notable differences, which we hypothesize derive from improvements in the latest ranking technologies used by current versions of search engines and from diligence by remote users. As part of this paper, we also release SearchGazer as a library that can be integrated into any search page.
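Heatmaps of the kind mentioned above can be built from any stream of estimated gaze points by aggregating samples over page coordinates. The sketch below uses synthetic data and assumed bin sizes; it is not the plotting code from the paper.

```python
# Illustrative only: build a gaze heatmap of the kind used to compare SearchGazer
# results against prior eye-tracking findings. The gaze points here are synthetic.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
# Synthetic gaze samples clustered near the top-left results, mimicking the classic
# attention pattern on a search results page.
gaze_x = np.clip(rng.normal(0.3, 0.15, 5000), 0, 1)
gaze_y = np.clip(rng.normal(0.25, 0.2, 5000), 0, 1)

plt.hist2d(gaze_x, gaze_y, bins=50, cmap="hot")
plt.gca().invert_yaxis()  # page coordinates: y grows downward
plt.xlabel("page x (fraction of width)")
plt.ylabel("page y (fraction of height)")
plt.title("Synthetic gaze heatmap (illustration)")
plt.colorbar(label="gaze samples")
plt.savefig("gaze_heatmap.png")
```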