OpenVX is a recently ratified standard that was expressly proposed to facilitate the design of computer-vision (CV) applications used in real-time embedded systems. Despite its real-time focus, OpenVX presents several challenges when validating real-time constraints. Many of these challenges are rooted in the fact that OpenVX only implicitly defines any notion of a schedulable entity. Under OpenVX, CV applications are specified in the form of processing graphs that are inherently considered to execute monolithically end-to-end. This monolithic execution hinders parallelism and can lead to significant processing-capacity loss. Prior work partially addressed this problem by treating graph nodes as schedulable entities, but under OpenVX, these nodes represent rather coarse-grained CV functions, so the available parallelism that can be obtained in this way is quite limited. In this paper, a much more fine-grained approach for scheduling OpenVX graphs is proposed. This approach was designed to enable additional parallelism and to eliminate schedulability-related processing-capacity loss that arises when programs execute on both CPUs and graphics processing units (GPUs). Response-time analysis for this new approach is presented and its efficacy is evaluated via a case study involving an actual CV application.
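For concreteness, the sketch below shows how a small OpenVX graph is typically expressed with the standard C API. The particular kernels (a Gaussian blur feeding a Sobel operator), image dimensions, and overall structure are illustrative assumptions, not the paper's case-study application. The single vxProcessGraph() call is the monolithic, end-to-end execution model discussed above; the node-level scheduling of prior work would treat each node below as a schedulable entity, and the finer-grained approach proposed here subdivides that work further.

```c
/* Minimal OpenVX graph sketch (illustrative; not the paper's case study).
 * Pipeline: Gaussian blur -> Sobel gradients.  vxProcessGraph() runs the
 * verified graph end-to-end, i.e., the monolithic execution model. */
#include <VX/vx.h>
#include <stdio.h>

int main(void)
{
    vx_context ctx   = vxCreateContext();
    vx_graph   graph = vxCreateGraph(ctx);

    /* Data objects: input frame, blurred intermediate, Sobel x/y gradients. */
    vx_image in   = vxCreateImage(ctx, 640, 480, VX_DF_IMAGE_U8);
    vx_image blur = vxCreateImage(ctx, 640, 480, VX_DF_IMAGE_U8);
    vx_image gx   = vxCreateImage(ctx, 640, 480, VX_DF_IMAGE_S16);
    vx_image gy   = vxCreateImage(ctx, 640, 480, VX_DF_IMAGE_S16);

    /* Graph nodes: each node is a coarse-grained CV function. */
    vxGaussian3x3Node(graph, in, blur);
    vxSobel3x3Node(graph, blur, gx, gy);

    if (vxVerifyGraph(graph) == VX_SUCCESS) {
        vxProcessGraph(graph);   /* one call processes the whole graph */
    } else {
        fprintf(stderr, "graph verification failed\n");
    }

    vxReleaseContext(&ctx);      /* releases the graph and images it owns */
    return 0;
}
```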
Reflex Theory, Cautionary Tale: Misleading Simplicity in Early Neuroscience
This paper takes an integrated history and philosophy of science approach to the topic of "simplicity out of complexity". The reflex theory was a framework within early twentieth-century psychology and neuroscience that aimed to decompose complex behaviours and neural responses into simple reflexes. It was controversial in its time and did not live up to its own theoretical and empirical ambitions. Examination of this episode raises important questions about the limitations of simplifying strategies and about the relationship between simplification and the engineering approach to biology.
- Award ID(s): 1921821
- PAR ID: 10309297
- Editor(s): Schickore, Jutta
- Date Published:
- Journal Name: Integrated HPS Conference Proceedings
- Volume: 1
- Issue: 1
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Multifunctional crops can simultaneously contribute to multiple societal objectives. As a result, they represent an attractive means for improving rural livelihoods. Moringa oleifera is an example of a multifunctional crop that produces nutritious leaves with uses as food, fodder, and a biostimulant to enhance crop growth. It yields seeds containing a water-purifying coagulant and an oil with cosmetic uses and potential as a biofuel feedstock. Despite Moringa oleifera's (and other multifunctional crops') various Food-Energy-Water uses, optimizing the benefits of its multiple uses and the associated livelihood improvements remains challenging. There is a need for holistic approaches capable of assessing the multifunctionality of agriculture and its livelihood impacts. Therefore, this paper critically evaluates Moringa oleifera's Food-Energy-Water-Livelihood nexus applications to gain insight into the tradeoffs and synergies among its various applications using a systems thinking approach. A systems approach is proposed as a holistic thinking framework that can help navigate the complexity of a crop's multifunctionality. The “Success to the Successful” systems archetype was adopted to capture the competition between the need for leaf yields and the need for seed yields. In areas with energy and water insecurity, Moringa oleifera seed production is recommended for its potential to coproduce oil, the water-purifying coagulant, and a residue that can be applied as a fertilizer. In areas where food insecurity is an issue, focusing on leaf production would be beneficial because of its significance in augmenting food for human consumption and animal feed, and its use as a biostimulant to increase crop yields. A causal loop diagram was found to effectively map the interconnections among the various uses of Moringa oleifera and the associated livelihood improvements. This framework provides stakeholders with a conceptual decision-making tool that can help maximize positive livelihood outcomes. The approach can also be applied to improve the management of other multifunctional crops.
- The approach-avoidance task (AAT) is an implicit task that measures people's behavioral tendencies to approach or avoid stimuli in the environment. In recent years, it has been used successfully to help explain a variety of health problems (e.g., addictions and phobias). Unfortunately, more recent AAT studies have failed to replicate earlier promising findings. One explanation for these replication failures could be that the AAT does not reliably measure approach-avoidance tendencies. Here, we first review the existing literature on the reliability of various versions of the AAT. Next, we examine the AAT's reliability in a large and diverse sample (N = 1077; 248 of whom completed all sessions). Using a smartphone-based, mobile AAT, we measured participants' approach-avoidance tendencies eight times over a period of seven months (one measurement per month) with two distinct stimulus sets (happy/sad expressions and disgusting/neutral stimuli). The mobile AAT's split-half reliability was adequate for face stimuli (r = .85) but low for disgust stimuli (r = .72). Its test–retest reliability based on a single measurement was poor for both stimulus sets (all ICC1s < .3). Its test–retest reliability based on the average of all eight measurements was moderately good for face stimuli (ICCk = .73) but low for disgust stimuli (ICCk = .5). These results suggest that single-measurement AATs could be influenced by unexplained temporal fluctuations of approach-avoidance tendencies, which could be examined in future studies. Until then, this work suggests that future research using the AAT should rely on multiple rather than single measurements. (The standard split-half and intraclass-correlation formulas behind these reliability statistics are sketched after this list.)
- The use of Industry Foundation Classes (IFC) data can facilitate interoperability of building information modeling (BIM) among different applications and thereby alleviate problems of missing or inconsistent information. Because the format is transparent and open, IFC data can be opened and viewed in any text editor. However, manually interpreting IFC data normally requires significant effort, due to (1) the large number of entities and (2) the complex connections between one entity and another. On the other hand, the explanations of IFC entities in the IFC schema specifications are difficult to understand or verify. To address these difficulties, this paper proposes an empirical, data-driven approach for achieving a systematic understanding of entity definitions in an IFC schema. The approach uses IFC data and the schema in a synergistic way to facilitate such systematic understanding. Experimental testing serves to verify and accumulate the understanding, and byproduct BIM tools are developed along the way. The proposed approach was tested on understanding entities for geometric representations in the IFC2X3_TC1 schema. Through the experimental testing, a systematic understanding of 62 IFC entities was obtained, and a visualization algorithm was developed and implemented based on this understanding. (A minimal sketch of this kind of data-driven scanning of IFC files appears after this list.)
- The concept of a blockchain was invented by Satoshi Nakamoto to maintain a distributed ledger. In addition to its security, important performance measures of a blockchain protocol are its transaction throughput and confirmation latency. In a decentralized setting, these measures are limited by two underlying physical network attributes: communication capacity and speed-of-light propagation delay. In this work, we introduce Prism, a new proof-of-work blockchain protocol that can achieve (1) security against up to 50% adversarial hashing power; (2) optimal throughput up to the capacity C of the network; (3) confirmation latency for honest transactions proportional to the propagation delay D, with confirmation error probability exponentially small in the bandwidth-delay product CD; and (4) eventual total ordering of all transactions. Our approach to the design of this protocol is based on deconstructing Nakamoto's blockchain into its basic functionalities and systematically scaling up these functionalities to approach their physical limits. (These guarantees are restated compactly in asymptotic form after this list.)
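As background for the reliability statistics quoted in the approach-avoidance task entry above, the following are the standard textbook definitions of Spearman-Brown-corrected split-half reliability and of one-way intraclass correlations for single and averaged measurements; the study may use variant estimators, so this is only a reference sketch.

```latex
% Standard definitions (the study may use variant estimators).
% Split-half reliability with Spearman-Brown correction, where r_h is the
% correlation between scores computed from the two halves of the task:
\[ r_{SB} = \frac{2\,r_h}{1 + r_h} \]
% One-way random-effects intraclass correlations over k sessions (here k = 8),
% with MS_B and MS_W the between- and within-person mean squares:
\[ \mathrm{ICC1} = \frac{MS_B - MS_W}{MS_B + (k - 1)\,MS_W}, \qquad
   \mathrm{ICCk} = \frac{MS_B - MS_W}{MS_B} \]
```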
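To make the empirical, data-driven idea in the IFC entry above concrete, here is a minimal sketch (not the authors' tool) that scans an IFC STEP Physical File and tallies how often each entity type occurs. Instance lines in such files have the form #12=IFCCARTESIANPOINT((0.,0.,0.));, and the sheer number of entity types plus the #-numbered cross-references between them are what make manual reading laborious. The file name, buffer sizes, and fixed-size lookup table are assumptions made for brevity.

```c
/* Minimal sketch (not the paper's tool): count IFC entity types in a STEP
 * Physical File (.ifc).  Instance lines look like
 *     #12=IFCCARTESIANPOINT((0.,0.,0.));
 * so we read the token between '=' and '(' on lines that start with '#'. */
#include <stdio.h>
#include <string.h>
#include <ctype.h>

#define MAX_TYPES 1024   /* assumed upper bound on distinct entity types */
#define NAME_LEN  64

static char names[MAX_TYPES][NAME_LEN];
static long counts[MAX_TYPES];
static int  ntypes = 0;

static void tally(const char *name)
{
    for (int i = 0; i < ntypes; i++)
        if (strcmp(names[i], name) == 0) { counts[i]++; return; }
    if (ntypes < MAX_TYPES) {
        strncpy(names[ntypes], name, NAME_LEN - 1);
        names[ntypes][NAME_LEN - 1] = '\0';
        counts[ntypes++] = 1;
    }
}

int main(int argc, char **argv)
{
    /* "model.ifc" is a placeholder file name. */
    FILE *f = fopen(argc > 1 ? argv[1] : "model.ifc", "r");
    if (!f) { perror("fopen"); return 1; }

    char line[4096];
    while (fgets(line, sizeof line, f)) {
        if (line[0] != '#') continue;             /* only instance lines */
        char *eq = strchr(line, '=');
        if (!eq) continue;
        char *p = eq + 1;
        while (*p && isspace((unsigned char)*p)) p++;
        char name[NAME_LEN];
        int  n = 0;
        while (*p && (isalnum((unsigned char)*p) || *p == '_') && n < NAME_LEN - 1)
            name[n++] = (char)toupper((unsigned char)*p++);
        name[n] = '\0';
        if (n > 0) tally(name);                   /* e.g., IFCEXTRUDEDAREASOLID */
    }
    fclose(f);

    for (int i = 0; i < ntypes; i++)
        printf("%-40s %ld\n", names[i], counts[i]);
    return 0;
}
```

A tally like this, run over many real IFC files, is one simple way to decide which entities are worth studying first, which is in the spirit of the data-driven understanding described above.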
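Finally, the guarantees listed in the Prism entry above can be restated compactly in asymptotic form; this is only a paraphrase of the abstract's claims, not the paper's exact theorem statements.

```latex
% Compact paraphrase of the stated Prism guarantees (not exact theorem statements).
% For network capacity C and propagation delay D:
\[ \text{adversarial hashing power tolerated: } \beta < \tfrac{1}{2} \]
\[ \text{throughput: } \lambda = \Theta(C), \qquad
   \text{confirmation latency: } \tau = O(D) \]
\[ \text{confirmation error probability: } \varepsilon = e^{-\Omega(CD)} \]
```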