Title: Triplet Loss-less Center Loss Sampling Strategies in Facial Expression Recognition Scenarios
Award ID(s):
2204445 2232048
PAR ID:
10412061
Author(s) / Creator(s):
Date Published:
Journal Name:
57th Annual Conference on Information Sciences and Systems (CISS)
Page Range / eLocation ID:
1 to 6
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The problem of ordinal classification occurs in a large and growing number of areas. Some of the most common sources and applications of ordinal data include rating scales, medical classification scales, socio-economic scales, meaningful groupings of continuous data, facial emotional intensity, and facial age estimation. The problem of predicting ordinal classes is typically addressed either by performing n-1 binary classifications for n ordinal classes or by treating ordinal classes as continuous values for regression. However, the first strategy does not fully utilize the ordering information of the classes, and the second imposes a strong continuity assumption on the ordinal classes. In this paper, we propose a novel loss function called Ordinal Hyperplane Loss (OHPL) that is specifically designed for data with ordinal classes. OHPL is a significant advancement in predicting ordinal class data, since it enables deep learning techniques to be applied to the ordinal classification problem on both structured and unstructured data. By minimizing OHPL, a deep neural network learns to map data to an optimal space where the distance between points and their class centroids is minimized while a nontrivial ordinal relationship among classes is maintained. Experimental results show that a deep neural network with OHPL not only outperforms the state-of-the-art alternatives on classification accuracy but also scales well to large ordinal classification problems.
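The abstract's idea — pull embeddings toward their class centroids while keeping the centroids in a nontrivial order — can be sketched as a two-term loss. The sketch below is an illustrative approximation under assumed design choices (Euclidean centroid pull, a hinge penalty on centroid projections along the first-to-last centroid direction, a `margin` hyperparameter); it is not the paper's exact OHPL formulation.

```python
import numpy as np

def ohpl_like_loss(embeddings, labels, margin=1.0):
    """Illustrative ordinal loss in the spirit of OHPL (hypothetical
    reconstruction, not the authors' exact definition): a centroid
    pull term plus an ordering penalty on class centroids."""
    classes = np.unique(labels)  # sorted ordinal class labels
    centroids = np.stack(
        [embeddings[labels == c].mean(axis=0) for c in classes]
    )

    # Term 1: mean squared distance of each point to its class centroid.
    index_of = {c: i for i, c in enumerate(classes)}
    idx = np.array([index_of[l] for l in labels])
    pull = np.mean(np.sum((embeddings - centroids[idx]) ** 2, axis=1))

    # Term 2: project centroids onto the direction from the lowest to
    # the highest class centroid; penalize adjacent classes whose
    # projections are out of order or closer than `margin`, which keeps
    # the learned ordinal relationship nontrivial.
    direction = centroids[-1] - centroids[0]
    direction = direction / (np.linalg.norm(direction) + 1e-12)
    proj = centroids @ direction
    order = np.sum(np.maximum(0.0, margin - np.diff(proj)))

    return pull + order
```

In a training loop, `embeddings` would be the network's output for a mini-batch, and the loss would be minimized by backpropagation (with the NumPy operations replaced by their autodiff-framework equivalents).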
  2. Preservation of genetic diversity is critical to the resilience of species in the face of global change. To meet international calls to preserve at least 90% of species’ genetic diversity, researchers and conservationists need a way to reliably predict genetic diversity loss resulting from human activities (1). On page 1431 of this issue, Exposito-Alonso et al. present a mathematical framework that elegantly bridges biodiversity and population genetics theory to model the relationship between genetic diversity and habitat loss (2). This approach builds on methods already used by biodiversity policy experts for predicting species extinctions based on habitat loss (3) and should be useful to those tasked with setting goals for preserving genetic diversity.
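The extinction-prediction methods mentioned above are commonly based on the species-area relationship, a power law S = c · A^z. A minimal sketch of that baseline calculation, assuming the commonly cited exponent z ≈ 0.25 (illustrative only; this is not the framework of Exposito-Alonso et al.):

```python
def fraction_retained(area_fraction: float, z: float = 0.25) -> float:
    """Species-area power law S = c * A**z: the fraction of species
    expected to persist when habitat shrinks to `area_fraction` of its
    original extent. The exponent z = 0.25 is a widely used empirical
    default; the constant c cancels when taking the ratio."""
    return area_fraction ** z

# Example: losing half the habitat (area_fraction = 0.5) predicts
# roughly 84% of species persist under z = 0.25.
```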