H-GAN: the power of GANs in your Hands

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/114586
Full metadata record

DC Field: Value (Language)
dc.contributor: 3D Perception Lab (es_ES)
dc.contributor.author: Oprea, Sergiu
dc.contributor.author: Karvounas, Giorgos
dc.contributor.author: Martínez González, Pablo
dc.contributor.author: Kyriazis, Nikolaos
dc.contributor.author: Orts-Escolano, Sergio
dc.contributor.author: Oikonomidis, Iason
dc.contributor.author: Garcia-Garcia, Alberto
dc.contributor.author: Tsoli, Aggeliki
dc.contributor.author: Garcia-Rodriguez, Jose
dc.contributor.author: Argyros, Antonis
dc.contributor.other: Universidad de Alicante. Departamento de Tecnología Informática y Computación (es_ES)
dc.date.accessioned: 2021-04-28T16:34:34Z
dc.date.available: 2021-04-28T16:34:34Z
dc.date.created: 2020
dc.date.issued: 2021
dc.identifier.uri: http://hdl.handle.net/10045/114586
dc.description.abstract: We present HandGAN (H-GAN), a cycle-consistent adversarial learning approach implementing multi-scale perceptual discriminators. It is designed to translate synthetic images of hands to the real domain. Synthetic hands provide complete ground-truth annotations, yet they are not representative of the target distribution of real-world data. We strive to provide the perfect blend of a realistic hand appearance with synthetic annotations. Relying on image-to-image translation, we improve the appearance of synthetic hands to approximate the statistical distribution underlying a collection of real images of hands. H-GAN tackles not only the cross-domain tone mapping but also structural differences in localized areas such as shading discontinuities. Results are evaluated on a qualitative and quantitative basis, improving on previous works. Furthermore, we rely on a hand classification task to show that our generated hands are statistically similar to the real domain of hands. (es_ES)
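For readers unfamiliar with the cycle-consistent objective mentioned in the abstract, the sketch below illustrates the general idea in PyTorch for the synthetic-to-real direction. All names (G_s2r, G_r2s, D_real, lambda_cyc) are hypothetical placeholders; the snippet omits the multi-scale perceptual discriminators that distinguish H-GAN and is not the authors' implementation.

import torch
import torch.nn as nn

def generator_loss(G_s2r, G_r2s, D_real, synth, lambda_cyc=10.0):
    """One synthetic->real direction of a cycle-consistent GAN objective.

    G_s2r / G_r2s: hypothetical generators (synthetic->real, real->synthetic).
    D_real: a discriminator scoring realism in the real-hand domain; the
    paper's multi-scale perceptual discriminators are not reproduced here.
    """
    l1, mse = nn.L1Loss(), nn.MSELoss()

    fake_real = G_s2r(synth)                  # translate synthetic hands toward the real domain
    score = D_real(fake_real)
    adv = mse(score, torch.ones_like(score))  # LSGAN-style adversarial term
    recon = G_r2s(fake_real)                  # map the translation back to the synthetic domain
    cyc = l1(recon, synth)                    # cycle-consistency preserves the synthetic annotations
    return adv + lambda_cyc * cyc

In practice this loss would be summed with its real-to-synthetic counterpart and the two discriminator losses during alternating updates, as is usual for cycle-consistent image-to-image translation.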
dc.description.sponsorship: Spanish Government grant PID2019-104818RB-I00 for the MoDeaAS project, supported with FEDER funds. This work has also been supported by two Spanish national grants for PhD studies, FPU17/00166 and ACIF/2018/197. (es_ES)
dc.language: eng (es_ES)
dc.rights: © Universitat d'Alacant / Universidad de Alicante. Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0) (es_ES)
dc.subject: Synthetic-to-real (es_ES)
dc.subject: Generative adversarial networks (es_ES)
dc.subject: Cycle-consistency (es_ES)
dc.subject: Perceptual discriminator (es_ES)
dc.subject.other: Arquitectura y Tecnología de Computadores (es_ES)
dc.title: H-GAN: the power of GANs in your Hands (es_ES)
dc.type: software (es_ES)
dc.peerreviewed: no (es_ES)
dc.relation.publisherversion: https://arxiv.org/abs/2103.15017 (es_ES)
dc.rights.accessRights: info:eu-repo/semantics/openAccess (es_ES)
dc.relation.projectID: info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/PID2019-104818RB-I00
dc.relation.projectID: info:eu-repo/grantAgreement/MECD//FPU17%2F00166
dc.rights.holder: Universidad de Alicante
dc.rights.holder: Institute of Computer Science, FORTH, Greece
Appears in collections: Registro de Programas de Ordenador y Bases de Datos

Files in this item:

File: HGAN.pdf
Description: Repositorio H-GAN: the power of GANs in your Hands
Size: 581,63 kB
Format: Adobe PDF


This item is licensed under a Creative Commons License.