University of Applied Sciences and Arts Western Switzerland, Fribourg Campus
E-mails: Omar.Aboukhaled@hefr.ch | Omar.Aboukhaled@hes-so.ch
Google Scholar ID: https://scholar.google.com/citations?user=AYzqC1wAAAAJ
People HES-SO: https://people.hes-so.ch/fr/profile/omar.aboukhaled?type=direct
Omar Abou Khaled
Boulevard de Pérolles 80
CP 32 – 1705 Fribourg
Omar Abou Khaled is Professor of Computer Science at HES-SO, Fribourg campus (EIA-FR). He holds a PhD in Computer Science from the Perception and Automatic Control Group of the HEUDIASYC Laboratory of the « Université de Technologie de Compiègne », as well as a Master in Computer Science from the same university and a Master in Computer Engineering (1991). Since 1996 he has worked as a research assistant in the MEDIA group of the Theoretical Computer Science Laboratory of EPFL (Ecole Polytechnique Fédérale de Lausanne) in the field of Educational Technologies and Web Based Training, on the MEDIT and CLASSROOM 2000 projects.
He was International Advisor at HES-SO until August 2014 and Head of the International Relations Office at EIA-FR, where he also headed the MRU « Advanced IT Systems Architects ». Until 2007 he led the MISG (Multimedia Information System Group). He is responsible for several projects in the fields of Document Engineering, Multimodal Interfaces, Context Awareness, Ambient Intelligence, Blended Learning, and Content-Based Multimedia Retrieval, and teaches several courses on Information Systems, Web Technologies, and Multimodal Interfaces.
He was involved in the development of PRO-LAB-II within the Pro-Art project, a subgroup of the European Prometheus Eurêka project, which aims to increase traffic safety. He also takes part in many scientific activities: coordinator of several networks of excellence (XML Academy Suisse Romande), Scientific Committee member of ANR/CONTINT, of RCSO-TIC (Réseau de Compétences – Technologies de l’information et de la communication de la HES-SO), and of several conferences such as TEI (International Conference on Tangible and Embedded Interaction), IHCI2007 (Interfaces and Human Computer Interaction 2007), and CIDE (Conférence Internationale sur le Document Electronique). Finally, he has more than 100 international publications related to his research domains and is co-editor of a special issue of « Document Numérique » on the theme « Document & Education ».
Deeplink is an artificial-intelligence startup specialised in chatbot technology. The company makes heavy use of NLU (Natural Language Understanding) to determine the user's intent when communicating with the chatbot. Currently the NLU solution is hosted on a dedicated server, and exchanges between it and the client go through HTTP requests.
The idea of this project is to run the NLU solution directly in the client's browser. With such an approach, no request containing sensitive data is ever sent to an external server, since model inference happens in the browser. Besides providing much stronger confidentiality, moving the inference client-side should yield better performance in terms of latency and scalability, and could in the future allow personalised models for each user.
A norovirus outbreak occurred in France, caused by raw shellfish, notably oysters. Switzerland, Sweden, Italy and the Netherlands have all also reported outbreaks linked to live oysters from France. Symptoms such as diarrhea and vomiting, together with the incubation times, are consistent with norovirus or other enteric virus infections.
The number of people in France who have become ill after eating contaminated raw shellfish has risen to more than 1,000. The outbreak has spurred international product recalls, and many media outlets have reported on the norovirus epidemic.
Social media services such as Twitter are valuable sources of information for surveillance systems. A digital syndromic surveillance system has several advantages including its ability to overcome the problem of time delay in traditional surveillance systems.
In this project, a Twitter-based data analysis system was developed to analyze the textual content of a set of tweets and determine whether there is any evidence of the norovirus outbreak in France or Switzerland.
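A first step in such a system is to flag tweets whose text mentions outbreak-related terms. The sketch below illustrates this with a hypothetical keyword lexicon; the report does not specify the actual terms or filtering method used.

```python
import re

# Hypothetical keyword lexicon for illustration only; the real system's
# term list and matching strategy are not given in the report.
KEYWORDS = ["norovirus", "vomiting", "diarrhea", "oysters", "gastro"]
PATTERN = re.compile("|".join(re.escape(k) for k in KEYWORDS), re.IGNORECASE)

def flag_tweet(text: str) -> bool:
    """Return True if the tweet mentions at least one outbreak-related keyword."""
    return PATTERN.search(text) is not None

tweets = [
    "Terrible night of vomiting after eating raw oysters in Brittany...",
    "Beautiful sunset over the lake today!",
]
flagged = [t for t in tweets if flag_tweet(t)]
```

Flagged tweets could then be aggregated by date and location to look for spikes that correlate with the reported outbreak.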
Social networks, and digital communication in general, have evolved at an impressive speed in recent years. They have enabled everyone to stay in constant contact with family members, co-workers or classmates. This technological progress has also brought with it a number of disadvantages, one of them being cyberbullying.
Cyberbullying, which is simply bullying that occurs on digital devices, is primarily directed at teens. In the past, this problem was more or less limited to school boundaries. But unfortunately, technology has removed these boundaries and so the bullying continues unabated, leaving no respite for the victims. The consequences are numerous and this phenomenon has already led to many suicides. It is therefore necessary to be able to detect cyberbullying on social networks and take action accordingly.
During this project, it soon became clear that the lack of existing resources, in particular datasets containing relatively recent cyberbullying texts, would complicate the task. A slightly different approach was therefore adopted: the objective was no longer to detect cyberbullying, but to determine whether or not a text containing insults was hateful.
To do so, around 4’000 tweets were collected and labelled. Different features were extracted from this dataset, and predictions were made mainly with random forest and neural network models. This process made it possible to identify the most useful features, which turned out to be the TF-IDF values. Combining these features with a few others yielded an accuracy of 72.76%, a relatively low score for a binary classification problem.
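For readers unfamiliar with TF-IDF, a minimal stdlib-only sketch of the weighting is shown below. It uses the common tf = count/length, idf = log(N/df) variant; the report does not specify which exact weighting scheme was used.

```python
import math
from collections import Counter

def tfidf(docs):
    """Compute TF-IDF vectors for a list of tokenised documents.
    tf = raw count / document length; idf = log(N / df).
    (One common variant; the report does not state the exact scheme used.)"""
    n = len(docs)
    df = Counter()                      # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    vectors = []
    for doc in docs:
        counts = Counter(doc)
        length = len(doc)
        vectors.append({
            term: (c / length) * math.log(n / df[term])
            for term, c in counts.items()
        })
    return vectors

docs = [["you", "idiot"], ["idiot", "weather"], ["nice", "weather"]]
vecs = tfidf(docs)
```

Terms that appear in every document get an IDF of zero, so the weighting automatically discounts uninformative words while keeping rare, discriminative ones, which is why these values made good classification features.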
It is possible that the current model relies too heavily on the statistics of the individual insults. For example, if an insult appears predominantly in positive samples rather than negative ones, the model will have difficulty correctly predicting samples that contain this insult but whose class should be negative. Of course, the opposite is also true.
The structuring of textual data is paramount in today's technological reality. Textual data is part of our daily life: we write it, text it, tweet it, and receive it in huge volumes, making the realm of data largely dependent on textual content that is highly unstructured by nature and therefore hard to exploit. In a world where Machine Learning, Big Data, and smart assistants are becoming the trend, companies rely on these concepts to grow their businesses. A platform that makes textual analytics accessible to everyone therefore becomes essential, and this is where Wisely comes in. In a nutshell, Wisely provides two of the most used branches of Natural Language Processing: Named Entity Recognition and Natural Language Understanding. Using the platform, a non-technical user can import their own dataset, apply the desired processing, and export the results for future use. This report aims to help you better understand how Wisely works by presenting its implementation details from all aspects.
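To make the Named Entity Recognition idea concrete, here is a deliberately naive, stdlib-only illustration that spots capitalised tokens outside sentence-initial position. It is a toy heuristic for explanation only; a platform like Wisely would use a trained NER model, not this rule.

```python
def naive_ner(text):
    """Toy named-entity spotter: flags title-cased tokens that do not start
    a sentence. Purely illustrative; real NER relies on trained models."""
    tokens = text.split()
    entities = []
    for i, tok in enumerate(tokens):
        word = tok.strip(".,!?")
        prev = tokens[i - 1] if i > 0 else ""
        sentence_start = i == 0 or prev.endswith((".", "!", "?"))
        if word.istitle() and not sentence_start:
            entities.append(word)
    return entities

naive_ner("Yesterday Omar flew from Fribourg to Geneva.")
# → ["Omar", "Fribourg", "Geneva"]
```

The gap between this heuristic (which would miss lowercase entities and flag any mid-sentence capitalised word) and a statistical model is exactly what a managed platform hides from the non-technical user.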
The environmental and societal cost of traditional logistics, based on the hub-and-spoke model, is increasingly unfavourable. Digitalisation enables new approaches that have proven effective in passenger transport (e.g. BlaBlaCar), but an opportunity remains to be seized in the field of freight transport.
Collecting umbilical cord stem cells is valuable: they make it possible to treat several blood diseases, such as leukemia, yet collection lacks resources everywhere in the world. Whether intended for use within one's own family or as a donation to the community, these cells are often discarded for lack of money to collect them.
The objective of the project is to detect stress-induced emotions through the micro-expressions revealed by the human face, with the goal of reducing stress in everyday life. In addition, this study develops a software application able to react to the user's affective state and to make intelligent decisions based on non-verbal cues.
Privacy is a widely publicized topic. Personal data are scattered everywhere, often in the possession of big companies like Google and Facebook. Messaging applications are particularly affected, as we communicate daily with our loved ones through them.
The main objective is to develop a tool that allows a user to extract the important data from their conversations, such as the locations mentioned, the list of people with whom they discuss the most, emotions, personality, etc.
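One of the listed extractions, ranking the people the user exchanges the most messages with, can be sketched with a simple frequency count. The message structure below is hypothetical; real chat exports (WhatsApp, Messenger, etc.) each have their own format that the tool would need to parse first.

```python
from collections import Counter

# Hypothetical message structure for illustration; actual export
# formats differ per messaging application.
messages = [
    {"sender": "Alice", "text": "Meet me in Fribourg tomorrow?"},
    {"sender": "Bob",   "text": "I am so happy about the trip!"},
    {"sender": "Alice", "text": "Bring the photos from Lausanne."},
]

def top_contacts(msgs, k=2):
    """Rank senders by message count, most frequent first."""
    return Counter(m["sender"] for m in msgs).most_common(k)

top_contacts(messages)
# → [("Alice", 2), ("Bob", 1)]
```

Locations, emotions and personality traits would require NLP models on the `text` field rather than a simple count, but the same per-conversation aggregation pattern applies.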
This project, in collaboration with Mr Hafis Bertschinger, merges this artist's work with modern technology through augmented reality.
The purpose of this project is to enhance the interaction between users and the HumanTech website by integrating a chatbot able to answer questions about the institute.
MeccanoLEGO is a VR workbench simulation program. Its main goal is to make tools such as laser cutters accessible and easy to use, thanks to a virtual space in which users can create their own three-dimensional models and generate flat cutouts that fold into the desired form.
Asbestos is a highly toxic silicate mineral that has been used widely in many products for its insulating, non-flammable and heat-resistant properties. Exposure to high concentrations of asbestos may lead to chronic inflammation of the lungs and cancer. Since its toxicity became known, much effort has been undertaken, and continues to this day, to remove asbestos from buildings, roofs and other materials used in industry and in public spaces. Asbestos detection is a manual, complex and time-intensive process that requires an experienced expert to produce consistent and correct results. In an attempt to reduce manual labor and increase consistency in detection, machine learning models have recently been adopted to automate the detection process.