DECSI - Departamento de Computação e Sistemas de Informação
Browsing DECSI - Departamento de Computação e Sistemas de Informação by Title
Now showing 1 - 20 of 71
Item Ambiguity and constrained polymorphism. (2016)
Figueiredo, Carlos Camarão de; Figueiredo, Lucília Camarão de; Ribeiro, Rodrigo Geraldo

This paper considers the problem of ambiguity in Haskell-like languages. Overloading resolution is characterized in the context of constrained polymorphism by the presence of unreachable variables in constraints on the type of the expression. A new definition of ambiguity is presented, where the existence of more than one instance for the constraints on an expression type is considered only after overloading resolution. This introduces a clear distinction between ambiguity and overloading resolution, makes ambiguity more intuitive and independent of extra concepts, such as functional dependencies, and enables more programs to type-check as fewer ambiguities arise. The paper presents a type system and a type inference algorithm that includes: a constraint-set satisfiability function, which determines whether or not a given set of constraints is entailed in a given context, focusing on issues related to decidability; a constraint-set improvement function, for filtering out constraints for which overloading has been resolved; and a context-reduction function, for reducing constraint sets according to matching instances. A standard dictionary-style semantics for core Haskell is also presented.

Item Ambiguity and context-dependent overloading. (2013)
Ribeiro, Rodrigo Geraldo; Figueiredo, Carlos Camarão de

This paper discusses ambiguity in the context of languages that support context-dependent overloading, such as Haskell. A type system for a Haskell-like programming language that supports context-dependent overloading and follows the Hindley-Milner approach of providing context-free type instantiation allows distinct derivations of the same type for ambiguous expressions. Such expressions are usually rejected by the type inference algorithm, which is thus not complete with respect to the type system. Also, Haskell's open-world approach adopts a definition of ambiguity that does not conform to the existence of two or more distinct type system derivations for the same type. The article presents an alternative approach, in which the standard definition of ambiguity is followed. A type system is presented that allows only context-dependent type instantiation, enabling only one type to be derivable for each expression in a given typing context: the type of an expression can be instantiated only if required by the program context where the expression occurs. We define a notion of greatest instance type for each occurrence of an expression, which is used in the definition of a standard dictionary-passing semantics for core Haskell based on type system derivations, for which coherence is trivial. Type soundness is obtained as a result of disallowing all ambiguous expressions and all expressions involving unsatisfiability in the use of overloaded names. Following the standard definition of ambiguity, satisfiability is tested (i.e., "the world is closed") if and only if overloading is (or should have been) resolved, that is, if and only if there exist unreachable variables in the constraints on the types of expressions. Nowadays, satisfiability is tested in Haskell, in the presence of multiparameter type classes, only upon the presence of functional dependencies or an alternative mechanism that specifies conditions for closing the world, and that may happen whether or not there exist unreachable type variables in constraints. The satisfiability trigger condition is then given automatically, by the existence of unreachable variables in constraints, and does not need to be specified by programmers through an extra mechanism.
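A standard illustration of the unreachable-variable criterion used in both items above (not drawn from the abstracts themselves): in Haskell, an expression such as show (read s) has type String under the constraints (Show a, Read a), where the type variable a does not occur in the type itself; since a is unreachable, no program context can instantiate it, and the expression is ambiguous.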
Item An advanced pruning method in the architecture of extreme learning machines using L1-regularization and bootstrapping. (2020)
Souza, Paulo Vitor de Campos; Torres, Luiz Carlos Bambirra; Silva, Gustavo Rodrigues Lacerda; Braga, Antônio de Pádua; Lughofer, Edwin

Extreme learning machines (ELMs) are efficient for classification, regression, and time series prediction, and offer a clear alternative to backpropagation structures for determining the values in intermediate layers of the learning model. One problem an ELM may face is a large number of neurons in the hidden layer, which makes the model overly specialized to a specific data set. With a large number of neurons in the hidden layer, overfitting is more likely, and the resulting unnecessary information can deteriorate the performance of the neural network. To solve this problem, a pruning method is proposed, called Pruning ELM Using Bootstrapped Lasso (BR-ELM), which is based on regularization and resampling techniques to select the most representative neurons for the model response. The method relies on an ensembled variant of Lasso (achieved through bootstrap replications) and aims to shrink the output weight parameters of as many neurons as possible to 0. From the subset of candidate regressors having significant coefficient values (greater than 0), it is possible to select the best neurons in the hidden layer of the ELM. Finally, pattern classification tests and benchmark regression tests on complex real-world problems are performed, comparing the proposed approach to other pruning models for ELMs. Statistically, BR-ELM outperforms several related state-of-the-art methods in terms of classification accuracy and model error (while performing on par with Pruning-ELM, P-ELM), and it does so with a significantly reduced number of finally selected neurons.
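A minimal sketch of the bootstrapped-Lasso selection idea described above, assuming numpy and scikit-learn are available; it illustrates the technique, not the authors' implementation, and all sizes, the alpha value, and the vote threshold are assumptions:

    # Prune ELM hidden neurons whose output weights a bootstrapped Lasso
    # repeatedly shrinks to zero.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                      # toy inputs
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)   # toy regression target

    n_hidden = 100
    W, b = rng.normal(size=(5, n_hidden)), rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                             # random ELM hidden layer

    n_boot = 30
    votes = np.zeros(n_hidden)
    for _ in range(n_boot):
        idx = rng.integers(0, len(X), len(X))          # one bootstrap replication
        coef = Lasso(alpha=0.01, max_iter=5000).fit(H[idx], y[idx]).coef_
        votes += coef != 0                             # neuron survived shrinkage

    keep = votes >= 0.5 * n_boot                       # kept in half the replicates
    beta = np.linalg.pinv(H[:, keep]) @ y              # output weights, pruned layer
    print(f"kept {keep.sum()} of {n_hidden} neurons")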
Item An efficient similarity-based approach for comparing XML documents. (2018)
Oliveira, Alessandreia Marta de; Tessarolli, Gabriel Piton; Menezes, Gleiph Ghiotto Lima de; Pinto, Bruno; Campello, Fernando; Marques, Matheus; Oliveira, Carlos; Rodrigues, Igor; Kalinowski, Marcos; Souza, Uéverton dos Santos; Murta, Leonardo Gresta Paulino; Murta, Vanessa Braganholo

XML documents are widely used to interchange information among heterogeneous systems, ranging from office applications to scientific experiments. Independently of the domain, XML documents may evolve, so identifying and understanding the changes they undergo becomes crucial. Some syntactic diff approaches have been proposed to address this problem. They are mainly designed to compare revisions of XML documents using explicit IDs to match elements. However, elements in different revisions may not share IDs due to tool incompatibility or even divergent or missing schemas. In this paper, we present Phoenix, a similarity-based approach for comparing revisions of XML documents that does not rely on explicit IDs. Phoenix uses dynamic programming and optimization algorithms to compare different features (e.g., element name, content, attributes, and sub-elements) of XML documents and calculate the similarity degree between them. We compared Phoenix with X-Diff and XyDiff, two state-of-the-art XML diff algorithms. XyDiff was the fastest approach but failed to provide precise matching results. X-Diff presented higher efficacy in 30 of the 56 scenarios but was slow. Phoenix executed in a fraction of the running time required by X-Diff and achieved the best results in terms of efficacy in 26 of 56 tested scenarios. In our evaluations, Phoenix was by far the most efficient approach to match elements across revisions of the same XML document.
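A minimal sketch of similarity-based element matching in the spirit of Phoenix, using only the Python standard library; the features compared and the weights 0.4/0.3/0.3 are illustrative assumptions, not Phoenix's actual scoring:

    # Similarity degree between two XML elements from name, content,
    # and attribute overlap (sub-elements omitted for brevity).
    import xml.etree.ElementTree as ET

    def similarity(a, b):
        name = 1.0 if a.tag == b.tag else 0.0
        text = 1.0 if (a.text or "").strip() == (b.text or "").strip() else 0.0
        shared = set(a.attrib.items()) & set(b.attrib.items())
        union = set(a.attrib.items()) | set(b.attrib.items())
        attr = len(shared) / len(union) if union else 1.0
        return 0.4 * name + 0.3 * text + 0.3 * attr

    old = ET.fromstring('<book id="1"><title>XML Diff</title></book>')
    new = ET.fromstring('<book id="1"><title>XML Diffs</title></book>')
    print(similarity(old, new))                              # 1.0 at the roots
    print(similarity(old.find("title"), new.find("title")))  # 0.7: text differs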
Item An integer programming approach to the multimode resource-constrained multiproject scheduling problem. (2015)
Toffolo, Túlio Ângelo Machado; Santos, Haroldo Gambini; Carvalho, Marco Antonio Moreira de; Araujo, Janniele Aparecida Soares

The project scheduling problem (PSP) is the subject of several studies in computer science, mathematics, and operations research because of the hardness of solving it and its practical importance. This work tackles an extended version of the problem known as the multimode resource-constrained multiproject scheduling problem. A solution to this problem consists of a schedule of jobs from various projects, such that the job allocations do not exceed the stipulated limits of renewable and nonrenewable resources. To accomplish this, a set of execution modes for the jobs must be chosen, as the jobs' duration and amount of needed resources vary depending on the mode selected. Finally, the schedule must also respect precedence constraints between jobs. This work proposes heuristic methods based on integer programming to solve the PSP considered in the Multidisciplinary International Scheduling Conference: Theory and Applications (MISTA) 2013 Challenge. The developed solver was ranked third in the competition, being able to find feasible and competitive solutions for all instances and improving the best known solutions for some problems.

Item An on-the-fly grammar modification mechanism for composing and defining extensible languages. (2015)
Reis, Leonardo Vieira dos Santos; Iorio, Vladimir Oliveira Di; Bigonha, Roberto da Silva

Adaptable Parsing Expression Grammar (APEG) is a formal method for defining the syntax of programming languages. It provides an on-the-fly mechanism to perform modifications of the syntax of the language during parsing time. The primary goal of this dynamic mechanism is the formal specification and the automatic parser generation for extensible languages. In this paper, we show how APEG can be used for the definition of the extensible languages SugarJ and Fortress, clarifying many aspects of the syntax of these languages. We also show that the mechanism for on-the-fly modification of syntax rules can be useful for defining grammars in a modular way, implementing almost all types of language composition in the context of the specification of extensible languages.

Item Análise comparativa de detectores e descritores de características locais em imagens no âmbito do problema de autocalibração de câmeras. (2016)
Brito, Darlan Nunes de; Pádua, Flávio Luis Cardeal; Lopes, Aldo Peres Campos e; Dalip, Daniel Hasan

This work presents a comparative analysis of different state-of-the-art methods for detecting and describing local features in images, with the goal of solving the camera self-calibration problem robustly and efficiently. To achieve this goal, effective detector and descriptor methods are essential, since the robust matching of features across a set of successive images subject to a wide variety of affine distortions and changes in the 3D viewpoint of the scene is crucial for the accuracy of the camera parameter computations. Although several detectors and descriptors have been proposed in the literature, their impact on the camera self-calibration process had not yet been properly studied. In this comparative analysis, the quality criteria for self-calibration are the epipolar, reprojection, and reconstruction errors, as well as the running times of the methods. The experimental results show that binary feature detectors and descriptors (ORB, BRISK, and FREAK) and floating-point ones (SIFT and SURF) present equivalent reprojection and reconstruction errors. Given the lower computational cost of the binary methods, however, their use in solving camera self-calibration problems is strongly recommended.
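A short sketch of the kind of comparison performed in the study above, assuming the opencv-python package and two hypothetical image files (img1.png, img2.png); the floating-point methods and the error metrics of the paper are omitted for brevity:

    # Match binary ORB features between two views, a building block for
    # estimating camera parameters in self-calibration.
    import cv2

    img1 = cv2.imread("img1.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("img2.png", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create()
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)

    # Hamming distance is the appropriate metric for binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)
    print(f"{len(matches)} putative correspondences for self-calibration")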
Item Analysis of user interaction with a brain-computer interface based on steady-state visually evoked potentials: case study of a game. (2018)
Leite, Harlei Miguel de Arruda; Leite, Sarah Negreiros de Carvalho; Costa, Thiago Bulhões da Silva; Attux, Romis Ribeiro de Faissol; Hornung, Heiko Horst; Arantes, Dalton Soares

This paper presents a systematic analysis of a game controlled by a Brain-Computer Interface (BCI) based on Steady-State Visually Evoked Potentials (SSVEP). The objective is to understand BCI systems from the Human-Computer Interaction (HCI) point of view, by observing how users interact with the game and evaluating how the interface elements influence system performance. The interactions of 30 volunteers with our computer game, named "Get Coins," through a BCI based on SSVEP, generated a database of brain signals and the corresponding responses to a questionnaire about various perceptual parameters, such as visual stimulation, acoustic feedback, background music, visual contrast, and visual fatigue. Each volunteer played one match using the keyboard and four matches using the BCI, for comparison. In all matches using the BCI, the volunteers achieved the goals of the game. Eight of them achieved a perfect score in at least one of the four matches, showing the feasibility of direct communication between the brain and the computer. Despite this successful experiment, adaptations and improvements should be implemented to make this innovative technology accessible to the end user.

Item Animação gráfica da marcha humana a partir de dados do Kinect. (2021)
Leite, Edmo de Oliveira; Assis, Gilda Aparecida de; Yared, Glauco Ferreira Gazel

The analysis of human gait from biometric data has applications in areas such as security, bio-inspired robotics, and healthcare. Low-cost motion sensors, such as the Kinect, have enabled the acquisition of biometric gait data in terrestrial environments. However, these devices have limitations that can affect data quality. In this scenario, different signal processing techniques can be applied to reduce noise. The visualization of these data, raw or processed, is often done through charts, which are of limited use to professionals who are not experienced in signal analysis. Visualizing the gait data on a three-dimensional model can therefore improve the decisions of professionals, particularly in healthcare. This work aims to animate human gait on a three-dimensional model from data collected by the Kinect 2.0 sensor. To reduce noise in the data, a pre-processing step with moving-average and Butterworth filters was applied. Videos of the animations were produced from the isometric and lateral views, embedded in an online questionnaire, and evaluated in a field study on the artificiality/naturalness of the animation, using the mean opinion score (MOS) technique. A total of 22 participants, all computing students, answered the online questionnaire. A one-way analysis of variance (ANOVA) showed that the videos from the isometric and lateral views processed with the moving-average filter (window = 15, repetitions = 3), which obtained the highest MOS values, were rated as significantly more natural than the other videos, processed or not.
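A sketch of the pre-processing described above, assuming numpy and scipy; the moving-average window and repetitions follow the abstract, while the sampling rate and cutoff frequency are assumptions:

    # Smooth one joint coordinate with a repeated moving average and,
    # alternatively, a Butterworth low-pass filter.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def moving_average(x, window=15, repetitions=3):
        kernel = np.ones(window) / window
        for _ in range(repetitions):
            x = np.convolve(x, kernel, mode="same")
        return x

    fs = 30.0                              # assumed Kinect 2.0 skeleton rate (Hz)
    t = np.arange(0, 5, 1 / fs)
    rng = np.random.default_rng(1)
    joint_y = np.sin(2 * np.pi * t) + 0.05 * rng.normal(size=t.size)  # noisy track

    smoothed = moving_average(joint_y)             # window=15, repetitions=3
    b, a = butter(4, 6.0 / (fs / 2), btype="low")  # 6 Hz cutoff (assumption)
    filtered = filtfilt(b, a, joint_y)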
Item Assessment of a trap-based Aedes aegypti surveillance program using mathematical modeling. (2018)
Lana, Raquel Martins; Morais, Maíra Moreira; Lima, Tiago França Melo de; Carneiro, Tiago Garcia de Senna; Stolerman, Lucas Martins; Santos, Jefferson Pereira Caldas dos; Cortés, José Joaquín Carvajal; Eiras, Álvaro Eduardo; Codeço, Cláudia Torres

The goal of this study was to assess the goodness-of-fit of theoretical models of the population dynamics of Aedes aegypti to trap data collected by a long-term entomological surveillance program. The carrying capacity K of this vector was estimated at the city and neighborhood levels. Adult mosquito abundance was measured via adults collected weekly by a network of sticky traps (Mosquitraps) from January 2008 to December 2011 in Vitória, Espírito Santo, Brazil. K was the only free parameter estimated by the model. At the city level, the model with temperature as a driver captured the seasonal pattern of mosquito abundance. At the local level, we observed spatial heterogeneity in the estimated carrying capacity between neighborhoods, weakly associated with environmental variables related to poor infrastructure. Model goodness-of-fit was influenced by the number of sticky traps, and the results suggest a minimum of 16 traps at the neighborhood level for surveillance.

Item Automatic integer programming reformulation using variable neighborhood search. (2017)
Brito, Samuel Souza; Santos, Haroldo Gambini

Chvátal-Gomory cuts are well-known cutting planes for Integer Programming problems. As shown in previous works, the inclusion of these cuts allows the integrality gap to be significantly reduced. This work presents a Local Search heuristic approach based on Variable Neighborhood Search to discover violated Chvátal-Gomory inequalities. Since this problem is known to be NP-hard, the approach was designed to generate violated inequalities in restricted amounts of time. Constraints are grouped into several sets according to the number of common variables. These sets are processed in parallel in order to obtain the best multipliers and produce violated cuts. We report preliminary results obtained for the MIPLIB 3.0 and 2003 instance sets, comparing our approach with an integer programming based separation method. Our algorithm was able to separate many violated inequalities, reducing the duality gap. Furthermore, it uses an extended numerical precision implementation, since it is not specifically bound to simplex-based solvers.
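A worked sketch of what a Chvátal-Gomory cut is, assuming numpy: for rows Ax <= b of an integer program with x >= 0 integer and nonnegative multipliers u, the inequality floor(uA)x <= floor(ub) is valid. The multipliers below are picked by hand, whereas the paper searches for violation-producing multipliers with Variable Neighborhood Search:

    # One Chvátal-Gomory cut from a hand-picked multiplier vector.
    import numpy as np

    A = np.array([[2.0, 3.0],
                  [4.0, 1.0]])        # rows of Ax <= b, x >= 0 and integer
    b = np.array([7.0, 9.0])
    u = np.array([0.5, 0.5])          # hand-picked multipliers (VNS would search these)

    cut_lhs = np.floor(u @ A)         # floor(uA) = [3. 2.]
    cut_rhs = np.floor(u @ b)         # floor(ub) = 8.0
    print(cut_lhs, "x <=", cut_rhs)   # valid for every integer-feasible x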
Item Caracterização das publicações e relações entre mídias alternativas polarizadas no Facebook. (2022)
Laurett, Natan Siller; Ribeiro, Filipe Nunes

This work characterizes the publications of, and the relations between, right- and left-wing alternative media pages on Facebook over a period of 4 years. The characterization is carried out by means of engagement statistics, external references in the publications, and community detection based on publication similarities modeled as a graph. For the latter, backbone extraction using the disparity filter method and community detection via the Louvain algorithm were employed. The results show the predominance of right-wing pages in terms of engagement and reveal strong community relations in 2018.
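A compact sketch of the graph pipeline described above, assuming networkx 3.x (which provides both a built-in weighted example graph and Louvain community detection); the significance threshold and the stand-in graph are illustrative assumptions:

    # Disparity-filter backbone extraction, then Louvain communities.
    import networkx as nx

    def disparity_backbone(G, alpha=0.2):
        B = nx.Graph()
        for u in G:
            k = G.degree(u)
            if k < 2:
                continue                       # filter is undefined for degree 1
            s = sum(d["weight"] for _, _, d in G.edges(u, data=True))
            for _, v, d in G.edges(u, data=True):
                p = (1 - d["weight"] / s) ** (k - 1)   # disparity-filter p-value
                if p < alpha:
                    B.add_edge(u, v, weight=d["weight"])
        return B

    G = nx.les_miserables_graph()              # weighted stand-in for the page graph
    backbone = disparity_backbone(G)
    communities = nx.community.louvain_communities(backbone, seed=1)
    print(len(communities), "communities on", backbone.number_of_edges(), "edges")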
Item Classifying unlabeled short texts using a fuzzy declarative approach. (2013)
Romero, Francisco P.; Iranzo, Pascual Julián; Soto, Andrés; Satler, Mateus Ferreira; Casero, Juan Gallardo

Web 2.0 provides user-friendly tools that allow people to create and publish content online. User-generated content often takes the form of short texts (e.g., blog posts, news feeds, snippets, etc.). This has motivated an increasing interest in the analysis of short texts and, specifically, in their categorisation. Text categorisation is the task of classifying documents into a certain number of predefined categories. Traditional text classification techniques are mainly based on word frequency statistical analysis and have proved inadequate for the classification of short texts, where word occurrence is too small. On the other hand, the classic approach to text categorisation is based on a learning process that requires a large number of labeled training texts to achieve accurate performance. However, labeled documents might not be available, whereas unlabeled documents can be easily collected. This paper presents an approach to text categorisation that does not need a pre-classified set of training documents. The proposed method only requires the category names as user input. Each of these categories is defined by means of an ontology of terms modelled by a set of what we call proximity equations. Hence, our method is not based on category occurrence frequency, but depends heavily on the definition of each category and how well the text fits that definition. Therefore, the proposed approach is an appropriate method for short text classification, where the frequency of occurrence of a category is very small or even zero. Another feature of our method is that the classification process relies on the ability of an extension of the standard Prolog language, named Bousi*Prolog, for flexible matching and knowledge representation. This declarative approach provides a text classifier that is quick and easy to build, and a classification process that is easy for the user to understand. The results of experiments showed that the proposed method achieved a reasonably useful performance.

Item A cloud computing price model based on virtual machine performance degradation. (2019)
Leite, Dionisio Machado; Peixoto, Maycon Leone Maciel; Ferreira, Carlos Henrique Gomes; Batista, Bruno Guazzelli; Segura, Danilo Costa Marim; Santana, Marcos José; Santana, Regina Helena Carlucci

This paper examines the interference effects on the performance of virtual machines running high workloads, with the aim of improving how resources are charged in cloud computing. The objective is to produce an acceptable pay-as-you-go model to be used by cloud computing providers. Presently, the price in a pay-as-you-go model is based on the virtual machine time utilised. However, this scheme does not consider the interference caused by virtual machines running concurrently, which may cause performance degradation. In order to obtain a fair charging model, this paper proposes an approach that applies a rebate to the initial price based on the measured virtual machine performance interference. Results showed the benefits of a fair pay-as-you-go model, ensuring that user requirements are effectively met. This novel model contributes to cloud computing with a fair and transparent price composition.
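An illustrative sketch of the kind of interference-aware charge the abstract argues for; the function name, rebate rule, and numbers are assumptions, not the paper's model:

    # Hypothetical pay-as-you-go charge rebated by measured degradation.
    def fair_charge(rate_per_hour, hours, degradation):
        """degradation: fraction of performance lost to co-located VMs (0..1)."""
        return rate_per_hour * hours * (1.0 - degradation)

    print(fair_charge(0.10, 24, 0.15))   # 15% interference -> 15% rebate: 2.04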
Item Combined weightless neural network FPGA architecture for deforestation surveillance and visual navigation of UAVs. (2020)
Torres, Vitor Angelo Maria Ferreira; Jaimes, Brayan Rene Acevedo; Ribeiro, Eduardo da Silva; Braga, Mateus Taulois; Shiguemori, Elcio Hideiti; Velho, Haroldo Fraga de Campos; Torres, Luiz Carlos Bambirra; Braga, Antônio de Pádua

This work presents a combined weightless neural network architecture for deforestation surveillance and visual navigation of Unmanned Aerial Vehicles (UAVs). Binary images, which are required for position estimation and UAV navigation, are provided by the deforestation surveillance circuit. Learned models are evaluated in a real UAV flight over a green countryside area, while deforestation surveillance is assessed with Amazon forest benchmarking image data. The small utilization percentage of Field Programmable Gate Arrays (FPGAs) allows for a higher degree of parallelization and block processing of larger regions of input images.

Item A comparison between cost optimality and return on investment for energy retrofit in buildings - a real options perspective. (2016)
Tadeu, Sérgio Fernando; Alexandre, Rafael Frederico; Tadeu, António J. B.; Antunes, Carlos Henggeler; Simões, Nuno A. V.; Silva, Patrícia Pereira da

European Union (EU) regulations aim to ensure that the energy performance of buildings meets the cost-optimality criteria for energy efficiency measures. The methodological framework proposed in EU Delegated Regulation 244 is addressed to national authorities (not investors); the optimal cost level is calculated to develop regulations applicable at the domestic level. Despite the complexity and the large number of possible combinations of economically viable efficiency measures, the real options for improving energy performance available to decision makers in building retrofit can be established. Our study considers a multi-objective optimization approach to identify the minimum global cost and primary energy needs of 154,000 combinations of energy efficiency measures. The proposed model is solved by the NSGA-II multi-objective evolutionary algorithm. As a result, the cost-optimal levels and a return-on-investment approach are compared for a set of suitable solutions for a reference building. Eighteen combinations of retrofit measures are selected and an analysis of the influence of real options on investments is proposed. We show that a sound methodological approach to determining the advantages of this type of investment should be offered, so that Member States can provide valuable information and ensure that the minimum requirements are profitable for most investors.

Item A cooperative coevolutionary algorithm for the multi-depot vehicle routing problem. (2016)
Oliveira, Fernando Bernardes de; Enayatifar, Rasul; Sadaei, Hossein Javedani; Guimarães, Frederico Gadelha; Potvin, Jean-Yves

The Multi-Depot Vehicle Routing Problem (MDVRP) is an important variant of the classical Vehicle Routing Problem (VRP), in which customers can be served from a number of depots. This paper introduces a cooperative coevolutionary algorithm to minimize the total route cost of the MDVRP. Coevolutionary algorithms are inspired by the simultaneous evolution process involving two or more species. In this approach, the problem is decomposed into smaller subproblems, and individuals from different populations are combined to create a complete solution to the original problem. This paper presents a problem decomposition approach for the MDVRP in which each subproblem becomes a single-depot VRP and evolves independently in its domain space. Customers are distributed among the depots based on their distance from the depots and their distance from their closest neighbor. A population is associated with each depot, where the individuals represent partial solutions to the problem, that is, sets of routes over the customers assigned to the corresponding depot. The fitness of a partial solution depends on its ability to cooperate with partial solutions from other populations to form a complete solution to the MDVRP. As the problem is decomposed and each part evolves separately, this approach is well suited to parallel environments. Therefore, a parallel evolution strategy environment with a variable-length genotype coupled with local search operators is proposed. A large number of experiments have been conducted to assess the performance of this approach. The results suggest that the proposed coevolutionary algorithm in a parallel environment is able to produce high-quality solutions to the MDVRP in low computational time.

Item Data density-based clustering for regularized fuzzy neural networks based on null neurons and robust activation function. (2019)
Souza, Paulo Vitor de Campos; Torres, Luiz Carlos Bambirra; Guimarães, Augusto Júnio; Araújo, Vanessa Souza; Araújo, Vinicius Jonathan Silva; Rezende, Thiago Silva

This paper proposes the use of fuzzification functions based on density-based clustering of the data to granularize the input space. The neurons formed in this layer are built from the density centers obtained from the model's input data. In the second layer, the null neurons aggregate the neurons generated in the first layer and allow the creation of if/then fuzzy rules. Also in the second layer, a regularization function is activated to determine the essential null neurons. The concepts of the extreme learning machine generate the weights used in the third layer, but with a regularizing factor. Finally, the third layer, represented by an artificial neural network, has a single neuron whose activation function uses robust functions to produce the model output. To verify the new training approach for fuzzy neural networks, we performed tests on real and synthetic databases for pattern classification, which led to the conclusion that the density-based approach, the use of regularization factors in the second model layer, and neurons with more robust activation functions yield better results compared to other classifiers that use the concepts of the extreme learning machine.

Item DengueME: a tool for the modeling and simulation of dengue spatiotemporal dynamics. (2016)
Lima, Tiago França Melo de; Lana, Raquel Martins; Carneiro, Tiago Garcia de Senna; Codeço, Cláudia Torres; Machado, Gabriel Souza; Ferreira, Lucas Saraiva; Medeiros, Líliam César de Castro; Davis Junior, Clodoveu Augusto

The prevention and control of dengue are great public health challenges for many countries, particularly since 2015, as other arboviruses have been observed to interact significantly with dengue virus. Different approaches and methodologies have been proposed and discussed by the research community. An important and widely used tool is modeling and simulation, which helps us understand epidemic dynamics and create scenarios to support planning and decision-making processes. With this aim, we proposed and developed DengueME, a collaborative open-source platform to simulate dengue disease and its vector's dynamics. It supports compartmental and individual-based models, implemented over a GIS database, that represent Aedes aegypti population dynamics, human demography, human mobility, urban landscape, and dengue transmission mediated by human and mosquito encounters. A user-friendly graphical interface was developed to facilitate model configuration and data input, and a library of models was developed to support teaching-learning activities. DengueME was applied in case studies and evaluated by specialists. Further improvements will be made in future work to enhance its extensibility and usability.
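A minimal compartmental sketch in the spirit of the two mosquito-modeling items above (not DengueME itself): a logistic Aedes aegypti adult population whose carrying capacity K responds to temperature, integrated with Euler steps; all parameter values are assumptions for illustration:

    # Logistic adult-mosquito dynamics with a temperature-driven carrying
    # capacity K, stepped weekly over one year.
    import math

    def K(temp_c, K_max=1000.0):
        # warmer weeks support more mosquitoes (assumed smooth response)
        return K_max / (1.0 + math.exp(-(temp_c - 22.0) / 2.0))

    M, dt, r = 100.0, 1.0, 0.3                 # adults, step (weeks), growth rate
    for week in range(52):
        temp = 24.0 + 4.0 * math.sin(2 * math.pi * week / 52)  # seasonal forcing
        M += dt * r * M * (1.0 - M / K(temp))
    print(f"adult abundance after one year: {M:.0f}")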