Papers by Luis Villalpando
A performance measurement model for cloud computing applications
Cloud computing is a technology aimed at processing and storing very large amounts of data, which is also known as Big Data. One of the areas that has contributed to the analysis of Big Data is Data Science; this new study area is called Big Data Science (BDS). One of the challenges in implementing BDS in organizations is the current lack of information that helps to understand BDS. Thus, this chapter presents a framework to implement Big Data Science in organizations, which describes the requirements and processes necessary for such an implementation.

Cloud Computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources. Cloud Computing users prefer not to own physical infrastructure, but instead rent Cloud infrastructure, a Cloud platform, or software from a third-party provider. Sometimes, anomalies and defects affect a part of the Cloud platform, resulting in degradation of Cloud performance. One of the challenges in identifying the source of such degradation is determining the type of relationship that exists between the various performance metrics which affect the quality of the Cloud and, more specifically, of Cloud applications. This work uses the Taguchi method for the design of experiments to propose a methodology for identifying the relationships between the various configuration parameters that affect the quality of Cloud Computing performance in Hadoop environments. This paper is based on a proposed performance measurement framework for Cloud Computing systems, which integrates software quality concepts from ISO 25010 and other international standards.
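The Taguchi design-of-experiments step described above can be sketched as follows. This is a minimal illustration, not the paper's actual experiment set: the Hadoop parameter names, the two levels per factor, and the throughput measurements are assumptions chosen for the example, paired with a standard L4 orthogonal array and the "larger is better" signal-to-noise ratio.

```python
import math

# Hypothetical Hadoop configuration factors, each at two levels
# (illustrative names and values, not the paper's actual factors).
factors = {
    "dfs.block.size":  ["64MB", "128MB"],
    "dfs.replication": [2, 3],
    "map.tasks.max":   [4, 8],
}

# Taguchi L4 orthogonal array: 4 runs cover 3 two-level factors.
# Each tuple gives the level index (0 or 1) of each factor for one run.
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

def sn_larger_is_better(measurements):
    """Signal-to-noise ratio for a 'larger is better' response such as
    throughput: SN = -10 * log10(mean(1 / y^2))."""
    n = len(measurements)
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in measurements) / n)

# Illustrative throughput measurements (MB/s), three replicates per run.
observed = [
    [40.1, 39.8, 40.5],
    [55.2, 54.7, 55.9],
    [47.3, 46.8, 47.0],
    [60.4, 61.0, 60.1],
]

names = list(factors)
for run, levels in enumerate(L4):
    config = {names[i]: factors[names[i]][lv] for i, lv in enumerate(levels)}
    print(f"run {run + 1}: {config} -> SN = {sn_larger_is_better(observed[run]):.2f} dB")
```

Comparing the average S/N ratio at each level of a factor then indicates which configuration parameters most influence performance, which is the core of the Taguchi analysis.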

Abstract. - This work proposes the use of web services based on semantic schemas, to be exploited by fixed and mobile clients, as a solution to the problem of adapting existing systems so that they cooperate with one another and provide knowledge-exchange solutions between services, while at the same time offering more relevant information to users within Mexican universities. We present the use of the Semantic Web in the creation of university web services, such as printing or library services, as a way of exchanging information more intelligently. We then show how to organize the knowledge of these services through ontological schemas such as OWL, in order to perform inferences and make better use of them. Finally, we propose a structure that supports the interaction between the service providers, their administrator...

2014 Joint Conference of the International Workshop on Software Measurement and the International Conference on Software Process and Product Measurement, 2014
Measuring the performance of cloud computing-based applications using ISO quality characteristics is a complex activity for various reasons, among them the complexity of the typical cloud computing infrastructure on which an application operates. To address this issue, the authors use Bautista's proposed performance measurement framework [1] on log data from an actual data center to map and statistically analyze one of the ISO quality characteristics: time behavior. This empirical case study was conducted on an industry private cloud. The results of the study demonstrate that it is possible to use the proposed performance measurement framework in a cloud computing context. They also show that the framework holds great promise for expanding the experimentation to other ISO quality characteristics, larger volumes of data, and other statistical techniques that could be used to analyze performance.
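The mapping from data-center log data to the ISO time-behavior characteristic could look like the minimal sketch below. The record fields (job id, submit and finish timestamps) and the derived measures are assumptions for illustration, not the study's actual log schema or metrics.

```python
from statistics import mean, pstdev

# Hypothetical job-log records: (job_id, submit_ts, finish_ts) in seconds.
log_records = [
    ("job-01", 0.0, 12.4),
    ("job-02", 3.0, 20.1),
    ("job-03", 5.5, 14.9),
    ("job-04", 7.2, 31.0),
]

def time_behaviour(records):
    """Derive simple time-behaviour measures (in the sense of ISO 25010)
    from job logs: per-job turnaround time, its mean, dispersion, and
    worst case."""
    turnaround = [finish - submit for _, submit, finish in records]
    return {
        "turnaround_s": turnaround,
        "mean_s": mean(turnaround),
        "stdev_s": pstdev(turnaround),
        "max_s": max(turnaround),
    }

summary = time_behaviour(log_records)
print(f"mean turnaround: {summary['mean_s']:.2f} s, worst case: {summary['max_s']:.1f} s")
```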
Computer Communications and Networks, 2013
Cloud Computing is an emerging technology for processing and storing very large amounts of data. Sometimes, anomalies and defects affect a part of the Cloud infrastructure, resulting in degradation of Cloud performance. This work uses the Taguchi method for the design of experiments to present a methodology for identifying the relationships between the various configuration parameters that affect the quality of Cloud Computing application performance. This chapter is based on a proposed performance measurement framework for Cloud Computing systems, which integrates software quality concepts from ISO 25010 and other international standards.

Journal of Cloud Computing, 2014
The foundation of Cloud Computing is sharing computing resources that are dynamically allocated and released on demand with minimal management effort. Most of the time, computing resources such as processors, memory, and storage are allocated through commodity hardware virtualization, which distinguishes Cloud Computing from other technologies. One of the objectives of this technology is processing and storing very large amounts of data, which are also referred to as Big Data. Sometimes, anomalies and defects found in Cloud platforms affect the performance of Big Data Applications, resulting in degradation of Cloud performance. One of the challenges in Big Data is how to analyze the performance of Big Data Applications in order to determine the main factors that affect their quality. The performance analysis results are very important because they help to detect the source of the degradation of the applications as well as of the Cloud. Furthermore, such results can be used in future resource planning stages, when designing Service Level Agreements, or simply to improve the applications. This paper proposes a performance analysis model for Big Data Applications, which integrates software quality concepts from ISO 25010. The main goal of this work is to fill the gap that exists between the quantitative (numerical) representation of software engineering quality concepts and the measurement of the performance of Big Data Applications. To this end, we propose the use of statistical methods to establish relationships between performance measures extracted from Big Data Applications and Cloud Computing platforms and the software engineering quality concepts.
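A statistical relationship of the kind described, for example between HDFS bytes read and memory utilization, can be established with a plain correlation coefficient. The paired samples below are illustrative values, not the paper's data.

```python
from math import sqrt

# Illustrative paired samples of two Cloud performance measures,
# one pair per experiment run (hypothetical values).
hd_bytes_read = [1.2e9, 2.5e9, 3.1e9, 4.8e9, 5.0e9]
mem_util_pct  = [35.0, 48.0, 55.0, 71.0, 74.0]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(hd_bytes_read, mem_util_pct)
print(f"r = {r:.3f}")  # a value close to 1.0 indicates a strong linear relationship
```

In practice one would also test the significance of r and consider regression models, but the coefficient alone already quantifies how strongly the two measures move together.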
Computer Communications and Networks, 2014
Cloud computing is a technology aimed at processing and storing very large amounts of data, which is also known as Big Data. One of the areas that has contributed to the analysis of Big Data is Data Science; this new study area is called Big Data Science (BDS). One of the challenges in implementing BDS in organizations is the current lack of information that helps to understand BDS. Thus, this chapter presents a framework to implement Big Data Science in organizations, which describes the requirements and processes necessary for such an implementation.
Journal of Software Engineering and Applications, 2012
Cloud Computing is an emerging technology for processing and storing very large amounts of data. Sometimes, anomalies and defects affect part of the cloud infrastructure, resulting in a performance degradation of the cloud. This paper proposes a performance measurement framework for Cloud Computing systems, which integrates software quality concepts from ISO 25010.

research.edm.uhasselt.be
One of the most relevant subjects for the intellectual formation of elementary and high-school students is Mathematics, whose importance goes back to ancient civilizations and yet is underestimated nowadays. This situation occurs in Mexico, where 69% of elementary school students between the third and sixth grade have an insufficient level of mathematics knowledge, which has created the need for new mechanisms to complement classroom learning. This work proposes the use of an Educational Videogame: the first part of the proposal is a mobile suite of videogames for teaching mathematics, and the second is a recommender system that allows students to reach contents according to their needs, achieved through a core engine that infers from an initial profile covering the requirements of each user.
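The profile-driven recommendation step might look like the minimal sketch below. The profile fields, content topics, and scoring rule are assumptions made for illustration; the paper's actual core engine is not specified here.

```python
# Minimal sketch of a recommender core: score math contents against an
# initial student profile and return the best matches.
# (Hypothetical topics and scoring rule, for illustration only.)

contents = [
    {"title": "Fractions basics", "topic": "fractions", "difficulty": 1},
    {"title": "Fraction word problems", "topic": "fractions", "difficulty": 2},
    {"title": "Intro to geometry", "topic": "geometry", "difficulty": 1},
]

def recommend(profile, items, k=2):
    """Rank items: prefer the student's weak topic, then the difficulty
    closest to the student's current level."""
    def score(item):
        topic_match = 1 if item["topic"] == profile["weak_topic"] else 0
        closeness = -abs(item["difficulty"] - profile["level"])
        return (topic_match, closeness)
    return sorted(items, key=score, reverse=True)[:k]

profile = {"weak_topic": "fractions", "level": 1}
for item in recommend(profile, contents):
    print(item["title"])
```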
Hadoop is a set of utilities and frameworks for the development and storage of distributed applications in cloud computing, the core component of which is the Hadoop Distributed File System (HDFS). NameNode is a key element of its architecture, and also its "single point of failure". To address this issue, we propose a replication mechanism that will protect the NameNode data in case of failure. The proposed solution involves two distinct components: the creation of a BackupNode cluster that will use a leader election function to replace the NameNode, and a mechanism to replicate and synchronize the file system namespace that is used as a recovery point.
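The leader election component described above can be sketched in a few lines. This is a toy bully-style model, an assumption for illustration, not the paper's actual protocol: the live BackupNode with the highest id is promoted, and the namespace replication and synchronization mechanism is omitted.

```python
# Minimal sketch of a leader election among BackupNode replicas:
# the live node with the highest id is promoted to replace the NameNode.
# (Hypothetical model; the paper's mechanism also replicates and
# synchronizes the file system namespace as a recovery point.)

class BackupNode:
    def __init__(self, node_id, alive=True):
        self.node_id = node_id
        self.alive = alive

def elect_leader(cluster):
    """Bully-style election: pick the highest-id node that is still alive."""
    candidates = [n for n in cluster if n.alive]
    if not candidates:
        raise RuntimeError("no live BackupNode available to replace the NameNode")
    return max(candidates, key=lambda n: n.node_id)

cluster = [BackupNode(1), BackupNode(2), BackupNode(3)]
cluster[2].alive = False          # node 3 fails along with the NameNode
leader = elect_leader(cluster)
print(f"node {leader.node_id} promoted")
```

The highest-id rule is only one possible tie-breaker; a production system would also need failure detection (e.g. via heartbeats) before triggering the election.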
Proceedings of the 2nd ACM SIGCHI symposium on Engineering interactive computing systems - EICS '10, 2010
In this position paper, I outline the research methods and approach our group at the Social Game Lab takes to understanding and innovating successful social and emotional game and virtual world experiences.