Our proposal is based on defining generic templates for the description of Use Cases and, from them, defining templates for the Analysis and Design stages.
In their day-to-day work, software engineers face problems that belong to different application contexts but exhibit similar behavior. These are situations in which the engineer must strive to propose generic solutions that can be instantiated to solve similar specific problems.
Design patterns constitute one of the most influential innovations in object-oriented development, and their use is increasingly required of software engineers. For this reason, the use of design patterns is considered a basic skill that Computer Science students should acquire. In this work, a framework has been developed for the final projects of computing degree programs, applying different design patterns (Mediator, Data Transfer Object, Registry) in the layers of the system. This framework allows working with distributed objects, applying concepts such as persistence, remote procedure call (RPC) and client-server, and its construction is the result of the evolution of the generic template for the description of use cases.
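As a rough illustration of two of the patterns named above, the Registry and Data Transfer Object patterns can be sketched as follows. This is not the framework's actual code (which is not shown here); all class and method names are invented for the example.

```python
from dataclasses import dataclass

# Data Transfer Object: a plain container that carries data between layers,
# e.g. across an RPC boundary, without exposing domain logic.
@dataclass
class StudentDTO:
    name: str
    project_title: str

# Registry: a well-known object through which services are located,
# decoupling client layers from concrete service instances.
class Registry:
    _services = {}

    @classmethod
    def register(cls, name, service):
        cls._services[name] = service

    @classmethod
    def lookup(cls, name):
        return cls._services[name]

class ProjectService:
    def find(self, student_name):
        # In a layered framework this would delegate to a persistence
        # or remote layer; here it just builds a DTO.
        return StudentDTO(name=student_name, project_title="Final project")

Registry.register("projects", ProjectService())
dto = Registry.lookup("projects").find("Ana")
print(dto.project_title)  # → Final project
```

The point of the combination is that upper layers depend only on the registry key and the DTO shape, so the service implementation behind it can be swapped (local, remote, persistent) without touching client code.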
Trace slicing is a widely used technique for execution trace analysis that is effectively used in program debugging, analysis and comprehension. In this paper, we present a backward trace slicing technique that can be used for the analysis of Rewriting Logic theories. Our trace slicing technique allows us to systematically trace back rewrite sequences modulo equational axioms (such as associativity and commutativity) by means of an algorithm that dynamically simplifies the traces by detecting control and data dependencies, and dropping useless data that do not influence the final result. Our methodology is particularly suitable for analyzing complex, textually-large system computations such as those delivered as counter-example traces by Maude model-checkers.
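The core backward-propagation idea can be sketched in miniature. This is a heavy simplification with respect to the paper (no term structure, no equational axioms): each rewrite step is modeled as an invented dependency map from output positions to the input positions they were derived from, and slicing pushes a criterion backwards through the trace.

```python
# A minimal sketch of backward trace slicing. Each step of the trace is a
# dict mapping an output position to the set of input positions it depends
# on; positions not produced by a step pass through unchanged.
def backward_slice(trace_deps, criterion):
    """trace_deps: one dependency map per rewrite step, oldest first.
    Returns, per step (oldest first), the positions relevant to `criterion`."""
    relevant = set(criterion)
    slices = []
    for deps in reversed(trace_deps):
        origins = set()
        for pos in relevant:
            # Unmapped positions were untouched by this step.
            origins |= deps.get(pos, {pos})
        slices.append(origins)
        relevant = origins
    slices.reverse()
    return slices

# Example: a 2-step trace where the observed position "c" in the final
# state came from "b", which in turn came from "a".
steps = [{"b": {"a"}}, {"c": {"b"}}]
print(backward_slice(steps, {"c"}))  # → [{'a'}, {'b'}]
```

Everything outside the returned position sets is the "useless data" the technique drops, which is what makes large counter-example traces tractable.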
Universidad Nacional de Río Cuarto, Facultad de Cs. Exactas, Fco-Qcas y Naturales. Tel: (0358) 4676226/235. Fax: (0358) 4676530. marcela@dc.exa.unrc.edu.ar
Fourth IEEE International Conference on Software Engineering and Formal Methods (SEFM'06), 2006
The development and maintenance of Web sites are difficult tasks. To maintain the consistency of ever-larger, complex Web sites, Web administrators need effective mechanisms that assist them in fixing every possible inconsistency. In this paper, we present a novel methodology for semi-automatically repairing faulty Web sites which can be integrated on top of an existing rewriting-based verification technique developed in a previous work. Starting from a categorization of the kinds of errors that can be found during the Web verification activities, we formulate a stepwise transformation procedure that achieves correctness and completeness of the Web site w.r.t. its formal specification while respecting the structure of the document (e.g. the schema of an XML document). Finally, we briefly describe a prototype implementation of the repairing tool, which we used for an experimental evaluation of our method.
Web-TLR is a software tool designed for model-checking Web applications which is based on rewriting logic. Web applications are expressed as rewrite theories which can be formally verified by using the Maude built-in LTLR model-checker. Web-TLR is equipped with a user-friendly, graphical Web interface that shields the user from unnecessary information. Whenever a property is refuted, an interactive slideshow is generated that allows the user to visually reproduce, step by step, the erroneous navigation trace that underlies the failing model-checking computation. This provides deep insight into the system behavior, which helps to debug Web applications.
2nd International Workshop on Automated Specification and Verification of Web Systems (WWV'06), 2006
The development and the maintenance of Web sites are difficult tasks. To maintain the consistency of ever-larger, complex Web sites, Web administrators need effective mechanisms that aid them in fixing every possible inconsistency. In this paper, we present an extension of a methodology for semi-automatically repairing faulty Web sites which we developed in a previous work. As a novel contribution, we define two correction strategies with the aim of increasing the level of automation of our repair method. Specifically, the proposed strategies minimize both the amount of information to be changed and the number of repair actions to be executed in a faulty Web site to make it correct.
2nd International Workshop on Automated Specification and Verification of Web Systems (WWV'06), 2006
In this paper, we present a simple, easy-to-use, rewriting-like methodology for filtering information in an XML document. Essentially, we define a specification language which allows one to extract relevant data (positive filtering) as well as to exclude useless and misleading contents (negative filtering) from a set of XML documents according to some given criteria. We believe that our methodology achieves the right tradeoff between expressive power and simplicity of use, and thus it may also be fruitfully employed by those users who typically prefer to avoid formal languages.
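The positive/negative filtering distinction can be illustrated with a toy example over standard-library XML trees. The paper's specification language is rule-based and far richer; the function, parameters, and tag names below are invented for the sketch.

```python
import xml.etree.ElementTree as ET

def filter_xml(xml_text, keep_tags=(), drop_tags=()):
    """Negative filtering removes subtrees rooted at `drop_tags`;
    positive filtering then keeps only subtrees rooted at `keep_tags`."""
    root = ET.fromstring(xml_text)
    # Negative filtering: prune unwanted subtrees in place.
    # (Materialize the iterator first, since we mutate the tree.)
    for parent in list(root.iter()):
        for child in list(parent):
            if child.tag in drop_tags:
                parent.remove(child)
    if not keep_tags:
        return ET.tostring(root, encoding="unicode")
    # Positive filtering: collect only the requested subtrees
    # under a fresh root with the same tag.
    result = ET.Element(root.tag)
    for elem in root.iter():
        if elem.tag in keep_tags:
            result.append(elem)
    return ET.tostring(result, encoding="unicode")

doc = "<library><book><title>Maude</title><ads>buy!</ads></book></library>"
print(filter_xml(doc, drop_tags=("ads",)))
# → <library><book><title>Maude</title></book></library>
```

Both modes take a document and a criterion (here just tag names) and return a smaller document, which is the essential shape of the filtering methodology described above.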
In this paper, we present a trace slicing technique for rewriting logic that is suitable for analyzing complex, textually-large system computations in rewrite theories that may contain conditional equations and/or rules. Given a conditional execution trace T and a slicing criterion for the trace (i.e., a set of positions that we want to observe in the final state of the trace), we traverse T from back to front, and at each rewrite step, we incrementally compute the origins of the observed positions, which is done by inductively processing the conditions of the applied equations and rules. During the traversal, we also carry a boolean compatibility condition that is needed for the executability of the processed rewrite steps. At the end of the traversal, the trace slice is obtained by filtering out the irrelevant data that do not contribute to the criterion of interest.
In this paper, we present the rewriting-based, Web verification service WebVerdi-M, which is able to recognize forbidden/incorrect patterns and incomplete/missing Web pages. WebVerdi-M relies on a powerful Web verification engine that is written in Maude, which automatically derives the error symptoms. Thanks to the AC pattern matching supported by Maude and its metalevel facilities, WebVerdi-M enjoys much better performance and usability than a previous implementation of the verification framework. By using the XML benchmarking tool xmlgen, we develop some scalable experiments which demonstrate the usefulness of our approach.
The Journal of Logic and Algebraic Programming, 2013
Keeping XML data in a consistent state w.r.t. both structure and content is a burdensome task. To maintain the consistency of ever-larger, complex XML repositories, suitable mechanisms that are able to fix every possible inconsistency are needed. In this article, we present a methodology for semi-automatically repairing faulty XML repositories that can be integrated on top of an existing rewriting-based verification engine. As a formal basis for representing consistency criteria, we use a rule-based description formalism that is realized in the language Maude. Then, starting from a categorization of the kinds of errors that can be found during the verification process, we formulate a stepwise transformation procedure that achieves correctness and completeness of the XML repository w.r.t. its Maude formal specification while strictly observing the structure of the XML documents. With the aim of increasing the level of automation of our repair methodology, we also define two correction strategies and two completion strategies that reduce either the amount of information to be changed or the number of repair actions to be executed in order to deliver an XML repository that is both correct and complete. Finally, we describe a prototype implementation of the repairing tool, which we use for an experimental evaluation of our method with good results.
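The correction/completion distinction can be sketched with two invented consistency rules. This is an idealized fragment of the repair idea only: the actual tool works over Maude specifications, not Python predicates, and the rules, tags, and action names below are made up.

```python
import xml.etree.ElementTree as ET

# Invented rules for the sketch:
#   Completeness: every <book> must contain a <title> child.
#   Correctness:  no <price> may be negative.
def repair(xml_text):
    root = ET.fromstring(xml_text)
    actions = []
    for book in root.iter("book"):
        # Completion strategy: insert the missing required element,
        # adding as little new information as possible.
        if book.find("title") is None:
            ET.SubElement(book, "title").text = "UNKNOWN"
            actions.append("insert title")
        # Correction strategy: change the smallest amount of
        # existing information that makes the rule hold.
        price = book.find("price")
        if price is not None and float(price.text) < 0:
            price.text = "0"
            actions.append("fix price")
    return ET.tostring(root, encoding="unicode"), actions

doc = "<shop><book><price>-5</price></book></shop>"
fixed, actions = repair(doc)
print(actions)  # → ['insert title', 'fix price']
```

Counting and minimizing such repair actions is, in spirit, what the two correction and two completion strategies described above optimize over.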
Electronic Proceedings in Theoretical Computer Science, 2011
WEB-TLR is a Web verification engine that is based on the well-established Rewriting Logic-Maude/LTLR tandem for Web system specification and model-checking. In WEB-TLR, Web applications are expressed as rewrite theories that can be formally verified by using the Maude built-in LTLR model-checker. Whenever a property is refuted, a counterexample trace is delivered that reveals an undesired, erroneous navigation sequence. Unfortunately, the analysis (or even the simple inspection) of such counterexamples may be unfeasible because of the size and complexity of the traces under examination. In this paper, we endow WEB-TLR with a new Web debugging facility that supports the efficient manipulation of counterexample traces. This facility is based on a backward trace-slicing technique for rewriting logic theories that allows the pieces of information that we are interested in to be traced back through inverse rewrite sequences. The slicing process drastically simplifies the computation trace by dropping useless data that do not influence the final result. By using this facility, the Web engineer can focus on the relevant fragments of the failing application, which greatly reduces the manual debugging effort and also decreases the number of iterative verifications.
Papers by Daniel Romero