Theory of Computation - Manuel Mazzara


Fall 2014 (material to appear here; now available on Dropbox for IU students)




This course will investigate the fundamentals behind the functioning of compilers. Although the act of compilation appears deceptively simple to most modern developers, great minds and great results lie behind the major achievements that made it possible. It all starts with the Epimenides paradox (about 600 BC), which highlights a problem of self-reference in logic, and brings us to the short window between WWI and WWII when, in 1936, Alan Turing proved that a general procedure to decide algorithm termination simply does not exist. Another major milestone was reached by Noam Chomsky in 1956 with his description of a hierarchy of grammars. Into this long historical timeframe we can place most of the bricks with which we build modern compilers. The course will be a historical tour through the lives of some of the greatest minds who ever lived on this planet.
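Turing's result can be made concrete with a short sketch. The snippet below is an illustration, not part of the course material: `halts` is only a crude, fuel-limited approximation (Turing's theorem says no exact, always-terminating version can exist), while `paradox` shows the diagonal construction used in the proof.

```python
import sys

def halts(f, arg, fuel=10_000):
    """Crude approximation of a halting checker: run f(arg) under a
    step-counting tracer and give up after 'fuel' line events.
    An exact, always-correct version cannot exist (Turing, 1936)."""
    class OutOfFuel(Exception):
        pass
    count = 0
    def tracer(frame, event, _):
        nonlocal count
        count += 1
        if count > fuel:
            raise OutOfFuel      # propagates into the traced code
        return tracer
    sys.settrace(tracer)
    try:
        f(arg)
        return True              # f(arg) finished within the budget
    except OutOfFuel:
        return False             # budget exhausted: guess "does not halt"
    finally:
        sys.settrace(None)

def terminates(n):
    while n > 0:
        n -= 1

def loops_forever(_):
    while True:
        pass

def paradox(f):
    """Turing's diagonal argument: if 'halts' were exact rather than an
    approximation, paradox(paradox) could neither halt nor run forever,
    echoing the self-reference of the Epimenides paradox."""
    if halts(f, f):
        while True:              # f halts on itself, so do the opposite
            pass
    # otherwise, halt immediately
```

The fuel parameter is exactly what makes this version possible: drop the bound and no implementation of `halts` can be correct on all inputs.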




A good software developer who is ignorant of the mechanics of a compiler is no better than a good pilot asked to fix the engine, and will definitely not be able to provide more than average solutions to the problems he is employed to solve. As automotive engineering teaches us, races can only be won by the right synergy of good driving and good mechanics. Most importantly, the limits of computation cannot be ignored, in the same way we know precisely how acceleration, forces and friction prevent us from racing at unlimited speed.


Students will learn:


1. How compilers work behind the scenes

2. Some history of computing, its theory, and its major personalities

3. The limits of computation, i.e. what computers cannot do

4. What the tractability of a problem means


Assessment mechanism


Students will be assigned papers or chapters from the J. Hromkovic book and will be asked to present a seminar. This will account for 1/3 of the evaluation. There will then be a mid-term exam and a final exam accounting for the remaining 2/3 (exam material attached separately).


Reference Material:


- J.E. Hopcroft and J.D. Ullman. Introduction to Automata Theory, Languages, and Computation. Addison-Wesley (1979).

- M. Davis, R. Sigal and E.J. Weyuker. Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science. 2nd ed., Academic Press (1994).

- J. Hromkovic. Algorithmic Adventures: From Knowledge to Magic. Springer (2009).

- Lecture slides will be provided

Distributed and Outsourced Software Engineering - Manuel Mazzara


Fall 2014 (material to appear here; now available on Dropbox for IU students)




This course explores the offshoring phenomenon from a technical software engineering perspective, providing a set of guidelines for making outsourced projects succeed, through both management approaches (in particular CMMI) and technical solutions in the areas of requirements, specification, design, documentation and quality control. The presentation is based on experience of outsourcing at ABB and other companies. The participants will take part in a case study exploring techniques for making an offshored project succeed (or recover from problems). This course provides students with a clear view of the offshore software development phenomenon, enabling them to participate successfully in projects outsourced partially or totally, and also helping them define their own career strategies in the context of outsourcing's continued growth.




Good software engineers play a role of crucial importance in the software industry. However, many of them do not have a proper background to be effective in a team, especially when the team is geographically distributed.


Students will learn how:


1. to be effective requirements engineers

2. to design efficient object oriented systems

3. to develop quality object oriented code

4. to work in a geographically distributed team

5. to understand the need for and the method of outsourcing

6. to grasp the basics of agile development


Assessment mechanism


Assessment is entirely based on the project, to be submitted in coordination with the other participating universities before Christmas 2014. The project will be evaluated according to:

- Correct implementation of requirements

- Architectural Design

- Functionality

- Usability

- Performance

- Effectiveness of teamwork (remote and local)


Reference Material:


- Bertrand Meyer. Object-Oriented Software Construction (2nd edition). Prentice Hall (1997);

- Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides. Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley (1995);

- Lecture slides will be provided

Dependability and Performance Evaluation of Computer Systems - Salvatore Distefano


Fall 2014 (material to appear here; now available on Dropbox for IU students)




The course covers the basics of current data center architectures, ranging from the analysis of single components to the global infrastructure.




The goal is to analyze a modern enterprise data center, focusing on the technologies and the main components, such as computing, memory, storage, and network systems.


Students will acquire:


1. Capacity in planning an IT infrastructure

2. Capability in performance evaluation



Assessment mechanism


Students will be given assignments, which will account for 30% of the evaluation. There will then be a mid-term exam accounting for 30% and a final exam accounting for 40% of the evaluation.




Reference Material:


- Kishor S. Trivedi. Probability and Statistics with Reliability, Queuing, and Computer Science Applications. John Wiley and Sons, New York (2001). ISBN 0-471-33341-7.

- Ananth Grama, George Karypis, Vipin Kumar, Anshul Gupta. Introduction to Parallel Computing, 2nd ed. Addison-Wesley (2003). ISBN-13: 978-0201648652.

- Edward D. Lazowska, John Zahorjan, G. Scott Graham, Kenneth C. Sevcik. Quantitative System Performance: Computer System Analysis Using Queueing Network Models (http://homes.cs.washington.edu/~lazowska/qsp/).


Component-based Software Engineering - Manuel Mazzara


Fall 2015




In this course, we will understand how CBSE incorporates concepts from OOP and how UML can be used for semiformal specification and design. The course will also present the most established technologies, from simple RPC and RMI to more complex CORBA middleware, .NET and Web Services. We will also discuss Service-Oriented Computing and Cloud Computing. The primary focus of this course is exposing students to the fundamental concepts of component-based software and of the several existing component models, rather than training them to use a specific technology.




Component-based development is a crucial skill for software engineers aiming to find employment in the software industry, whether at the national or international level. This course aims to provide the fundamentals for students to learn new technologies and to be able to evaluate them when it comes to technological design choices.


Students will understand:


1. the importance of component-based development and software services

2. the differences between component-based software development and the traditional one

3. the foundations of component-based software

4. the current tendencies in software architectures

5. the functioning of specific commercial frameworks for component-based software development


Assessment mechanism


Students will be given 3 assignments (requirements, design, coding) and will be asked to report during lab sessions. This will account for 1/3 of the evaluation. There will then be a mid-term exam and a final exam accounting for the remaining 2/3 (exam material attached separately).


Reference Material:


- C. Szyperski. Component Software: Beyond Object-Oriented Programming. 2nd ed. Addison-Wesley Professional (2002).

- G.T. Heineman and W.T. Councill. Component-Based Software Engineering: Putting the Pieces Together. Addison-Wesley Professional (2001).

- Lecture slides will be provided