
2012 September Vol. 3, No. 3

Parallel Computing: Thoughts after a Tour of Academic Outreach
The Value Proposition of Distance Education
A Freshman-Level Course in Information Assurance
The Role of Practical Projects in IS Courses
An Alternative View of Science
Dynamic Programming of the Towers


CONTENTS
volume 3, number 3, 2012 September

editor's corner
3 Message
By John Impagliazzo

critical perspectives
4 Parallel Computing: Thoughts Following a Four-year Tour of Academic Outreach
By Michael Wrinn

bits & bytes
10 EduBits: Digest of ACM Educational Activities
By Andrew McGettrick and Yan Timanovsky

featured columns
16 Taking the High Road: Some Surprising Social Impact Scenarios
By C. Dianne Martin
18 Reflections: Data Science Overtakes Computer Science?
By Deepak Kumar
20 IS Education: Essential Role of Practical Projects in Information Systems Courses
By Heikki Topi
22 Classroom Vignettes: Course Planning: The Day-by-Day Course Schedule
By Henry M. Walker
24 Computing Education Research: A Variation on Kvale's One Thousand Page Question
By Raymond Lister
26 Distance Education: What's the Value Proposition of Distance Education?
By Marian Petre and Mary Shaw
29 Percolations: An Alternate View of Science
By Lisa Kaczmarczyk
31 Community College Corner: Communicating and Collaborating with Colleagues Online
By Elizabeth K. Hawthorne
32 Math CountS: Alan Turing, Mathematician/Computer Scientist?
By Peter B. Henderson
34 Colorful Challenges: Rectangle Cover
By David Ginat

standard articles
36 Demonstrating Random and Parallel Algorithms with Spin
By Mordechai (Moti) Ben-Ari and Fatima Kaloti-Hallak
40 Dynamic Programming of the Towers
By Timothy Rolfe
46 A Day One Computing for the Social Good Activity
By Michael Goldweber

comprehensive articles
50 A Freshman-Level Course on Information Assurance: Can It Be Done? Here's How
By Robin Gandhi, Connie Jones, and William Mahoney
62 CS1 with Games and an Emphasis on TDD and Unit Testing: Piling a Trend Upon a Trend
By Ville Isomöttönen and Vesa Lappalainen

sigcse spotlight
71 Custom textbooks from publishers; Renée McCauley and SIGCSE elections; Study time and CS students
By Curt White

Cover photograph and this page: www.iStockphoto.com/nadla

2012 September Vol. 3 No. 3 acm Inroads 1


ACM Inroads
A Quarterly Magazine of ACM

Editor-in-Chief
John Impagliazzo, Emeritus, Hofstra University, Hempstead, New York 11549 USA

Associate Editors
Alison Clear; Tony Clear; Judith Gal-Ezer; Lisa Kaczmarczyk; Deepak Kumar; Henry Walker

ACM Publications Board
Co-Chairs: Ronald F. Boisvert and Jack Davidson
Board Members: Marie-Paule Cani; Nikil Dutt; Carol Hutchins; Joseph A. Konstan; Ee-peng Lim; Catherine McGeoch; M. Tamer Özsu; Vincent Shen; Mary Lou Soffa

Publications Office
ACM, 2 Penn Plaza, Suite 701, New York, New York 10121-0701 USA
Tel: +1-212-869-7440; Fax: +1-212-869-0481

Editorial Advisory Board
Vicki Almstrum; Moti Ben-Ari; Anders Berglund; David Berque; Judith Bishop; Steve Cooper; Wanda Dann; Mike Erlinger; Leslie Fife; Sue Fitzgerald; Mike Goldweber; Brian Hanks; Joseph Kmoch; Tami Lapidot; Lauri Malmi; Amber Settle; Fran Trees; Paul Tymann

Annual Subscriptions
Members: print $37; e-only $30; print + electronic $44
Students: print $14; e-only $11; print + electronic $17
Non-members: print $85; e-only $68; print + electronic $102

Single Copies
Members $9; Students $4; Non-members $25

Columnists

Michal Armoni; Tony Clear; David Ginat; Don Gotterbarn; Elizabeth Hawthorne; Peter B. Henderson; Lisa Kaczmarczyk; Deepak Kumar; Raymond Lister; C. Dianne Martin; Marian Petre; Jeffrey Popyack; Heikki Topi; Henry M. Walker

SIGCSE members receive ACM Inroads as a membership benefit. Please send orders to ACM, General Post Office, P.O. Box 30777 New York, New York 10087-0777 USA or call +1-212-626-0500 For credit card orders, call +1-800-342-6626 Order personnel available 08:30-16:30 EST After hours, please leave message and order personnel will return your call.

SIGCSE Trend Tracker


Curt M. White

Art Director

Robert Vizzini

Editorial Consultant
Diane Crawford

Change of Address
acmcoa@acm.org

Production Manager
Adrienne Griscti

Production Assistant
Julie Goetz

Other Services, Questions, or Information


acmhelp@acm.org

Web Administrator
Amber Settle

Website
http://inroads.acm.org

Author Information
http://mc.manuscriptcentral.com/inroads

ACM Inroads Copyright Notice
Copyright © 2012 by Association for Computing Machinery, Inc. (ACM). Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyright for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists requires prior specific permission and/or a fee.

Request Permission to Publish
Publications Department, ACM, Inc. Fax: +1-212-869-0481 or email permissions@acm.org. For other copying of articles that carry a code at the bottom of the first or last page or screen display, copying is permitted provided that the per-copy fee indicated in the code is paid through: Copyright Clearance Center, 222 Rosewood Drive, Danvers, Massachusetts 01923 USA; +1-978-750-8400 (Tel); +1-978-750-4470 (Fax).

Publication Information
ACM Inroads is published four times a year (March, June, September, and December) by ACM: Print (ISSN 2153-2184), Online (ISSN 2153-2192). Periodicals postage paid in New York, New York 10001 USA and at additional mailing offices. Postmaster: Please send address changes to: ACM Inroads, ACM, 2 Penn Plaza, Suite 701, New York, New York 10121-0701 USA.

Editorial Information

Contact ACM Inroads via email to the EIC at acminroads@gmail.com

ACM Inroads Advertising Department

Director of Media Sales: Jennifer Ruzicka jen.ruzicka@hq.acm.org ACM, 2 Penn Plaza, Suite 701 New York, New York 10121-0701 USA (212) 626-0686 (Tel) (212) 869-0481 (Fax)

Acknowledgment

The volunteers and staff of ACM Inroads wish to thank the ACM Special Interest Group on Computer Science Education (SIGCSE). Its support helps make the magazine's publication and distribution possible.


Editor's Corner
John Impagliazzo
Emeritus, Hofstra University

Welcome to the 2012 September issue of ACM Inroads, ACM's computing education magazine. As always, we are grateful to our authors; every section of the magazine reflects their tireless efforts and commitment. The Featured Columns, for example, form the foundation of the magazine; the columnists are leading voices in the field, and they share their experiences, observations, and opinions with us. The EduBits section highlights educational happenings within ACM, the ACM Education Board, and the ACM Education Council. SIGCSE Spotlight shines on special interest group news from the magazine's global community. The Spotlight, EduBits, and the Featured Columns are quarterly fixtures of the magazine. Readers are welcome to contact our SIGCSE Trend Tracker Curt White, our EduBits coordinators Andrew McGettrick (Chair of the ACM Education Board) and Yan Timanovsky (ACM Education Manager), and the dedicated authors of the Featured Columns with your thoughts. They would love to hear from you. As Editor-in-Chief of this magazine, I have written on several occasions about the nature and purpose of ACM Inroads. However, some confusion remains among readers. The following should help dispel some of the misunderstandings.

 ACM Inroads is a magazine. It is not an ACM journal. It is not an ACM newsletter. ACM Inroads is one of eight magazines published by the Association. For example, Communications of the ACM, the flagship publication of ACM, is another such magazine.
 ACM Inroads publishes articles, not research papers. As a magazine, articles should be concise, interesting, and informative, yet substantive. Authors should write articles in a more informal, general-interest style with an appropriate number of colorful high-resolution diagrams, images, and figures.
 ACM Inroads is a computing education publication. Its focus includes all areas of computing, such as information systems, computer engineering, computer science, information technology, software engineering, and other related fields. Its focus is not just computer science. The magazine encourages and seeks education-related articles from all computing areas.
 ACM Inroads submissions undergo a strict review process via ScholarOne's Manuscript Central. Articles such as Bits & Bytes, Standard, and Comprehensive types must pass double-blind, formal peer review scrutiny before publication.
 ACM Inroads is an ACM publication; it is not a SIG publication. The magazine is under the jurisdiction of the ACM Publications Board even though the publication derived its roots from the SIGCSE community. SIGCSE helps support the publication, and it provides the magazine as a benefit to its members.

These facts should remove any confusion that seems to surround the nature and purpose of the magazine. I trust you will enjoy reading this issue of ACM Inroads. The topics presented should interest even the casual reader. Thanks again to all the authors and readers who help make this an outstanding magazine.

John Impagliazzo
Editor-in-Chief
DOI: 10.1145/2339055.2339056 Copyright held by author.


critical perspectives

Michael Wrinn, Ph.D.
Research Program Director, University Research Office, Intel Corporation
2111 NE 25th Avenue, Hillsboro, Oregon 97124 USA
michael.wrinn@intel.com

Parallel Computing: Thoughts Following a Four-year Tour of Academic Outreach

1. Background
Some years ago, recognizing the need to bring computing curricula into line with its multicore roadmap change, Intel initiated an Academic Community effort [16] to assist and accelerate the adoption of parallel computing techniques into undergraduate computing education and, in particular, computer science education. I was privileged to play a part in the effort. It is a good time to pause and take stock of where things stand.
My own engagement began at the 2008 SIGCSE Technical Symposium, delivering a vendor session talk, "Confronting Manycore." Although the more modest dual cores were not yet ubiquitous, I was trying to look ahead to the potential impacts on software development and, by implication, on computing education. While the audience was friendly, I did encounter resistance, manifested as:
 Surprise ("Is parallel computing already here?")
 Complacency ("Parallel computing might be important sometime in the future"), and
 Skepticism ("Parallel computing is not important, now or in the future").
In a very short time, though, certainly by the SIGCSE Symposium in 2009, these responses had given way to: "What's the recommended model?" "Do we introduce this early or late?" "Where are the textbooks?" and (discreetly) "How do I learn this myself?" We still wrestle with these questions today. To provide context, let's do a quick review of the situation. CPU architectures had reached a variety of limits on the design features which had served so well through earlier periods. Strategies such as instruction-level parallelism, out-of-order execution, caches, and especially frequency scaling (and thus power considerations) had all reached levels where further progress met either an outright wall (power) or plateaus of diminishing returns. Yet transistor budgets continued (and still continue) their relentless exponential march of Moore's-law doublings. So, what are we to do with this bounty? The software performance ramp had stalled, and application performance was no longer realizing a "Moore's dividend" [19]. How could we restore this ramp? The most obvious idea was to replicate the core execution units and push the responsibility for application performance back to the software community.
Alternatively, as I sometimes explained to my software developer friends, the hardware side of the industry has perpetrated a practical joke on its software counterparts, sliding the sequential-model rug out from under a half-century of software design. Expertise in parallel computing long preceded this multicore shift, of course. Those with acute performance requirements (modelers for scientific, financial, and seismic purposes, computer-generated imagery (CGI) animation, and more) were already steeped in the lore, successfully employing parallel computation as a normal part of their work. What was new was the leap of faith: for domains beyond traditional high-performance computing (HPC), would this potentially new performance even matter?
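The "replicate the cores, push responsibility to software" shift described above is easy to demonstrate in a classroom. Here is a minimal sketch in plain Java (the class name, array size, and worker count are illustrative choices of mine, not drawn from any course cited here): the extra execution units help only if the program explicitly divides the work among threads.

```java
import java.util.Arrays;

// Illustrative sketch: summing an array by giving each worker thread
// its own contiguous slice, then combining the partial results.
public class PartialSums {

    static long parallelSum(long[] data, int workers) {
        long[] partial = new long[workers];
        Thread[] threads = new Thread[workers];
        int chunk = (data.length + workers - 1) / workers;
        for (int w = 0; w < workers; w++) {
            final int id = w;
            final int lo = Math.min(data.length, w * chunk);
            final int hi = Math.min(data.length, lo + chunk);
            threads[w] = new Thread(() -> {
                long s = 0;
                for (int i = lo; i < hi; i++) s += data[i]; // private slice: no sharing, no race
                partial[id] = s;                            // one writer per index
            });
            threads[w].start();
        }
        for (Thread t : threads) {
            try { t.join(); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        }
        return Arrays.stream(partial).sum(); // sequential combine of the partial results
    }

    public static void main(String[] args) {
        long[] data = new long[1_000_000];
        Arrays.fill(data, 1L);
        System.out.println(parallelSum(data, 4)); // prints 1000000
    }
}
```

The pedagogical point is visible even in this toy: correctness now depends on how the data is partitioned and when the threads are joined, concerns that simply do not arise in the sequential version.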

2. The Risk of Sufficiency


It can be sobering to consider the experience of another exciting high-tech industry of the twentieth century, namely, aviation, and specifically, general aviation. Half a century after the first powered flight, light aircraft had achieved a level of reliability and (relative) ease-of-use that people of normal middle-class means could choose to pursue piloting. Industry produced aircraft in large numbers to address this anticipated postwar market. To a professional in the field, the 1950s presented a heady mix of high-end progress (e.g. the first passenger jet travel, supersonic military craft) and general accessibility (e.g. Cessna and many others). Jumping forward almost sixty years to the present time, we find this description remains unchanged. We still have passenger jet travel, supersonic military craft, and general accessibility for aspiring pilots.



While the refinements achieved over this period are remarkable, particularly at the high end and in avionics, a 1956 Cessna 172 looks and flies essentially the same as its 2012 counterpart. The basic design had reached a level of sufficiency, and would not (did not need to) appreciably advance beyond that point. How many aeronautical engineering graduates from the class of 1956 comprehended that their chosen profession had already reached this plateau? Might this example foretell the fate of this year's entering computing students, the class of 2016?

3.  Industry/Academic Collaborations
It was clearly in the computing industry's best interest to avoid hitting such a plateau, so education efforts at all levels were launched, from professional workshops on new tools to broad support for curriculum enhancements at undergraduate institutions. The Intel Academic Community was launched to address the latter. Member benefits include access to content, blogs, expert advice, and tool licenses for classroom use. As well, micro-grants were established for specific course proposals, the content later shared in an online repository. A Manycore Testing Lab [17], a set of nicely equipped 40-core servers, was placed online for experimenting and teaching about highly scalable parallel challenges. Enrollment in this community implied a commitment to introduce parallel computing into the curriculum. By mid-2011, we had recruited over 5000 professors at 2800 universities worldwide (99 countries in all). Figure 1 shows the distributions graphically, with circle sizes in proportion to university participation. We found particularly good resonance in the developing world; in aggregate, 80% of participants came from outside the United States.

Figure 1. Graphical distribution of recruitment, with circle sizes in proportion to university participation: India (604), United States (604), China (251), Germany (92), Russia (91), United Kingdom (86), Mexico (83), France (54).

Other computing companies initiated analogous programs, but many of us recognized a common interest; we all shared the risk of sufficiency, and all would benefit from a parallel-capable workforce. To this end was created the Educational Alliance for a Parallel Future (EAPF), with members spanning several industry producers (Intel, AMD, NVidia, Microsoft), commercial stakeholders (Adobe, DreamWorks), research institutions (CERN), and universities at all levels, from R1 to liberal arts to community colleges (Georgia Tech, Illinois, Wuhan (China), Bristol (UK), Earlham, St Olaf, Contra Costa). EAPF's charter is quite focused: facilitate the inclusion of parallel programming topics in computational science curricula, ease and speed the technology transfer from research to education, and expedite sharing of materials and experiences [11]. One tangible outcome of EAPF is the ACM TechPack on Parallel Computing. The first in this series focused on a technology tour, which approaches parallelism from the point of view of someone comfortable with programming but not yet familiar with parallel concepts. This TechPack, along with a second one currently in review, includes example exercises (source code) and references for further study [2]. EAPF announced itself at the supercomputing conference SC08, in a Birds-of-a-Feather (BOF) session titled "There Is No More Sequential Programming, So Why Are We Still Teaching It?", where we industry people gently berated our academic friends who were "still teaching sequential programming. This is true despite the fact that all major manufacturers have moved to a manycore architecture and current generation CPU, GPU or ASIC designs cannot be efficiently programmed without knowledge of parallel programming." It was a plea, really, and the academic community has responded well. EAPF has offered a panel or BOF at every SC conference since that one, hammering on variations of this educational theme. The SC10 panel (now including actual university members), for example, addressed curriculum head-on by stating [24]:


"In the face of ubiquitous parallel systems, what CS core topics need to change? What are the languages, models, and patterns that will allow academia to demonstrate the opportunities inherent in the new platforms and teach them effectively to the next generation of technologists? What are the new design considerations, from software architecture to low-level performance? What models encourage robust applications and programmer productivity?"

The message, at least the problem statement and its urgency, began to resonate in the wider world, the panel garnering attention even in The Economist, which reported [9]: "Parallel programming, once an obscure academic problem (finding ways to make it easy to write software that can take full advantage of the power of parallel processing), is rapidly becoming a problem for the whole industry. At SC10, a computing conference held in New Orleans in November 2010, experts discussed the need to change curricula and update textbooks to reflect the growing demand for parallel-programming skills in general-purpose computing, and not just in scientific computing. This will take years."

4. The Role of Research

The unattributed author of that Economist article had it more right than perhaps he realized. Shifting the programming model for general-purpose computing will indeed take years, because it is not simply a matter of infusing known techniques into curricula and textbooks. Beyond HPC, the models are immature, unexplored, or unknown. What types of applications, presently unrealized or unimagined, might be enabled by orders-of-magnitude more computing power? What are the new design considerations, from software architecture to low-level performance? What models encourage robust applications and programmer productivity? To address these questions, in 2008 Intel and Microsoft committed a total of $20 million for two Universal Parallel Computing Research Centers (UPCRC), one at the University of California, Berkeley (UC Berkeley), the other at the University of Illinois at Urbana-Champaign (UIUC). The work has resulted in significant performance enhancements to a range of applications: an order-of-magnitude speedup for continuous speech recognition [26], and a 100x speedup in image reconstruction for pediatric MRI, making it the first clinically feasible runtime for a compressed-sensing MRI reconstruction [22]. Each application development of this kind adds to the growing portfolio of teachable insights for parallel computing. As well, the design patterns research led directly to an entirely new Berkeley undergraduate course (more on this below). Much remains to be learned, for example, in important non-numeric techniques such as the graph and machine-learning algorithms important for big data. As research teams achieve new understanding, it can go immediately into updates to the computing curriculum. In this respect, computational science teaching begins to resemble fast-moving science fields (e.g., molecular genetics), where annual updates and revisions to course material are required to keep pace with the discipline.

5. Curriculum Guideline Efforts

A primary international source of guidance on curriculum design is the Computing Curricula volume on Computer Science, developed jointly by ACM and the IEEE Computer Society [1]. These recommendations are now being revised and enhanced to match the latest developments; for the first time, they will include Parallel and Distributed Computing as a required core category, with significant classroom hours. Deployment is scheduled for 2013, though a draft document is posted now [7]. Anticipating this evolution, the IEEE Computer Society Technical Committee on Parallel Processing (TCPP) [15], with funding from the US National Science Foundation (NSF) and several industry stakeholders (Intel, IBM, NVidia), has since 2010 sponsored early-adopter grants to encourage updates to computing courses to incorporate parallel thinking where appropriate. Top results are featured at the EduPar workshop of the annual IPDPS meeting and are available for download at its website [10]. China's Ministry of Education (MOE) is undertaking a similar modernization of its national model curriculum, to be deployed in 2013; they are collaborating closely with CS2013 to achieve the best outcome of quality and consistency. Several thousand academic departments worldwide will be impacted at some level by this wave of new guidelines. Will faculty be ready? CS2013 will be pointing to exemplars, existing courses that incorporate the new knowledge units. TCPP has been promoting development of new teaching material. China's MOE focuses on creating a set of model courses, developed at select universities and made available for broader use. In each case, the need for new material is understood, but adoption will take some effort.

6. Curricula Considerations: Some Opinions

Much of the early adoption effort has focused on extending the programming class material: implementing data-, loop-, or task-parallel constructs, preventing race conditions, and analyzing performance. Other departments revived dormant courses specifically on parallel computing as an upper-level elective. Planning discussions revolve around those questions noted above, namely, when to introduce parallel computing, and what is the



best model (e.g., Java threads, OpenMP, Scheme, etc.)? There is also angst regarding what to omit in the zero-sum budget of course hours. My personal take: we should use whichever programming model best conveys key concepts. Introduce parallel computing early and often, in increments, and in every topic impacted by it (not just the programming classes). Let's not be tricked by a zero-sum false choice; we still cover algorithms and data structures, for example, but adapted to the new underlying architectures. Finally, we should focus on software design in a way that provides a template to prepare students for a rapidly evolving parallel and distributed computing landscape. Let's elaborate on each of these.
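The race conditions mentioned above as a core teaching topic can be shown in a few lines. This is a classroom-style sketch of my own (names and counts are illustrative, not from any course cited here) using plain Java threads: the unsynchronized counter may lose updates because ++ is a read-modify-write sequence, while the same workload with java.util.concurrent.atomic.AtomicInteger is exact.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative sketch: two threads increment a shared counter.
// The unsynchronized version can lose updates; the atomic one cannot.
public class RaceDemo {
    static int unsafeCount;

    // Run the same task on two threads and wait for both.
    static void runBoth(Runnable task) {
        Thread a = new Thread(task), b = new Thread(task);
        a.start(); b.start();
        try { a.join(); b.join(); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }

    // Racy: both threads do unsynchronized ++ on the same field.
    static int racyIncrement(int perThread) {
        unsafeCount = 0;
        runBoth(() -> { for (int i = 0; i < perThread; i++) unsafeCount++; });
        return unsafeCount; // may be less than 2 * perThread
    }

    // Safe: the same workload with an atomic read-modify-write.
    static int safeIncrement(int perThread) {
        AtomicInteger count = new AtomicInteger();
        runBoth(() -> { for (int i = 0; i < perThread; i++) count.incrementAndGet(); });
        return count.get(); // always exactly 2 * perThread
    }

    public static void main(String[] args) {
        System.out.println("racy:   " + racyIncrement(1_000_000)); // often less than 2000000
        System.out.println("atomic: " + safeIncrement(1_000_000)); // always 2000000
    }
}
```

Students see the lost updates only intermittently, which is itself the lesson: a data race is a bug that testing may never expose.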

Infuse Parallelism at All Levels and Topics

Since parallelism now touches most of computation in some way, a rethinking of classic topics is called for [5]. Algorithms and data structures optimal for sequential systems may no longer be appropriate; do linked lists, inherently sequential, make sense in an era of parallel architectures? How would we redesign a parallel hash table? What are better implementations for standard sorting algorithms? At some of the leading research schools, core curriculum topics are being revised to respond to such considerations. In the spring of this year, for example, Carnegie Mellon University launched a new course, Parallel and Sequential Data Structures and Algorithms [6]. For several years, the University of Washington has integrated parallel thinking into its course, Data Abstractions [14]. UC Berkeley has substantially revised its required Machine Structures course to address parallel computing [8]. On software performance (including power considerations), MIT continues to update its Performance Engineering of Software Systems [21] with parallel techniques. This growing portfolio points the way forward; all material cited is available for broader adoption.

Programming Model Choice: Keep It Simple; Don't Get Too Worried About Particulars

The range of languages and models successfully deployed in teaching environments is broad: from the functional (Scheme, Erlang, ML) to the imperative (C, C++, Java). Either family offers (whether inherently or by extension) syntactically simple, high-level abstractions which enable parallel concepts to be implemented at all curriculum levels as a routine component of computation; we do not need a special course in parallel programming. Lower-level APIs (OpenCL, PThreads, etc.), explicit messaging (MPI), and performance considerations (SIMD units, cache considerations, power reduction, etc.) can be introduced in more advanced courses. Please make students aware that threading is just one way to proceed, and that it brings dangerous side effects; don't let students think that parallel = threads. While the proliferation of useful threading models (e.g., OpenMP, Cilk, Threading Building Blocks) confers a kind of default status on these methods, we should recall that the HPC expert community, a tribe of genuine parallel computing experts, has long since reached consensus on a message-passing model. Edward Lee's article, "The Problem with Threads" [20], quite succinctly outlines the issues, and they are serious. I suspect we will look back on this approach (threading) with the same level of dismay with which we now regard blood-letting, a once-standard medical practice.
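As one concrete instance of rethinking a standard algorithm, here is a sketch of a divide-and-conquer merge sort expressed against Java's fork/join framework, in which the two halves are sorted concurrently and merged after the join. The class name and the sequential cutoff are my own illustrative choices, not taken from any of the courses cited here, and this is only one of many reasonable designs.

```java
import java.util.Arrays;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveAction;

// Illustrative sketch: the fork/join algorithmic pattern applied to merge sort.
public class ParallelMergeSort extends RecursiveAction {
    private static final int CUTOFF = 1 << 12; // below this, plain sequential sort
    private final int[] a;
    private final int lo, hi;

    ParallelMergeSort(int[] a, int lo, int hi) { this.a = a; this.lo = lo; this.hi = hi; }

    @Override protected void compute() {
        if (hi - lo <= CUTOFF) { Arrays.sort(a, lo, hi); return; }
        int mid = (lo + hi) >>> 1;
        invokeAll(new ParallelMergeSort(a, lo, mid),   // fork: sort the halves concurrently
                  new ParallelMergeSort(a, mid, hi));
        merge(mid);                                    // join: combine the sorted halves
    }

    // Merge [lo, mid) and [mid, hi); only the left half needs a temporary copy.
    private void merge(int mid) {
        int[] tmp = Arrays.copyOfRange(a, lo, mid);
        for (int i = 0, j = mid, k = lo; i < tmp.length; k++) {
            if (j == hi || tmp[i] <= a[j]) a[k] = tmp[i++];
            else a[k] = a[j++];
        }
    }

    public static void sort(int[] a) {
        new ForkJoinPool().invoke(new ParallelMergeSort(a, 0, a.length));
    }

    public static void main(String[] args) {
        int[] data = new java.util.Random(42).ints(100_000).toArray();
        sort(data);
        System.out.println(isSorted(data)); // prints true
    }

    static boolean isSorted(int[] a) {
        for (int i = 1; i < a.length; i++) if (a[i - 1] > a[i]) return false;
        return true;
    }
}
```

The sequential cutoff is exactly the kind of design decision worth discussing in class: below it, task-management overhead outweighs any gain from parallelism.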

Give Students a Flexible Toolkit Through Design Patterns

The best foundation for students may be the most general: principles of good software design, as expressed in patterns. Inspired by the civil architect Christopher Alexander [3] and pioneering work on patterns for object-oriented software [13], a synthesis is emerging. It combines structural patterns (e.g., Model-View-Controller, Pipe-and-Filter) [25], computational patterns (e.g., structured grids, dense linear algebra) [4], and lower-level algorithmic patterns (e.g., divide and conquer, fork/join) [22] into a kind of software periodic table that provides a high-level framework for understanding (and improving) the mappings from design to optimal implementation [18]. This project (curiously called Our Pattern Language) maintains a website with details and feedback opportunity [23]. The parallel design patterns approach informed the research work mentioned above (speech recognition, pediatric MRI), and it was organized into a full-semester UC Berkeley undergraduate course called Architecting Parallel Software. Other departments are beginning to adopt this approach as well [10]. I am optimistic about the design patterns work. Old enough to have been present at the onset of object-oriented techniques, I recall that they were regarded with a degree of suspicion and anxiety; nowadays, they are considered normal. Along the way, the design patterns work of Gamma et al. certainly helped to codify and tame the initially overwhelming onslaught of new ideas. Design patterns may provide, once again, similar assistance for parallel computing. Figures 2 and 3 summarize the analogy.

7. Conclusion

Richard Feynman understood the implications of parallel computing earlier than many of us. From his 1985 Nishina Memorial Lecture [12]: "One must start all over again with the problem, appreciating that we have the possibility of parallel calculation, and rewrite the program completely with a new understanding of what is inside the machine... So what's going to happen is that the hard programs, vast big ones, will be the first to be reprogrammed by experts in the new way, and then gradually everybody will have to come around and programmers will just have to learn how to do it." This sounds right to me. That we all have to come around and just learn how to do it need not be dreaded as some ominous chore. Parallel computing, including its broader manifestation in mobile/cloud, presents a chance to refresh our approaches to software design and to open application capabilities not yet imagined. This affects all areas of computing (information systems professionals, computer engineers, software engineers, IT specialists, mobile application developers, etc.), and it hasn't been this much fun in quite a while.



Figure 2. Software design patterns in education: lessons from history? (1994)
Early days of OO. Perception: "Object oriented? Isn't that just an academic thing?" Usage: specialists only; the mainstream regards it with indifference or anxiety. Performance: not so good.
Now. Perception: "OO = programming. Isn't this how it was always done?" Usage: cosmetically widespread, with some key concepts actually deployed. Performance: so-so, masked by CPU advances until now.

Figure 3. Software design patterns in education: lessons from history? (2012)
Now. Perception: "Parallel programming? Isn't that just an HPC thing?" Usage: specialists only; the mainstream regards it with indifference or anxiety. Performance: very good, for the specialists.
Future. Perception: "PP = programming. Isn't this how it was always done?" Usage: widespread, with key concepts actually deployed. Performance: broadly sufficient; application domains greatly expanded.

References
[5] Blelloch, Guy. Parallel thinking. Proceedings of the 14th ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming (PPoPP '09), ACM, New York, NY, USA, 1-2.
[6] Blelloch, Guy. Carnegie Mellon University, course 15-210: Parallel & Sequential Data Structures and Algorithms, Spring 2012, http://www.cs.cmu.edu/~15210/
[7] Computer Science Curricula 2013, http://cs2013.org
[8] CS61C, Great Ideas in Computer Architecture (Machine Structures), Summer 2012, UC Berkeley, http://www-inst.eecs.berkeley.edu/~cs61c/su12/
[9] The Economist, 2 June 2011.
[10] EduPar-12 Workshop, http://cs.gsu.edu/~tcpp/curriculum/?q=advanced-technical-program
[11] Education Alliance for a Parallel Future, http://www.eapf.org/
[12] Feynman, Richard P. The Computing Machines in the Future. Nishina Memorial Lectures, Lecture Notes in Physics, Volume 746, ISBN 978-4-431-77055-8, Nishina Memorial Foundation, 2008, p. 99.
[13] Gamma, Erich, with Richard Helm, Ralph Johnson, and John Vlissides. Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA, 1995.
[14] Grossman, Dan. University of Washington, course CSE332: Data Abstractions, Spring 2012, http://www.cs.washington.edu/education/courses/cse332/12sp/
[15] IEEE Computer Society Technical Committee on Parallel Processing, http://tcpp.cs.gsu.edu/?q=content/welcome-ieee-computer-society-technical-committee-parallel-processing-tcpp
[16] Intel Academic Community, http://software.intel.com/en-us/articles/intel-academic-community/
[17] Intel's Manycore Testing Lab, http://software.intel.com/en-us/articles/intel-many-core-testing-lab
[18] Keutzer, Kurt, with Berna L. Massingill, Timothy G. Mattson, and Beverly A. Sanders. A design pattern language for engineering (parallel) software: merging the PLPP and OPL projects. ParaPLoP '10, Proceedings of the 2010 Workshop on Parallel Programming Patterns, Article No. 9, ACM, New York, NY, USA, 2010.
[19] Larus, James. Spending Moore's dividend. Communications of the ACM, Volume 52, Issue 5 (May 2009), 62-69.
[20] Lee, Edward A. The Problem with Threads. Computer, vol. 39, no. 5, pp. 33-42, May 2006, doi:10.1109/MC.2006.180.
[21] Leiserson, Charles. MIT, course 6.172: Performance Engineering of Software Systems, Fall 2010, http://stellar.mit.edu/S/course/6/fa10/6.172/
[22] Mattson, T.G., B. A. Sanders, and B. L. Massingill. Patterns for Parallel Programming. Addison-Wesley, 2004.
[23] Murphy, Mark, with Marcus Alley, James Demmel, Kurt Keutzer, Shreyas Vasanawala, and Michael Lustig. Fast l1-SPIRiT Compressed Sensing Parallel Imaging MRI: Scalable Parallel Implementation and Clinically Feasible Runtime. IEEE Trans. Med. Imaging 31(6): 1250-1262 (2012).
[24] Pattern Language for Parallel Programming ver 2.0, http://parlab.eecs.berkeley.edu/wiki/patterns/patterns
[25] SC10 panel, Preparing for Extreme Parallel Environments: Models for Parallelism, http://sc10.supercomputing.org/schedule/event_detail.php?evid=stpan109
[26] Shaw, M. and D. Garlan. Software Architecture: Perspectives on an Emerging Discipline. Prentice Hall, 1995.
[27] You, Kisun, with Jike Chong, Youngmin Yi, Ekaterina Gonina, Christopher Hughes, Wonyong Sung, and Kurt Keutzer. Scalable HMM based Inference Engine in Large Vocabulary Continuous Speech Recognition. IEEE Signal Processing Magazine, March 2010.

Categories and Subject Descriptors: D.1.3 [Software]: Programming Techniques, Concurrent Programming, Parallel programming; K.3.2 [Computers and Education]: Computer and Information Science Education, Computer science education, Curriculum
General Terms: Experimentation, Design, Performance
Keywords: Computer science education, computational thinking, CS principles
DOI: 10.1145/2339055.2339057
© 2012 ACM 2153-2184/12/09 $15.00

Acknowledgements
Many individuals contributed to the efforts described. Prominent among them, I'll thank my Intel coworkers Paul Steinberg (Academic Community manager, and prime mover in creating EAPF), Clay Breshears and Bob Chesebrough (authors of outstanding teaching material), and Tim Mattson (for constant pointers to teachable results from the research community). Continuing and enthusiastic support came from Professors Matt Wolf (Georgia Tech), Dick Brown (St. Olaf), Tom Murphy (Contra Costa), Charlie Peck and students (Earlham), and Dan Ernst (Wisconsin, now at Cray). For inspiration and novel points of view, thanks go to Professors Kurt Keutzer and Dan Garcia (both at U.C. Berkeley).


8 acm Inroads 2012 September Vol. 3 No. 3


edubits

By Andrew McGettrick and Yan Timanovsky

Digest of ACM Educational Activities


With global demand for affordable higher education and practical training growing, especially in BRIC countries (Brazil, Russia, India, China) and other developing nations, technological advances and the open access movement are providing the impetus for scalable delivery of quality materials. Startups like Coursera and Udacity are adapting university-quality courses for mass delivery, while major academic institutions such as Harvard, MIT, and Stanford are launching their own online learning initiatives. This September 2012 edition of EduBits, the quarterly report from ACM's education community, spotlights online education as the ACM Education Council attempts to distill the daily buzz and excitement produced by this phenomenon to just the facts. In addition, Ankur Teredesai, Information Officer at the ACM Special Interest Group on Knowledge Discovery and Data Mining, shares some of SIGKDD's significant educational activities. We begin with a report from the ACM Education Board's recent panel on online learning. In June 2012, on the heels of the Turing Centenary Celebration (TCC), produced and hosted by ACM at the historic Palace Hotel in San Francisco, the ACM Education Council convened for two days of lively debates, discussions, and updates on the state of computing education.

MOOCs Have Arrived: A Brave New World in Online Education


There were a number of important items on the agenda at the recent meeting of the ACM Education Council, but the one issue that colored nearly every other discussion was online learning, particularly massive open online courses (MOOCs). Mehran Sahami, ACM Education Board member and CS2013 Co-Chair, chaired a panel on the subject with five special guests: Woodie Flowers (MIT), Candace Thille (CMU), John Mitchell (Stanford), Peter Norvig (Google), and David Patterson (Berkeley). Woodie Flowers, Professor Emeritus of Mechanical Engineering at the Massachusetts Institute of Technology (MIT), kicked things off by drawing a distinction between training and education. "Training is a commodity," he remarked, and education should go deeper than that. He advocates working with students as opposed to lecturing to them. Flowers also stressed the imperative of creating the highest quality learning experience possible, and leveraging the buzz created by online learning to effect a "Gutenberg moment" in education. He also emphasized quality and sustainability over creating free materials. Candace Thille, Director of the Open Learning Initiative at Carnegie Mellon University, explained that their courses are created by design teams that include not only faculty members but also learning experts, software engineers, and user-experience designers. She talked about the importance of collecting data on the students' learning experience to tweak and enhance the tools used to deliver it. Thille outlined the three key benefits generally attributed to online learning: (1) the ability to do things you cannot do now (e.g., simulations, looking at things at microscopic and


macroscopic levels); (2) the convenience of learning anytime and anyplace; and (3) connectivity and the ability to interact with students all over the world. The biggest power of technology, Thille said, is what Google calls the "killer app": it enables educators to push technology just far enough for students to interact, and to draw out data about what a student is learning and use it for feedback. She places a high premium on data gathering and feedback: students receive data about their own performance, instructors receive data about their teaching performance, and the designers receive data about the effectiveness of their materials and delivery. She also mentioned that an NSF-funded learning research center in Pittsburgh uses Open Learning Initiative (OLI) online courses to enhance its research. OLI uses the center's insights on experimenting with learning theory to improve its courses, a wonderful symbiosis that advances the science of learning and educating. Experimentation was a theme strongly echoed by John Mitchell, the Mary and Gordon Crary Family Professor of Computer Science and Electrical Engineering at Stanford University, an institution that last year piloted a free course on artificial intelligence that attracted more than 160,000 learners. "There are more questions than answers with online learning," Mitchell reminded the room, noting that there's room for lots of healthy debate and experimentation. Stanford is willing to see what works and to invest time in finding out. Already, Coursera, a company started by Daphne Koller and Andrew Ng, both professors at the university, has delivered thirteen Stanford courses. iTunes and YouTube have also been used to deliver Stanford courses, which remain property of the university regardless of delivery channel and platform.
The key components of online technology proven to be effective are interactive video, automated assessment (that has value and can be scaled), social networking (interaction between students and staff), and collaboration tools. Mitchell himself has built Stanford's web platform with different collaboration tools. This, coupled with the free price model, has combined to make MOOCs popular, Mitchell said. Some of the questions Stanford is still trying to answer connect to the fundamental definition of education. It remains to be seen whether a low-cost, high-volume model can provide an experience that approximates traditional education. Another question is whether synchronous or asynchronous models work best. Other concerns include the potentially disruptive effects on traditional academia, institutional support for academic communities, and the viability of community colleges in the face of MOOCs. "Will college teaching follow the path of journalism?" Mitchell asked. Answering these questions won't be easy and will take time, experimentation, and thoughtful analysis. Peter Norvig, Director of Research at Google, who co-taught the now-famous Stanford MOOC on AI, came to be fascinated with the possibilities of online learning via a very traditional route: writing a college textbook. When he wrote the third edition of his textbook, he found the paper version brittle: it lacked A/B testing and interactive components such as 3-D models. He consulted Hal Abelson of MIT, who advised him to read Benjamin Bloom's paper on the "2 Sigma" effect.
This 2-Sigma effect posits that one-on-one tutoring can be very effective when the tutor (a) is an expert in both subject matter and pedagogy, (b) practices mastery learning (teaching until the student gets it rather than just lecturing and moving on), (c) establishes a deep personal connection with the student, and (d) listens to the student's feedback so that he or she can adjust the instruction based on the student's reaction and progress. In conducting the course, Norvig was personally struck by the connection between teacher and student. Even though the connection was forged online, his responsiveness in forums and emails helped foster commitment on the students' part. In terms of mastery learning and feedback, he admitted that we are not nearly at the level of one-on-one tutoring, but felt we can do a lot better if we keep working on a continual feedback and improvement loop. David Patterson, Professor of Computer Science at the University of California-Berkeley, who recently taught a course on software as a service (SaaS) with Armando Fox, noted that academics rate on an


absolute scale while industry people rate on a relative scale. "These MOOCs are a lot better than what other people have," he said. At Berkeley, he and Armando Fox borrowed the peer-instruction model. He indicated that many tech startups value people who can demonstrate the practical skills required to do a job over those who can show good grades from a top university, hinting that MOOCs can help prepare for the workforce motivated self-starters who care more about learning than grades and certificates. Other benefits of running an online course, he discovered, are polishing the local campus course with the online course experience, working with many motivated and thankful online learners who value good teaching, helping alumni in fast-moving fields, and using technology to educate teachers. On the other hand, Patterson admitted that teaching MOOCs is a lot of work. He also warned about the opportunities for cheating, as well as drawing attention to a 90% dropout rate.

[Photo: Mathai Joseph and Mehran Sahami at the June 2012 Education Council meeting]

A vibrant and spirited discussion followed the panelists' presentations. John Mitchell called attention to the social value of college and the traditional campus experience. He also pointed out that demographics from the Stanford MOOCs suggested that many learners enrolled because they were looking to do something intellectually stimulating, not necessarily because they were interested in the particular topics offered. Candace Thille reassured the Council that programs such as the OLI are not trying to replace faculty members; rather, they help make them more effective. In the case of community colleges, OLI is collaborating with two-year schools to see how online courses can add value by figuring out what's working and what's not. Some Council members brought up the dangers of diving into online learning without giving thought to its consequences and disruptive effects on institutions.
Eric Roberts cautioned against overvaluing courses just because they are offered for free and urged taking more time to evaluate the effects of online learning, especially at academic institutions that may not have the resources of top-flight programs like Berkeley, MIT, and Stanford. Chris Stephenson urged the group to consider the importance of including learning experts and scientists in this process, something Candace Thille's OLI group has been doing, and Flowers reminded the Council not to leave designers, videographers, and other technical folks out of the equation. The panel also debated whether some classes work better online, with Woodie Flowers asserting that some course content is a better fit for online delivery and Dave Patterson arguing that this is irrelevant, citing Skype's success as a ubiquitous communication and collaboration tool. The panelists encouraged ACM to consider the contributions it could make to this exciting sea change in education and its implications for Education Council initiatives in Advanced Placement Computer Science, CS2013, and K-12 projects. Candace Thille encouraged faculty to get involved with OLI by contributing to core curricula development, reviewing outcomes, and providing feedback, and she reiterated the importance of working together with learning experts across different disciplines to refine and optimize learning tools. With so much room for debate, the one thing everyone seemed to agree on was this: the new technology makes MOOCs cost-effective and scalable, there is growing demand for accessible content from all corners of the world, online education is here to stay, and we should all be paying attention. We now look at SIGKDD's educational work, notably efforts to address industry's growing demand for (and shortage of) data scientists.


Nurturing the Next Generation of Data Scientists:


The Role of SIGKDD in Big Data, Data Science, and Data Mining Education
In the age of Big Data, the importance of and demand for data science skills are increasing tremendously. The ACM Special Interest Group on Knowledge Discovery and Data Mining (SIGKDD) is a professional organization that represents experts from both academia and industry, with a broad interest in all aspects of data science. SIGKDD's main activity is to inform and advance the domain of data science through the annual KDD conference, which has become the flagship conference for practitioners of this computing discipline since it started as a workshop in 1989 and matured into a conference in 1995. In 2012, the conference will take place in Beijing, China. SIGKDD's emphasis on promoting data science learning through its educational outreach efforts is driven by the growing demand for data science and data mining professionals. A recent McKinsey study1 warned about a shortage of talent with deep analytical skills, to the tune of 140,000 to 190,000 professionals by 2018, just within the United States. To put this number in perspective, consider the overall statistics for the entire computing discipline: last year (2011), universities in the United States graduated approximately 1,700 students with a Ph.D., approximately 10,000 with a Master's degree, and approximately 21,515 with a Bachelor's degree in computing disciplines (source: CRA Taulbee survey results2). The subset of students graduating with skills in the Big Data areas is a small component of this overall population.
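The scale gap described above can be made concrete with a few lines of arithmetic (a throwaway Python sketch using only the figures quoted in the text, not anything from the original report):

```python
# US computing graduates in 2011, per the CRA Taulbee figures quoted above.
phd, masters, bachelors = 1_700, 10_000, 21_515
grads_per_year = phd + masters + bachelors

# McKinsey's projected US shortfall of deep-analytics talent by 2018.
shortage_low, shortage_high = 140_000, 190_000

print(grads_per_year)   # 33215
# Even if every computing graduate became a data scientist, covering the
# low-end projection would take more than four full graduating classes.
print(shortage_low / grads_per_year)
```

The point of the comparison is exactly what the column states: the pool of graduates with Big Data skills is a small fraction of an already-small total.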

Driving Data Science Curriculum


Based on this need, the SIGKDD Executive Committee formed and charged the SIGKDD curriculum committee to design a sample curriculum for the field of data mining that provides recommendations for the education of a data-science-aware workforce. This committee, consisting of several prominent SIGKDD members including Soumen Chakrabarti, Martin Ester, Usama Fayyad, Johannes Gehrke, Jiawei Han, Shinichi Morishita, Gregory Piatetsky-Shapiro, and Wei Wang, solicited feedback from researchers, educators, and students at leading institutions to design a conceptually strong and technically rich curriculum. The curriculum included advice on such key topics as data preprocessing and cleaning, online analytical processing (OLAP) and data warehousing, association and classification, and the clustering techniques that form the foundation of most data mining courses. The curriculum set the tone for the academic advancement of data science in one of the most critical periods of its development, and it continues to evolve. Several data-mining instructors teach basic and advanced data mining courses each year based on the draft curriculum, with some modifications to keep pace with current advances. The formation of a new curriculum committee is also in the works to revise the draft curriculum further. Details of this curriculum can be found at http://www.kdd.org/sites/default/files/CURMay06.pdf.

Access to Data Sets and Unique Challenges


Another SIGKDD initiative that has fueled the growth of skills and interest in learning data science is the annual KDD Cup competition, which gives data science enthusiasts and seasoned practitioners from all over the world the opportunity to display their skills in developing solutions to deeply technical data mining problems. With the importance of Big Data growing while access to data sets to practice on remains a challenge, the contest has attracted thousands of students in hundreds of universities worldwide to engage in developing tools and techniques to solve real-world problems. The most well-known KDD Cup contest, held in 2007 in conjunction with the Netflix Prize competition, set the stage for the BellKor's Pragmatic Chaos team eventually winning the million-dollar recommendation-systems prize. This year's KDD Cup 2012 contest, organized by the Chinese Internet company Tencent, features the largest data set ever made available for a KDD Cup contest and has already attracted over 800 teams. Other ACM SIGs are also following this model of education blended with innovation, such as the ACM SIGSPATIAL GIS-focused algorithm competition, SIGSPATIAL Cup 2012.

1 http://www.mckinsey.com/Insights/MGI/Research/Technology_and_Innovation/Big_data_The_next_frontier_for_innovation
2 http://cra.org/uploads/documents/resources/taulbee/CRA_Taulbee_2011-2012_Results.pdf

Broadening the Horizons of Data Science


During the week of the annual conference, there are several tutorials offered by experts, and several hundred attendees benefit from learning about the latest hot topics. This year, we also feature a KDD summer school, giving students a unique opportunity to build on their data science knowledge before they are exposed to in-depth discussions at the conference. A special panel on timely industry issues, such as Big Data, further hones their skills and prepares them for the challenges and opportunities of our data-intensive world. SIGKDD as a community continues to foster long-term relationships between academia and industry, and another notable educational feature of the conference is the recently initiated Industrial Practice Expo sessions, where notable KDD practitioners from industry discuss their implementation experiences. As interest in Big Data and the data science disciplines continues to grow, SIGKDD looks forward to continued success in fostering its educational mission and welcomes suggestions for new initiatives at its website, www.kdd.org.

Ankur Teredesai
Associate Professor, Institute of Technology, University of Washington Tacoma
Information Officer, SIGKDD

Andrew McGettrick
Chair, ACM Education Board
Professor of Computer Science, Strathclyde University, Glasgow, UK
Andrew.McGettrick@cis.strath.ac.uk

Yan Timanovsky
Liaison to the Education Board
ACM Headquarters, New York, New York, USA
timanovsky@hq.acm.org

DOI: 10.1145/2339055.2339058

Copyright held by authors.


featured columns

Taking the High Road


C. Dianne Martin

Some Surprising Social Impact Scenarios

When teaching ethics and social impact in computing courses, it is always helpful to have current, highly relevant case studies for students to examine. In this column I will provide several recent issues that relate to how the courts have interpreted the application of real-world laws to virtual worlds, with some surprising outcomes. Students find these issues to be quite provocative, and many perspectives about them can easily be found on the internet to enrich the discussion. A good framework to use for each of the discussions is as follows:

 What role does the internet play in the situation?
 Describe the societal consequences.
 Describe the legal context.
 What is in the best interest of society?
 What is the social responsibility of IT professionals?

Porn as Protected Speech

One of the unexpected consequences of the explosion of the world wide web in the 1990s was the proliferation of porn sites. In fact, the pornography industry was one of the first to make large profits on the internet and was responsible for many of the technological improvements in graphical, audio, and video delivery over the internet. In spite of all the high-minded efforts to characterize the internet as a place for education, communication, entertainment, and commerce, it turns out, as the song goes in the musical Avenue Q, "The Internet Is for Porn!" In Supreme Court decisions in the 1960s and 1970s, pornography became protected speech under the First Amendment of the US Constitution. Similarly, in the Supreme Court decision striking down the 1996 Communications Decency Act, it was ruled that any type of speech protected in print media was similarly protected on the internet. In addition, it was ruled that any library receiving federal funds is required to allow adult patrons to use public computers to access porn. The one exception to this protected category has been child pornography, defined as knowingly producing, promoting, directing, exhibiting, or selling any material showing a sexual performance involving a child under the age of 16. Child pornography distributed over the internet became the main criminal act related to pornography and indecency that has been prosecuted vigorously by law enforcement around the world. But there is a new wrinkle related to child porn laws. The law was designed to protect children from being abused, not to prevent adults from having access to adult material. If it turns out, as has been true in several recent cases, that no actual child was used to produce the images (that is, they were produced using very high quality animation), then the question arises as to whether the result is actually illegal. So far the courts have been mixed about this. The intriguing question for your students to ponder is: should computing professionals use their technical skills in the production of pornography, especially simulated child pornography? As they investigate this question, they should examine the literature on the correlation that has been found between pedophilia, rape, and addiction to porn.

Bullying and Sexting as Protected Speech

The ACLU has a single mission, and that is to protect free speech. This single-mindedness has sometimes caused the ACLU to take positions that are counterintuitive, such as defending the KKK or opposing filtering software meant to protect children from adult content. More recently, the ACLU has come under fire for taking a stance against certain anti-bullying and anti-sexting policies being established by school systems, local governments, and even federal lawmakers. In cases where the ACLU determined that schools were over-reaching their authority by monitoring the online activity of students, it filed suits against the school systems for inhibiting the free speech rights of students. The ACLU took the position that online speech, especially when not conducted on school property, was protected speech, even if it was offensive speech. The latest social networking concerns arise out of the equally new practices of cyberbullying and sexting: using the online world to harass or titillate someone. Both of these phenomena are simply variations on the same theme: an individual, typically a young person, engaging in disfavored online communication. While cases of severe harm are rare, it is important to note that they do


exist. The response to the perceived threat by those policymakers who believe it is the government's role to play supreme protector and moral arbiter, even when the impact is extremely infrequent, has been swift and sweeping, with many proposals to ban online bullying and sexting at the federal and local levels.1 Students have been suspended from school, banned from extracurricular classes, and barred from graduation ceremonies for posting comments online when they were not at school. Even more extreme, young teens have been charged with child pornography after having been turned over by school officials to local law enforcement for sexting. To counteract this blemish on its image, the ACLU has been very aggressive in helping to protect the rights of gay youth, often the target of vicious online bullying, by developing and promoting education programs for schools to implement as a strategy to alleviate the problem of bullying. The response of the ACLU in all of the free speech cases it takes on is to argue for more speech (that is, positive education programs) rather than restricted speech. For class discussions, students interested in this topic should investigate recent tragic cases resulting in suicide as well as attempts by local jurisdictions to take criminal action against students who cyberbully or sext.

Protecting Intellectual Property in Virtual Worlds

Another interesting case study for students to study and debate relates to the application of intellectual property laws to virtual worlds. Each virtual world, whether it is a game such as World of Warcraft (WoW) or a multi-user 3-D space like Second Life, comes with an End User Licensing Agreement (EULA). Some EULAs maintain strict control over all intellectual property created within the environment; WoW, for example, maintains that all artifacts created within the context of the game belong to the Blizzard game company. Weapons, creatures, avatars: all belong to the game, not to the players creating them. Under a very strict EULA, it is illegal to sell the artifacts to someone else. On the other hand, in Second Life all artifacts created in the environment belong to their creators. It is perfectly legal to sell items created in Second Life to other participants. It is also illegal to steal intellectual property created in Second Life. Yet another twist to this issue involves making virtual copies of branded items (creating a Gucci handbag, Ralph Lauren sunglasses, or Nike sneakers) and selling them to other players. It turns out that violation of trademark and copyright by creating virtual knock-off products can be prosecuted in real courts. Those found guilty have been given hefty fines for trademark or copyright violation.

These examples are intended to stimulate discussion in computer ethics courses that cover topics such as free speech, privacy, and intellectual property. A quick Google search will provide numerous references offering pros and cons for each topic. Students will be intrigued by the unexpected legal twists and turns that result from applying real-world law to the online environment. Hopefully, they will also become more aware of the unintended consequences that often result from new technologies.

C. Dianne Martin
The George Washington University
2121 I Street, NW
Washington, DC 20052 USA
dmartin@gwu.edu

1 Macleod-Ball, Michael. "Student Speech Online: Too Young to Exercise the Right to Free Speech?" I/S: A Journal of Law and Policy for the Information Society, Vol. 7, Issue 1, Winter 2011, pp. 102-132.

DOI: 10.1145/2339055.2339059
Copyright held by author.



Reflections
Deepak Kumar

Data Science Overtakes Computer Science?


Quick, read the question below, then close your eyes and try to answer it: What is an exabyte? An exabyte, or EB, is a billion gigabytes, a million terabytes, or the amount of memory you can access with a 64-bit address (16 EB to be more precise). You can buy a 1-terabyte hard disk today for less than $100. A million terabytes of storage is still a sizable investment. Ever wonder what you would do with all the data if you managed to fill it all up? As humanity, we are now generating exabytes of data on a daily basis. A 2010 article in the Economist estimated that in 2005 we created a total of 150 exabytes of information [1]. Estimates of 1,200 exabytes were thrown about for 2010. It must be good to be in the data storage industry. One of the grand challenges of our time has to be the management, storage, and handling of large amounts of data (not to mention the amount of energy this requires). While there are definite advantages to having so much data available, it is also becoming increasingly difficult to process and exploit so much data. The Economist article calls it "plucking the diamond from the waste." Luciano Floridi, a self-proclaimed philosopher of information (University of Hertfordshire, UK), predicts that soon we will be drowning in the age of the zettabyte data deluge [2]. One of the things that excites me the most about computer science is how it is constantly pushing the boundaries. The stuff at the edge: artificial intelligence in the 1980s, for example. Glancing over at my bookshelf recently, I noticed that my AI books have gradually receded into the far bookcase of my office. They used to be right behind me so I would have ready access. Now, migration at this scale may not mean anything, but you'd have to see the books I found myself staring at in that prime spot. I now possess a couple of shelves of books on data, visualization, statistical analysis, and related material.
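The unit arithmetic here is easy to check in a few lines of Python (a quick aside for the classroom, not part of the column itself):

```python
# Storage-unit arithmetic from the paragraph above, in decimal SI units.
GB = 10**9    # gigabyte, in bytes
TB = 10**12   # terabyte
EB = 10**18   # exabyte

assert EB == 10**9 * GB   # an exabyte is a billion gigabytes
assert EB == 10**6 * TB   # ...and a million terabytes

# A 64-bit address reaches 2**64 bytes: exactly 16 binary exabytes
# (EiB), or about 18.4 decimal EB -- the "16 EB" figure above is the
# binary reading of 2**64.
print(2**64 // 2**60)        # 16
print(round(2**64 / EB, 1))  # 18.4
```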
Despite repeated warnings from mass media in the last few years about the data deluge, somehow I have caught myself sitting at the edge of yet another boundary: this idea of the increasing centrality of data or information. For the past two years I have been involved with the Center for Science of Information (more on that in a later installment; for now you can visit www.soihub.org if curious), have recently finished reading James Gleick's latest book, The Information: A History, A Theory, A Flood [3], attended a day-long workshop on Presenting Data and Information [4], and have been inspired to include information visualization in a chapter for a new introductory computer science book that includes the visualization shown below [5]. In this chart, April 1, 2010 was when the iPad was launched. The red line shows the 25-day moving average of the stock price. This data is readily available (see source cited in chart) in almost any format, making it easy for students to extract and exercise with. "One of the best ways to explore and try to understand a large dataset is with visualization," writes Nathan Yau [6]. While the chart above is based on fewer than 300 data points, it nevertheless depicts the power of gathering, processing, and visualizing a dataset in the simplest of visualizations: a time series. This is something we can easily integrate in our early computing courses. There is a small but (exponentially?) growing population of folks who have been

18 acm Inroads 2012 September Vol. 3 No. 3


throwing the term "data science" around for the last few years. Just this week, I came across yet another tweet on the subject [14].

[Figure: Ben Fry's data visualization pipeline: acquire, parse, filter, mine, represent, refine, interact; annotated with the fields each stage draws on: computer science; mathematics, statistics, and data mining; graphic design; and infovis and HCI.]
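The early stages of this pipeline can be made concrete with a toy sketch. In the following Python fragment the CSV string and the stage functions are invented for illustration; a tiny dataset is walked through acquire, parse, filter, and mine, leaving represent, refine, and interact to a plotting library:

```python
# A made-up CSV of daily closing prices; in practice "acquire"
# would read a file or fetch a URL.
CSV = """date,close
2010-03-30,10.0
2010-03-31,11.0
2010-04-01,14.0
bad,row
2010-04-02,13.0"""

def acquire():
    return CSV

def parse(text):
    # Split into (date, close) string pairs, skipping the header.
    return [line.split(",") for line in text.splitlines()[1:]]

def filter_rows(rows):
    # Drop malformed records while converting prices to floats.
    clean = []
    for date, close in rows:
        try:
            clean.append((date, float(close)))
        except ValueError:
            pass
    return clean

def mine(records):
    # The simplest mining step: summary statistics.
    prices = [p for _, p in records]
    return min(prices), max(prices), sum(prices) / len(prices)

lo, hi, mean = mine(filter_rows(parse(acquire())))
print(lo, hi, mean)  # 10.0 14.0 12.0
```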
Before this phenomenon passes us by, here is a quick crash course on data science by way of some hype:

"Data Science is a valuable rebranding of computer science and applied statistics skills." (David Smith [13])

And another tweet [7]:

"The ability to take data: to be able to understand it, to process it, to extract value from it, to visualize it, to communicate it; that's going to be a hugely important skill in the next decades." (Hal Varian [8])

Big Data, as the other term that is being thrown around, is attracting a lot of attention. A number of funding initiatives and awards are regularly being announced: NSF's BIGDATA initiative [9] and Intel's funding of a Big Data center at MIT [10] are just two recent examples. Only a few months ago, the term "data science" didn't even have an entry on Wikipedia; a definition was offered only recently [11].

I hope this sets the stage. My reason for writing this column is to bring this idea into the consciousness of those of us who teach computer science. In our community, the importance of using real-world data in our courses has long been recognized. So it is only natural now to become more aware of the rigors of this evolving science, as well as the importance of educating our students in the techniques and skills it engenders. It offers the opportunity to integrate the learning of computing, with algorithms and structures for storage of data, the means for acquiring it, and statistical analysis, combined with the modern tools for visualization and interactivity. Even though the Apple Inc. dataset above contains fewer than 300 data points, it embodies sufficient richness for beginning students to explore, become proficient in, and realize the importance of doing data science.

The process of data visualization itself was formalized by Ben Fry in his PhD thesis, Computational Information Design, where he outlined the various stages of creating a data visualization: acquiring the data, parsing it, filtering it, mining it, choosing a visual representation, refining or improving the visual representation, and finally making the visualization interactive [12]. The picture above depicts Ben Fry's perspective [12]. Still, the data visualization process relies on one's creativity and sense of aesthetics. Also, as Nathan Yau points out, there are a number of dos and don'ts that one must keep in mind [6]. These include obvious, yet important, things like checking the data that forms the basis of a visualization to make sure that it is free of errors and is valid. Moreover, all visualizations should be properly labeled: include the coding(s) used in the visual representations, label the axes, include the units, cite the proper sources, etc. Visual representations, as effective as they may be, can also create false impressions if there are flaws in the geometry of the chosen graphics.

For the more advanced students, we have the opportunity to create courses that directly address the issues surrounding Big Data. The foundations, beyond computing, lie in advanced statistics, creative visualization, human-computer interaction, etc. There are increasing numbers of real-world examples of doing data science: the PageRank algorithm, creating business value out of social networks, and applications like predicting global flu trends. Data science is indeed a rich domain, and therefore promoting expertise through our courses represents a natural evolutionary path for our curricula. In the end it isn't important whether data science will overtake computer science as a top IT skill. This trend, like others, too might pass. But the reality of the data deluge will very much persist, and as long as we as humanity keep populating those storage devices, we will probably never run out of the need for data scientists. Ir
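As a small illustration of one of these real-world examples, here is a minimal sketch of the PageRank idea: power iteration over a three-page toy web. The link structure and damping factor are invented for illustration, not drawn from any real system:

```python
def pagerank(links, d=0.85, iters=50):
    """links maps each page to the pages it links to.
    Returns rank values that sum to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Everyone gets the teleportation share, then link shares.
        new = {p: (1 - d) / n for p in pages}
        for p in pages:
            outs = links[p] or pages  # dangling pages share evenly
            for q in outs:
                new[q] += d * rank[p] / len(outs)
        rank = new
    return rank

ranks = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
# page "a" is linked to most often, so it earns the highest rank
```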
References
[1] "The Data Deluge," The Economist, February 25, 2010.
[2] Luciano Floridi, Information: A Very Short Introduction, Oxford University Press, 2010.
[3] James Gleick, The Information: A History, A Theory, A Flood, Pantheon Books, 2011.
[4] Edward Tufte, Presenting Data and Information: A One-Day Course. www.edwardtufte.com
[5] Ira Greenberg, Dianna Xu, Deepak Kumar, Processing: Creative Coding and Generative Art, FriendsOfEd, 2012, forthcoming.
[6] Nathan Yau, Visualize This: The FlowingData Guide to Design, Visualization, and Statistics, Wiley, 2011.
[7] Christian Langreiter, tweet (@chl) at 6:10 AM on September 28, 2011.
[8] Hal Varian, "On How the Web Challenges Managers," The McKinsey Quarterly, January 2009.
[9] National Science Foundation, Core Techniques and Technologies for Advancing Big Data Science & Engineering (BIGDATA), Solicitation 12-499, 2012.
[10] MIT News, "MIT, Intel unveil new initiatives addressing 'big data'," May 31, 2012.
[11] Wikipedia, "Data Science." http://en.wikipedia.org/wiki/Data_science
[12] Ben Fry, Computational Information Design (PhD thesis), Massachusetts Institute of Technology, April 2004.
[13] David Smith, Revolutions blog (http://blog.revolutionanalytics.com/2011/09/data-science-a-literature-review.html), September 2011.
[14] Tweet from CS News Update at 9:54 PM on May 21, 2012.

Deepak Kumar Computer Science Bryn Mawr College Bryn Mawr, Pennsylvania 19010 USA dkumar@brynmawr.edu

DOI: 10.1145/2339055.2339060 Copyright held by author.



IS Education
Heikki Topi

Essential Role of Practical Projects in Information Systems Courses


Is the information systems (IS) discipline by definition an applied discipline? Should the field be focused on the practice of planning, designing, implementing, and maintaining IT-based systems in organizations? Is information systems practice in the core of the discipline? These are among the questions asked by the relatively frequent introspective examinations of the identity of the IS discipline, mostly in the context of scholarly work. The same question is, however, also relevant in any discussion regarding IS education. As IS educators, we are preparing most of our students for practical work in private enterprises, the public sector, or non-governmental organizations. Only a few of the students continue on a path that will lead to a research career. Still, our perspectives vary significantly regarding the extent to which collaboration with real-world organizations and project work mimicking organizational projects should be included in the implementation of an IS curriculum at the undergraduate and graduate levels.

Some faculty members believe that it is sufficient to have a single simulated project in the program capstone and no other project work; others would like to see real-world projects featured in as many courses as possible. Those who are against frequent use of significant projects in IS courses have a number of sensible reasons with which to justify their perspective. Some are afraid that using the limited time available in a typical IS major for projects will prevent us from covering a sufficient amount of

content systematically and predictably. Others worry about the case specificity of learning associated with projects and the students' inability to abstract from the learning in projects to a higher level. Many are concerned about the intensity of the faculty resources needed for practical projects, or the potential liability associated with real-world projects conducted by students with relatively little experience. Those who start their studies from scratch are not able to produce meaningful technical deliverables much before the mid-point of their studies. Systems projects are notoriously unpredictable, and scope creep is as likely in course projects as in others. There, indeed, are plenty of reasons not to include projects in most IS courses. Some of them are valid, and others are either conscious or unconscious attempts to keep faculty workload at a reasonable level.

I believe, however, that the benefits of including significant project work outweigh the costs and that including project work with real organizations is possible in the context of most modern curricula. Why bother, given all the costs? Primarily because the learning processes associated with projects will lead to improved student learning outcomes. Practical projects will help students identify and remember connections between various phenomena and topics significantly better. For example, even simple development projects start to show the importance of data management topics early. A student who has faced data management challenges in an earlier project context is much better prepared and typically more highly motivated to study data management when it is time for it in the curriculum. In the same way, any non-trivial course project will effectively demonstrate the need for project




management. Even though this topic cannot be taught in the early-stage projects in the program, the students will be at a much better starting point for a project management course if they have learned

to understand why project management matters through their own experiences. Real-world projects have the additional advantage that they teach students about the contexts in which they will be applying their IS capabilities. The earlier this happens, the more beneficial it is for the students from the perspective of all learning in their program. Through the improved understanding of the organizational context in projects, many of the organizationally focused topics elsewhere in the program will start to make much more sense earlier. This will build conceptual structures that facilitate all learning. Discussions with advisory groups, articles in trade press, and curriculum recommendation documents systematically highlight the need for improvement in oral and written communication, ability to work effectively as a team member, analysis and articulation of system requirements in light of overall organizational

goals, understanding of tradeoffs required because of resource constraints, etc. These are capabilities that cannot be learned in a highly condensed way in a single, specialized course. Instead, students need to be frequently and consistently required to practice these skills and get feedback; this is unlikely to be possible without project work. Asking students to write reports, give presentations, conduct interviews, evaluate the financial prospects of various options, and to perform other similar tasks frequently in project contexts is not busywork but an essential component of the preparation process for typical roles of an IS professional. Projects implemented in real-world contexts have another additional benefit: they expose the students to the complexities of getting work done in a situation where the project is not the highest priority for many participants. These projects will force students to convince the organizational partners about the importance of the project and communicate effectively the justification underlying an initiative. During the corporate projects, the students will develop important connections, expand their professional networks, and learn how to continue to strengthen and expand them effectively. The benefits of real-world projects go beyond those gained by students. A

university department that is successfully connected with a number of organizations and serving them well with student projects has in these organizational partners an excellent source for feedback regarding student performance and curriculum needs. Faculty involved in course projects will be able to maintain a strong understanding of contemporary practices and the challenges organizations are facing in their work to apply IT solutions effectively. My goal in this column has been to encourage all IS programs to think systematically about the role of both real and simulated projects in their curriculum and find ways to incorporate project work in courses throughout the program. This will be beneficial for the students, the faculty, and the program and department. Yes, there will be extra work associated with the projects, but it will ultimately pay off. As always, I welcome your feedback. Ir

Heikki Topi Bentley University 175 Forest Street Waltham, Massachusetts 02452 USA HTopi@bentley.edu

DOI: 10.1145/2339055.2339061 Copyright held by author.




Classroom Vignettes
Henry M. Walker

Course Planning: The Day-by-Day Course Schedule


When I was in college, friends at various schools sometimes complained that scheduling in a course had gone badly. The instructor had planned to cover eight books, but had spent the first ten weeks on just two books; thus, the instructor was trying to squeeze the six remaining books into the last four weeks. Partially in reaction, I have prepared tentative, day-by-day class schedules for every college-level course I have taught. Although my initial motivation was to avoid problems of bad time management, I have learned that the benefits of extensive and detailed course planning extend well beyond simple matters of pacing. This column identifies benefits and motivations of developing a detailed course schedule, potential pitfalls, and some practical approaches for constructing a schedule.

Motivations and Benefits

Initially, I developed day-by-day schedules as a mechanism to monitor the pace and sequencing of material in a course. Courses must cover specified material to prepare students for subsequent courses. Often topics have varying priorities: some high-priority topics must be covered, but other topics may be optional. Also, it may be important to provide examples, but the selection of examples is open. (If an instructor is not careful, extensive coverage of numerous interesting examples on one topic can interfere with the discussion of other topics. For example, five examples of recursion in CS1 may be helpful, but fifty examples may prevent coverage of other essential topics.) Of course, a detailed schedule also helps clarify sequencing of topics. When topics

build upon each other, a tentative day-by-day schedule allows an instructor to consider what background students will know when introducing each new topic. A tentative, day-by-day schedule also can help in the selection of textbooks. For example, when I first came to Grinnell, my colleagues (particularly Charles Jepsen) would identify two to five possible textbooks for a multi-section course such as Calculus I and II; Charles would prepare possible day-by-day schedules for each book. Often, these schedules would show that some textbooks could work well within a course's time constraints, while other textbooks would not. Typically, potential daily schedules helped reduce an initial list of five books to just one or two candidates. Over the years, I have added much detail beyond pace and sequencing to my day-by-day schedules.
 A schedule can show readings for each day. (I often ask students to submit discussion questions on each reading electronically before class, so they will be prepared and I know what to highlight in class.)
 The daily schedule can show due dates for assignments, projects, or other work. In a course that meets four days a week, I may plan a mix of labs and individual programming assignments to be due every third class day. The day-by-day schedule helps space these materials evenly throughout the semester.
 If I expect to be at a conference or on a consulting trip during the semester, I may schedule a test for a day I will be away. (My colleagues have been wonderful at covering for me on such occasions.) A day-by-day schedule helps fit the test into the overall flow of a course.
 When considering due dates, I try to be sensitive to culturally-based events such as religious holidays. Planning ahead helps me avoid such dates for tests or other activities.
 Variety in a course may be encouraged with a range of planned activities. For example, several of my colleagues (e.g., Janet Davis and Jerod Weinman) specify Mondays as news days in



which students present recent news articles they have encountered. Similarly, I might schedule group activities for Tuesdays, or discussion questions for Wednesdays. Each activity can add a valuable dimension to a course, but a schedule must include time for each activity. After I have made a day-by-day schedule to organize a course, I distribute the schedule to students on the first day of class. Of course, the initial schedule is tentative, and I tell students that adjustments are likely through a semester. (Typically, an occasional topic may shift by a half day; a full-day shift is possible, but rare. More often, I may drop an optional topic to keep the overall course on schedule. Dates of assignments and tests do not change, although the content for these activities may vary.) Students regularly remark that a day-by-day class schedule helps them in planning. Students know when assignments, projects, etc. will be due, and they are able to fit course work with other commitments. Of course, distributing a tentative, day-by-day schedule at the beginning of a course also reassures students that an instructor is on top of the material and has the course well prepared.

Potential Pitfalls

Although preliminary course planning can be very helpful, each offering of a course is different: different students have different backgrounds, abilities, and interests. Some topics may require more time one semester than another; some students may ask more or deeper questions; technical elements of a lab environment may make work particularly efficient or may add unforeseen difficulties. Altogether, a tentative, day-by-day schedule aids planning, but an instructor also must be flexible. Several techniques can help an instructor adjust to variations from one course offering to the next.
 Some topics can be identified as essential or core; these must be covered.
 Some topics can be specified as time permits, optional, or for extra credit. A published schedule might or might not label these topics explicitly, but the instructor will know which topics can be abbreviated or dropped as needed.
 Some days may be labeled as catch up, questions, review, or equivalent; my colleague John Stone instituted scheduled days to Pause for Breath, and my colleagues Janet Davis, Samuel Rebelsky, and Jerod Weinman have adopted versions of this approach.

In practice, I consider a tentative, day-by-day schedule as a baseline reference, not a mandated rule. If interactions with students indicate more time is needed on a topic, I review the schedule to determine how to adjust upcoming plans or which topic to drop. If material moves faster than expected, the schedule suggests what might come next.

Some Approaches for Constructing a Schedule

In practice, development of a tentative, day-by-day schedule involves an iterative process. When I start planning a new course, I typically go through a textbook or list of topics, assigning time for each book section or topic. I consider what might be involved with each part of the material, and I estimate an amount of time. Typically, my initial schedule provides a lovely day-by-day plan that may extend to 20+ weeks, but Grinnell's semester only has 14 weeks of classes. The iterative process then examines each part of each topic and class: what can be combined, what might be omitted, could reordering make coverage more efficient, do I really need all five examples, etc.? Throughout this process, I make an adjustment and examine the schedule again. While this iterative process may be accomplished in many ways, here are several well-tested approaches.
 Write topics in a text file, with a blank line between each day's material. Use a program to read the file to yield a weekly table of days and topics. (Aside: this can be a great CS2 exercise, if you do not already have access to such a program.) After revising the material for a day, rerun the table generator to view the revised schedule. An example of such a table for a course on Algorithms and Object-Oriented Design may be found at http://www.cs.grinnell.edu/~walker/courses/207.sp12/sched.pdf (TeX format) and http://www.cs.grinnell.edu/~walker/courses/207.sp12/lab-index.shtml (html format).
 Use a spreadsheet to identify topics, with one class per row. The spreadsheet tracks successive days; counting days gives a weekly or monthly schedule; and insertion and deletion of rows allows easy adjustment of the schedule.
 Use post-it notes to record each course topic, and make a large grid on a whiteboard or poster for each week. Put the notes on the grid to obtain a tentative course outline; move the notes from one position to another for course revision. Figure 1 shows this approach, as developed by my colleague Janet Davis, for two courses, CSC 105, The Digital Age (left), and CSC 364, Networks (right). The colors of the post-it notes identify types of activities or themes (e.g., the right column for Networks uses green to indicate lab topics).
 Divide the course into modules (e.g., one- to two-week blocks), giving a high-level organization of topics and materials. Then work within each module to obtain a daily schedule. An example of the module approach may be found at http://www.cs.grinnell.edu/~walker/courses/161.sp12/semester-outline.shtml.

Figure 1: Post-it Notes for Course Planning (with permission from Janet Davis)
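A minimal version of the table-generator exercise mentioned in the first approach might look like this in Python; the file format and meeting pattern here are assumptions for illustration, not Walker's actual program:

```python
def schedule_table(text, meetings_per_week=4):
    """Turn a topics file -- one paragraph per class day, blank
    lines between days -- into week/day labeled rows."""
    days = [" ".join(block.split())
            for block in text.split("\n\n") if block.strip()]
    rows = []
    for i, topic in enumerate(days):
        week, day = divmod(i, meetings_per_week)
        rows.append(f"Week {week + 1}, day {day + 1}: {topic}")
    return rows

# A tiny made-up topics file for a course meeting four days a week.
sample = "Introduction\n\nRecursion I\n\nRecursion II\n\nLists\n\nSorting"
for row in schedule_table(sample):
    print(row)
```

Rerunning the script after editing the topics file regenerates the whole table, which is what makes the plain-text format convenient for iterative planning.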

Over the years, I have found course planning forms the foundation for my teaching. Development of a tentative, day-by-day schedule takes significant time, particularly for courses I have not taught previously, but this initial work makes a dramatic difference during a semester. I know what to focus on during the stresses of a semester, and I have reasonable confidence each course will cover appropriate material in a timely way. Ir

Acknowledgment
Many thanks to Marge Coahran for her suggestions on an early draft of this column!

Henry M. Walker
Dept of Math and Computer Science
Grinnell College
Grinnell, Iowa 50112 USA
walker@cs.grinnell.edu

DOI: 10.1145/2339055.2339062 Copyright held by author.

Computing Education Research
Raymond Lister

A Variation on Kvale's One Thousand Page Question

In his excellent book on the use of interviews in research, Kvale (1996) begins Chapter 10 by paraphrasing a question he is often asked, which he calls The One Thousand Page Question: "How shall I find a method to analyze the 1,000 pages of interview transcripts I have collected?" The rest of his chapter is a witty word-by-word deconstruction of that question. To give you a feel for Kvale's analysis, I merely need list some of his chapter subheadings: "1,000 Pages – Too Much!," "Have – Too Late!," and "How – Ask What and Why First." If you are intending to collect interview data, I recommend you read Kvale before you start interviewing. Kvale's 1,000 page question led me to my own variation, in response to something I am often asked: "I just finished teaching a course that I completely rewrote. How shall I find a research method to prove that my change improved student learning?"

Just finished?
Too Late! If it is your plan to research the effect of your educational innovation, then you need to plan your research before you start teaching. For one thing, if you are hoping to demonstrate improved student learning, then you will probably need to give the students a pre-test at the start of semester. In fact, I think you will probably need to give the students several tests throughout the semester. Those tests might also be used as part of the grading of the students.

Completely rewrote?
Too Much! I am increasingly skeptical that it is possible to demonstrate improved student learning in a course that has been completely rewritten. Over the duration of an entire semester, there are too many confounding factors. It is possible to measure some things for a completely rewritten course, such as whether the students enjoyed the course, but I doubt it is possible to demonstrate improved student learning. If improved learning is what you want to study, then I suggest you start your research by rewriting just one week of your course. Let's call that week X. You should measure the students' prior knowledge just before they encounter your new material, ideally immediately before you begin teaching your new material, but measuring their prior knowledge in week X-1 is probably acceptable and more pragmatic. Ideally, test them immediately after you finish teaching the new material, but even if you do, you should also test them in week X+1, to see how much they retained. (I hope your retention tests are more encouraging than mine have been recently.)

Prove that my change improved student learning?
What I described in the previous paragraph is known as a one-group posttest (Google it), and it is a weak experimental design that does not prove that you improved anything (Randolph, 2007, pages 141 and 164). What you also should do is test your class in week X-1, week X, and week X+1 in the semester BEFORE you make your change. You then compare students by forming pairs of students, with one student from each semester, where both students in a pair performed similarly on the week X-1 test. This approach is called a matched pairs design. While it is a fairly convincing design, it is not as convincing as a randomized, controlled design, but most of us do not have the luxury of doing that most rigorous of tests.

How?
Ask What and Why before How. Fossati and Guzdial (2011) interviewed fourteen computer science educators from three different institutions, to investigate what evidence triggers change in the practice of computer science educators. They found that change was mostly initiated on the basis of the educators' intuition and anecdotal evidence. Rewriting an entire course on the basis of intuition and anecdote is like betting a semester's worth of your salary on the roulette wheel: you might get lucky and win big, but you probably won't. Instead of playing pedagogical roulette, you should play pedagogical backgammon. But before you start playing, you should learn the pedagogical equivalent of counting cards. That is, before you begin rewriting your course, you should systematically gather evidence about the existing course, to make an informed guess as to what the learning difficulties are, and why students have them. Only then should you start thinking about the how.

As time goes on, I find myself changing my lessons less and systematically observing my students more. I am less inclined to make large, random changes. I am more inclined to make small, deliberate changes. In computing education, I think we have been looking for the silver bullet that will solve all our problems all at once. I think we need to settle down and become more patient, incremental researchers. Ir

References
[1] Fossati, D. and Guzdial, M. (2011). The use of evidence in the change making process of computer science educators. In Proceedings of the 42nd ACM Technical Symposium on Computer Science Education (SIGCSE '11). ACM, New York, NY, USA, 685-690. DOI=10.1145/1953163.1953352
[2] Kvale, S. (1996). InterViews: An Introduction to Qualitative Research Interviewing. Sage Publications.
[3] Randolph, J. (2007). Computer science education research at the crossroads: A methodological review of the computer science education research: 2000-2005. PhD dissertation, Utah State University. http://www.archive.org/details/randolph_dissertation [Accessed June 2012].

Raymond Lister Faculty of Information Technology University of Technology, Sydney Broadway NSW 2007 Australia RaymondLister@uts.edu.au

DOI: 10.1145/2339055.2339063 Copyright held by author.
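As a numerical aside, the matched-pairs comparison Lister describes can be sketched in a few lines; the scores below are invented, and a real analysis would follow the mean difference with a paired significance test:

```python
def mean_paired_difference(pairs):
    """pairs: (old_semester_score, new_semester_score) tuples for
    students matched on the week X-1 pre-test."""
    diffs = [new - old for old, new in pairs]
    return sum(diffs) / len(diffs)

# Hypothetical week X+1 scores for four matched pairs.
pairs = [(55, 62), (60, 59), (70, 78), (48, 55)]
print(mean_paired_difference(pairs))  # 5.25
```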



Distance Education
Marian Petre
What's the Value Proposition of Distance Education?

Editor's Note: Marian Petre and Mary Shaw co-authored this column.

Background
Considerable attention has been given recently to the online offerings such as edX and Coursera by prestigious universities such as MIT, Stanford, and Harvard. Those offerings are typically wrapped in words such as "expert" and "world class." But here's the rub: if a job candidate came to you and said he or she had watched an assortment of TED lectures and passed some automated tests, would you trust that the candidate was educated? Marian ran a straw poll on that question at ICSE, at a recent social gathering, and at the local pub, and the reply was resoundingly "no," except in the few cases when it was "well, that depends on what else the student does," and further criteria were suggested. We discussed this at ICSE, and Mary suggested considering the value proposition, a business notion that identifies the value the organization promises to deliver and why the customers believe they'll receive that value. It considers both the cost to the business and the value to the customer. What's the value proposition of distance education? Let's consider the elements that contribute value to a traditional on-campus undergraduate education and consider how they are handled in distance education.

Access to Information
This is essential, but widespread access to the Internet is making information increasingly available at decreasing cost. Further, it is far from sufficient. Anyone who's been in distance education for a while knows that being a library or a repository for resources (no matter how good, and no matter how sexy the initial market appeal) is not enough to establish value. Why not? Because education is not simply the transmission of information. Few people can genuinely educate themselves just through access to materials. (Consider sending a 16-year-old to the library to learn calculus from a textbook.) Most people will learn some stuff, and some of them may learn quite a lot, but few will reflect, assimilate, and organize that stuff into a coherent body of knowledge they can access and deploy effectively. Those who can have often had some earlier exposure to education.

Selection and Structuring of Significant Information
Designing a course or writing a textbook provides a map through the forest of available information. This adds the value element of judgment about which information is most important, in what order to study it, and how to consider key elements in relationship.

Interpretation and Personal Tutoring
In a first-quality university, the professor's own view of the material and one-to-one explanations tailor the presentation to each student's needs. The value now is not just a map of the information space, but a guided tour. Value is embodied in the learning dialogues that make use of those educational resources that provide scaffolding, offer clarification and refinement, and help coordinate and integrate information.

Feedback and Forcing Functions
Structured assignments with deadlines and evaluation provide discipline, progression, and personalized guidance. Multiple-choice exams are not a substitute. They can provide formative feedback for self-study, but a significant amount of teaching is embodied in the more detailed, personalized feedback to assignments provided in supported distance learning.

Liberal Education
One of the principal value elements of an on-campus education is an approach to intellectual engagement with the world. As John Henry Newman described it: "a habit of mind is formed which lasts through life, of which the attributes are freedom, equitableness, calmness, moderation, and wisdom; of what... I have ventured to call the philosophical habit of mind." That engagement is achieved through study, reflection, and discourse (i.e., intellectual dialogues that influence our view of things). Discourse nurtures sense-making, critical depth, and intellectual skill. One might argue that having those dialogues with engaged minds over time, overseen and shaped by a true teacher, is the difference between learning some stuff and an education.

Network of Professors, Peers, and Opportunities
Some students (and many distance students) prefer to learn alone; however,

featured columns

many want to engage not just with a body of knowledge but also with a community of like-minded people, others who share a passion for the subject, who care about learning, who want to engage in discourse. For many, the community persists beyond the educational experience, giving them an ongoing social network and valuable professional contacts.

tember through November) and breaking the bonds of geography. Putting material online can liberate the resource from the constraints of an academic timetable, both for the academic who creates the material, and for the student who uses it. (Its a further challenge to liberate the assessment from the constraints of an academic

Credentials
One of the principal value elements for a first-quality university is the credential issued by the university: the diploma, certificate, or transcript. A university degree is supposed to mean something, to signify competence and knowledge. Being there is not enough; the educational system is meant to corroborate that a standard has been achieved, providing others with a basis for confidence or trust that the certified individual actually possesses said competence and knowledge. Thats the hierarchy of values as seen by a traditional student. How are they addressed in distance education?

Value Elements
Perhaps the most sensitive value element for distance education is the credential. The value of the credential is intimately linked to the reputation of the institution. It therefore behooves the institution to ensure its standards are maintained. This is crucial for on-demand offerings, especially if there are no entry requirements, and if students are attracted from all over the world. As important a value proposition as the credential is, it is not yet clear how free distance offerings will address it effectively and sustainably, although a variety of proposals about automated testing and peer assessment are in circulation (and were already seeing reports on how easy it is to cheat on on-line, multiple-choice exams). There are also interesting questions about whether the credential can be separated effectively from provision, so that one organisation assesses learning based on anothers materials. Two areas in which distance education offers value over campus-based education are breaking the bonds of synchronization (class will be offered from 10:00 to 11:30 on Tuesdays and Thursdays from Sep-

As important a value proposition as the credential is, it is not yet clear how free distance offerings will address it effectively and sustainably, although a variety of proposals about automated testing and peer assessment are in circulation (and were already seeing reports on how easy it is to cheat on online, multiple-choice exams).
timetable.) With sufficient investment, new material that addresses emerging topics and issues can be produced promptly and with authority, increasing the responsiveness to a changing world and discipline. These reductions in synchronization and geographical constraints do not, however, reduce the need for good teachers attuned to students. On the contrary, they reinforce the need for excellent teachers who are able select and structure information in order to support learning at a distance, often without all the backups and contextual support available on a campus.

So whats the value proposition as viewed by those aforementioned prestigious universities that are making substantial and highly publicized online offerings? They are unlikely to withdraw their own bricks-and-mortar education and degrees, to replace them, to compete with them, or even to supplement them significantly. For those universities, the online courses are not core business, but something else: prestige enhancement, global advertising, assertion of expertise. The BBC News caught it in their headline: Top US universities put their reputations online (20 June 2012). The value of these universities face-to-face degrees is not threatened, but asserted and potentially reinforced, by putting their lectures online after all, students dont tend to enroll in those institutions for their lectures. They expand their reach without necessarily diverting their core value proposition, which is still based in bricks-and-mortar, where the dialogues, relationships, direct experiences, and crucial social networks are located. Its also based in part in selectivity. (Note that the outcome for these online programmes is currently not an MIT, Stanford, or Harvard degree and in some cases is a certificate rather than any university credit or a degree.) So what are the value elements for the institution offering distance education? Thezre are several, including the ones highlighted by the recent high-profile offerings. Some are to do with institutional business: Visibility, Profile, Marketing: Putting our wares on display can be a means of attracting students, investment, reputation. Scale and Reach: Communications technology potentially makes education available globally (modulo language and cultural differences), at a scale unimaginable for a campus-based university. Excluding the Competition: The funding model of the high-profile free offerings is not likely to be affordable by many institutions, and people tend to opt for prestige. 
Learning Analytics: Distance education platforms are potentially a huge educational observatory (or laboratory). They collect student data, providing an unparalleled lens onto student interaction with materials.

2012 September Vol. 3 No. 3 acm Inroads 27

fea t u re d c o l u m n s
Whats the Value Proposition of Distance Education?
Teaching as a Means of Recruitment: Knowledge about students that can be exploited, linking online learning networks to employer networks but this assumes that employers are convinced that the online offerings are either able to identify employable students or are able to enhance employability. (Note that distance education, although it tries to exploit technologically-based social networks, struggles to compete with the sorts of embedded, influential, social networks that are one of the key value propositions of prestigious traditional universities.) Some are to do with educational resources: On-demand Access to High-quality Material: Ideally, this material is focused and shaped for independent study. One of Courseras selling points is that material is chunked into ten-minute segments, in recognition of likely consumption patterns for those fitting learning around jobs, travel, family commitments. One of the old, old lessons about distance education is that its simply not good enough just to put your recorded lectures online, but that distance education demands material oriented to the medium and mode of study. Market Testing: Online offerings can be an effective means of road testing different course ideas, in terms of both digestibility and market appeal. Some are to do with education proper, with developing informed, critical minds: Dialogues: Established distance education institutions tend to find that providing effective dialogues is a challenge and requires significant investment. Crowdsourcing can sometimes work, but it isnt reliable without the scaffolding, monitoring, and moderation provided by skilled teachers. Community: Distance education can provide a community unlimited by geography although the challenge is to shape a community that is relevant and personal to its members. The scale, diversity and reach that can empower can also provide profound challenges, not least establishing appropriate cultural and behavioural expectations. 
They explained the OU mission, distance education that exploited computing and communications technologies, supported open learning, 250,000 students. He responded immediately: Well, Ive never heard of you, but in ten years time, thats what well all be doing. The bottom line is that distance education is now intimately entwined with online education. Online technologies open great opportunities for (notionally global) dissemination of information. The internet levels the field, giving access to a wealth of resources. Textbooks, recorded lectures, and online course notes have low incremental dissemination cost, which is a principal enabler of the free and low-cost programs. However, online technologies have not yet gained much leverage on personalization. Tutoring and feedback and the dialogues that nurture critical thinking are individual and hence laborintensive, so dont expect to see them in mass-market, low-cost programs. The network of personal contacts arises largely from informal activities and is trickier to establish at a distance; relations are deeper and more personal than most socialnetwork connections. Credentials and certification continue to present a challenge. Information access may be relatively cheap in this milieu, but high-quality education still requires substantial investment. If, as Sussman predicted, well all be doing it, then wed better strive for education, not just learning some stuff. Ir
Acknowledgements Many thanks to everyone who discussed this with us or contributed thoughts, including: Dennis Frailey, Neil Smith, David Bowers, Mike Richards, Blaine Price, and Hugh Robinson.

Conclusion
This discussion recalls a conversation that Marian and a colleague had with Gerald Sussman at MIT in 1994, when the OU was interested in the way MIT used Scheme in introductory programming. When they introduced themselves, he asked: Whats this Open University? And what does distance education mean?

NSF Digital Library

NSDL

Marian Petre The Open University Milton Keynes MK7 6BJ United Kingdom m.petre@open.ac.uk

www.nsdl.org
28 acm Inroads 2012 September Vol. 3 No. 3

Mary Shaw School of Computer Science, Carnegie Mellon University 5000 Forbes Avenue, Pittsburgh Pennsylvania 15213-3891 USA Mary.Shaw@cs.cmu.edu

DOI: 10.1145/2339055.2339064 Copyright held by author.

Percolations
Lisa C. Kaczmarczyk

An Alternate View of Science

"Science: 1a. The observation, identification, description, experimental investigation, and theoretical explanation of phenomena. 1c. Such activities applied to an object of inquiry or study. 4. Knowledge, especially that gained through experience." (American Heritage Dictionary)

"Technology is for people, but science is for science." (Writer, LinkedIn computing education forum)

Science is not for people. With beliefs like that, no wonder we have problems recruiting and retaining students in computing curricula. No wonder we face an uphill battle to attract women and other underrepresented groups. If science is not for people, then what is science for? For itself? Apparently, because the phrase "science is for science" is a self-referencing statement. Does that make any sense, really? Ask yourself: Why do we engage in scientific inquiry? So that we can do things with the results, right? Who is "we"? We is Us; we are people. Until such time as an extraterrestrial species cross-breeds with earthlings, "we" is unambiguously human. Why do humans engage in science? Good things or bad things, the motivation for conducting science almost always has to do with doing something. What is the object of that doing? People, society, the environment, culture. There isn't anything else, is there? Anything that we can wrap our heads around is related in some way to us. Can you think of some motivation for science that has nothing to do with our conception of reality?

Let's get back to the claim that we strive to obtain knowledge for the sake of knowledge. On the surface it sounds like a plausible motivation. The line of thinking goes as follows: We should strive to gain all knowledge possible just because we can. It is not the business of a scientist to take into consideration what is going to be done with what s/he discovers. A good scientist makes the discoveries, and then it is up to a larger body of people (someone else) to decide what to do with it. Purity of the scientific process dictates that decisions and methods not be contaminated with tangential future possibilities. NMP. Not My Problem. That is what the "pure scientist" would say.

In case you don't quite see the enormity of the sentence the way I do, let's talk about this for a moment. Wedging my way into the head of the LinkedIn writer as best I can, I speculate he intends to say: "science is for science" implies knowledge for knowledge's sake. A nice idea, if you live in a global vacuum divorced from realities of the human condition. Gone are the days when we can plausibly pretend that groundbreaking discoveries can be controlled by any government or corporation. You would think that in the 21st Century, in this world of constant conflict and global terrorism, scientists, especially scientist educators, would know better than to stick their heads in the sand. Haven't we learned? Why does anyone still believe it is ok to turn our back on the ethical and moral implications of scientific investigation?

If you hold fast to the misguided ideal that somehow scientists can do their work in a purist vacuum, you haven't faced the logical fallacy. It makes no logical sense whatsoever to say that science is for science and not for people. The premise is fatally flawed. Why? Ask yourself: Do I enjoy working on scientific problems? If you said "yes, working on science makes me feel good," then you have just acknowledged that science is for you. YOU! If you do not enjoy working on science, then there must be some other motivation behind your engagement with it. Unless you are an extraterrestrial in disguise, you are a person. Conclusion: Science is for people.

Why should our students care about science? As educators, we always come back to this all-important question. We want our students to care; we work endlessly to engage them in the excitement of science. But how can they get passionate about science if science appears to exist in a vacuum? We know they do not: how many times have we heard that students don't see the point in their coursework? In Computer SCIENCE? We know students care about the world around them; we need to give them reasons to care about science in relation to their world.

What do you do? Because there are people out there who still promulgate our now discredited sentence, you must percolate in a creative, productive way. You must find a way to turn the claim "Technology is for people and science is for science" on its head for your students. You tackle it head on in your classroom. Debunk a discredited notion while simultaneously integrating ethics, society, and computing. Become really creative and develop a challenge for your students: have them develop a program around the idea of developing appropriate structures related to our challenge phrase. For example: What in the realm of introductory programming is notoriously self-referencing? Recursion! What in the realm of data structures is notoriously circular? Circular linked lists! Make them think about this phrase in all its mind-bending perspectives and then code up a visually engaging representation using recursion and/or a circular linked list. Don't stop there; got a better idea for an implementation? Go ahead. I bet you could publish a conference paper based on the results of your pedagogical innovation. Your students will never forget you, and they will never forget to think twice about the inter-relationship between science and people. Ir
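If you want a starting point for the circular-linked-list version of that assignment, here is a minimal sketch. This is a hypothetical example of my own in Python; the names `Node`, `make_circle`, and `walk` are illustrative assumptions, not from the column:

```python
class Node:
    """A node in a circular singly linked list."""
    def __init__(self, value):
        self.value = value
        self.next = self  # points at itself until linked into a circle


def make_circle(values):
    """Link the values into a circular list and return the head node."""
    head = Node(values[0])
    tail = head
    for v in values[1:]:
        node = Node(v)
        tail.next = node
        tail = node
    tail.next = head  # close the circle: the structure refers back to itself
    return head


def walk(head, steps):
    """Walk the circle for a fixed number of steps, collecting values.

    Left unbounded, the traversal would never end on its own, which is
    exactly the self-referencing property students are asked to visualize.
    """
    out, node = [], head
    for _ in range(steps):
        out.append(node.value)
        node = node.next
    return out
```

Walking `make_circle(["science", "is", "for"])` simply cycles through the same three words forever; like the claim "science is for science," the structure refers only back to itself.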

Lisa C. Kaczmarczyk Consultant San Diego, California 92130 USA lisak@acm.org

DOI: 10.1145/2339055.2339065 Copyright held by author.


Community College Corner
Elizabeth K. Hawthorne

Communicating and Collaborating with Colleagues Online

One of the observations identified in the joint ACM/NSF Report of Findings from the Strategic Summit on the Computing Education Challenges at Community Colleges [1] reads as follows: "Computing departments at community colleges are distinguished by small numbers of full-time faculty, highly specialized curricula, over-reliance on the specialized expertise of part-time faculty, low enrollments in advanced courses, and continuous program and course revision."

Even this single observation, albeit one of several points made in the Report detailing the challenges confronting two-year college computing faculty, is sufficient justification for the development and facilitation of effective communications avenues among communities of community college faculty members. It is, therefore, compelling to take note of two such vehicles currently in place: (1) the ACM Computing Education in Community Colleges group forum available via the Ensemble Computing Portal (www.computingportal.org/acmcec); and (2) the ACM Committee for Computing Education in Community Colleges (CCECC) Facebook page (www.facebook.com/ACMccecc). Each of these resources provides opportunities for two-year college computing faculty to connect with their colleagues over matters of shared interest. Both are readily available via the Online Communities tab on the CCECC website at www.capspace.org [2].

In particular, the Ensemble Computing Portal is a relatively new NSDL Pathway [3] sponsored by the National Science Foundation intending to establish a national, distributed digital library for computing education, whose founders identify as a goal: "a distributed portal providing access to a broad range of existing educational resources for computing while preserving the collections and their associated curation processes. We want to encourage contribution, use, reuse, review and evaluation of educational materials at multiple levels of granularity and we seek to support the full range of computing education communities including computer science, computer engineering, software engineering, information science, information systems and information technology as well as other areas often called computing + X or X informatics." In similar fashion, the ACM CCECC Facebook page (ACMccecc) provides a venue for the exchange of ideas and commentary by anyone with a Facebook account.

The ACM CCECC is pleased to report that from our homepage one can now seamlessly enter both Facebook at the ACMccecc presence and the Ensemble portal at the ACM Computing Education in Community Colleges group forum; simply click on the Online Communities tab at www.capspace.org and select the desired environment. The various topic areas found in both the Facebook CCECC setting and the Ensemble community college forum are facilitated by Prof. Becky Grasser, an Associate Member of the ACM CCECC and Department Chair and Professor of IT & Computer Science at Lakeland Community College. Anyone familiar with Dr. Grasser knows that she is very active in her field, promotes the diverse and free exchange of ideas, and is always ready with an engaging quip!

Anyone interested in receiving correspondence and news alerts from the ACM CCECC on its initiatives, projects, and advocacy efforts is strongly encouraged to become a CCECC Affiliate; simply supply your contact information at capspace.org/affiliate. By this quick action you can ensure that you remain familiar with ACM's work in the domain of community college computing education. Of course, it cannot be overlooked that there are many additional opportunities to exchange ideas with colleagues in this manner; the avenues noted above are but a starting point. The most important idea I want to underscore is that as a computing faculty member in a community college, your colleagues are but a click or two away, eager to exchange ideas with you, share curricular resources and pedagogical ideas, and collaborate on efforts to improve computing education for today's two-year college students. Ir

References
[1] Hawthorne, E. K., Campbell, R. D., Klee, K. J., and Wright, A. M. (2011). Digitally Enhancing America's Community Colleges: Strategic Opportunities for Computing Education. Retrieved from www.capspace.org/SummitReport.
[2] ACM Committee for Computing Education in Community Colleges. (2009). CAPSpace - Curriculum, Assessment and Pedagogy: www.capspace.org.
[3] The National Science Digital Library. (2008). Ensemble Computing Pathway: www.computingportal.org.

Elizabeth K. Hawthorne, Computer Science Department, Union County College, Cranford, New Jersey 07016 USA; Hawthorne@ucc.edu

DOI: 10.1145/2339055.2339066 Copyright held by author.

Math CountS
Peter B. Henderson

Alan Turing: Mathematician/Computer Scientist?

[Photo: Alan Mathison Turing]

This issue's column is a tribute to Alan Turing in the year of the 100th anniversary of his birth. I have not read any of his works or books about his life, and basically am familiar with an overview of his life, including his theoretical contributions to the study and limitations of algorithms, contributions to the war effort breaking the Enigma code, the tragic personal aspects leading to his death in 1954, and the recent posthumous apology by the British government. Accordingly, this column will not be a historical perspective, but rather some thoughts based upon what I have gleaned from his work. I will strive to keep this column short and focused.

If we could interview Alan Turing today, I think he would be both gratified and a bit horrified by the evolution of the discipline he pioneered. Ask any computer scientist to name one of the pioneers of our discipline and Alan Turing would probably be identified; his name appears frequently, as in the ACM Turing Award, the Turing Machine, and the Turing Test. Was Turing a computer scientist, a mathematician, or both? In his time, he was considered to be a mathematician, but then the term "computer science" did not evolve until after his death.[1] The foundations of our discipline are rooted in mathematics, as many of the pioneers of computing were mathematicians. Today, mathematics is still important for the science of computing, for exploration and innovation, but it is not used effectively in everyday software development to ensure quality. If we could have a conversation with Alan Turing today, I think the latter would be a bit unsettling for him.

For such a conversation to be meaningful, Turing would have much catching up to do. Of particular relevance would be the area of mathematics, specifically discrete mathematics and logic, and their connections to computing. He studied the limitations of algorithms, but I wonder if he had any thoughts regarding the correctness of algorithms? Of course, this would require precise ways to describe the behavior of an algorithm along with tools for arguing their correctness. Here, I intentionally used the word "argue" instead of "prove" since I believe the former is pedagogically more palatable for students.

It is interesting that with some effort students can understand the theoretical limits of algorithms, for example the halting problem, yet struggle with basic arguments of correctness. One reason is that students are not adequately prepared in basic logic and proof techniques. Another is that computer science educators are generally not well versed in this area, or if they are, either don't know how to convey these concepts or don't feel they are that important. But then again, most undergraduate math majors are not proficient with proofs upon graduation, so we shouldn't expect our majors to be able to prove the correctness of their programs. However, they should understand and be able to apply general good-practice concepts such as pre- and postconditions; iteration, data, and class invariants; and design by contract. This is also connected with the complex nature of software systems, something I believe Turing would not have predicted, and even he would have difficulty grasping.

So, if Turing were presented with the issue of demonstrating the correctness of a software system, how might he have proceeded? Perhaps his work on decoding Enigma provides some insight: build a machine (software system) which can demonstrate the correctness of a software system. This would lead to another interesting theoretical question for Turing: knowing that the system terminates for all input, is this computationally possible and/or even feasible? The feasibility issue might have led Turing to the P vs. NP problem. If building such a machine (software system) is computationally possible and feasible, then why aren't all software developers using them today? Basically, because the problem is mathematically difficult! Here we come back to Turing again, who would thrive on this mathematical challenge, but would have soon found the task daunting, as have other researchers. Just designing a comprehensive software specification language is hard, let alone the backbone of a theorem prover, which takes as input the software system specification and the system itself, using some universal software system language, and results in YES or NO. Of course, such a machine (software system) is not very user friendly; it reminds me of programming with punched cards, when one often got a "compiled" or "did not compile" result.

The next insight Turing might have had comes from his famous Turing Test. Maybe this checking software system (dropping the word "machine" now) could somehow communicate with the software developer as if it were human. It would provide guidance to the developer by pointing out inconsistencies and perhaps even provide suggestions for corrections; that is, a software developer's assistant. The human aspect of such an assistant is a long way off; however, simplified versions of such tools are evolving, and I envision they will be widely used by the developers of critical software systems within the next few years. Hopefully, computer science and software engineering education will be able to prepare software practitioners for this brave new world.

Would Turing think this is the end of the journey? Probably not! I recall one of my introductory course lectures where I would illustrate the ultimate software development tool by picking up a microphone and describing, in natural language, the functionality of the software system to be developed. An unattainable dream perhaps, but then nobody knows what the future might bring, except for Alan Turing, whose vision, insight, and inspiration still guide us. Ir

[1] The first use of the term "computer science" is attributed to Louis Fein in 1959. However, the best known early use of the term is often attributed to George Forsythe in 1961.
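The contract concepts mentioned above (preconditions, postconditions, loop invariants) can be illustrated with a small sketch. This is a hypothetical example of my own, in Python with runtime assertions rather than a formal specification language; the function `isqrt` and its contract are illustrative, not anything from the column:

```python
def isqrt(n):
    """Integer square root by binary search.

    Precondition:  n >= 0
    Postcondition: r * r <= n < (r + 1) * (r + 1)
    """
    assert n >= 0, "precondition violated: n must be non-negative"
    lo, hi = 0, n + 1
    # Loop invariant: lo*lo <= n < hi*hi, and lo < hi
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if mid * mid <= n:
            lo = mid  # invariant preserved: lo*lo <= n
        else:
            hi = mid  # invariant preserved: n < hi*hi
    r = lo
    assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
    return r
```

A checking "machine" of the kind imagined above would try to establish such a contract statically, for all inputs; the assertions here only check it for each individual run.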
Peter B. Henderson Emeritus, Department of Computer Science & Software Engineering Butler University Indianapolis, Indiana 46208 USA phenders@butler.edu

DOI: 10.1145/2339055.2339067 Copyright held by author.


fea t u re d c o l u m n s

Colorful Challenges
David Ginat

Rectangle Cover
The current issue involves an optimization challenge with simple geometric entities. The selection of a suitable solution should be handled carefully. We would like to obtain the simplest and most efficient solution, but not at the expense of correctness. Following the presentation of the new challenge, we devise, through a series of observations, a graph algorithm for solving the first of the two challenges presented in the previous issue. A key element in the presented solution is the utilization of auxiliary elements, which considerably simplify the computation. The devised solution exemplifies the assets of using auxiliary elements in algorithm design. Due to lack of space, the solution of the second challenge of the previous issue will be presented in the next issue.

New Challenge

Rectangle Cover
Given a rectangle of integer width and length, devise an algorithm to compute the minimal number of integer-side squares needed to cover the given rectangle. For example, for the 7×10 rectangle, the output will be 7. One possible cover with 7 squares is: two 5×5 squares (which cover a strip of 5×10) and five additional 2×2 squares (which cover a strip of 2×10). Another cover with 7 squares is: one 7×7 square, three additional 3×3 squares, and three additional 1×1 squares.

Previous Challenge Solution

Edge Chains
Given an undirected graph G, we define a chain as a path in G in which no edge appears more than once (a node may appear more than once). Devise an algorithm that outputs the edges of G in chains, such that the number of chains is minimal. (Note that if we do not seek a minimum number of chains, then we may just output each edge as a single chain.)

Solution
The challenge may be viewed as a task in which one needs to output paths of edges (which may include cycles) in the given graph G. One way to approach the task is to start walking through G, from some node, as long as possible, until a dead end is reached, i.e., a point from which it is impossible to proceed with a new edge. The used edges will be removed from G; a new starting point will be chosen, and a new path through G will be visited. This will be repeated until all of G's edges are used.

A key element in this scheme is a suitable selection of the starting points, which yields a minimal number of paths. How should we select these starting points? If we have some leaf nodes in G (i.e., nodes of degree 1), then it is obvious to start from them. But perhaps we do not have such nodes. And even if we do, is this observation sufficient for obtaining our goal? Not quite. There may be different cases, of many leaves, a few leaves, or no leaves.

We may generalize the latter observation. If we start a path from an odd-degree node, then we will not end the path in that node, since in any future visit of this node, we will have an outgoing edge not used so far. Moreover, a path that starts in an odd-degree node will end in another odd-degree node. In addition, the total number of odd-degree nodes in G must be even, as the total sum of the node degrees in G is even (each edge contributes 2 to this sum).

What about even-degree nodes? We may have an even number or an odd number of these nodes. Will it be beneficial to start paths from these nodes? In order to answer this question, we may recall some familiar algorithmic scheme. A relevant familiar scheme is that of

34 acm Inroads 2012 September Vol. 3 No. 3

featured columns

computing an Euler cycle. In a graph in which all nodes are of an even degree, one can perform an Euler cycle, which may start anywhere, go through all the edges, and end at the starting point. If the graph has two odd-degree nodes, then an Euler path will start in one of the odd-degree nodes and end in the other.

The latter characteristic may hint to us that perhaps it is sufficient to start only in odd-degree nodes in order to go through all of G's edges. So, we may repeatedly seek the odd-degree nodes as starting points for the visited paths. We also have to make sure that each path is as long as possible. How can we perform this elegantly? Again, we may turn to the algorithm for computing an Euler path. If our original graph G is suitable for an Euler path, then we are done (with one output chain). Otherwise, we may transform our original graph into one that suits an Euler path, with an auxiliary construction. We may connect every two odd-degree nodes with an artificial, auxiliary edge, apart from two such nodes. Then, we may perform a single Euler path, which includes the artificial edges. (The path will be between the two remaining odd-degree nodes.) After obtaining this path, we will remove the auxiliary edges. The result will be our desired output chains, of minimal number.

In retrospect, we may notice several problem-solving notions. First, the solution process started with an initial observation, which was gradually generalized and refined for the task at hand. Second, it was beneficial to relate the given task to a familiar scheme (Euler path) and capitalize on that scheme. And third, the notion of auxiliary elements (in our case, artificial edges) played a key role in devising the final elegant solution. Ir
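The degree-parity argument above determines the minimal number of chains directly: for a connected graph, it is max(1, odd/2), where odd is the number of odd-degree nodes. The following sketch (my own illustration, not from the column; the class and method names are assumptions, and it presumes a connected graph with at least one edge) computes that count:

```java
// A quick sketch of the key fact used above: for a connected graph,
// the minimal number of chains equals max(1, oddCount/2).
public class ChainCount {
    static int minChains(int n, int[][] edges) {
        int[] deg = new int[n];
        for (int[] e : edges) { deg[e[0]]++; deg[e[1]]++; }
        int odd = 0;
        for (int d : deg) if (d % 2 == 1) odd++;
        // odd is always even; each chain consumes two odd-degree endpoints
        return Math.max(1, odd / 2);
    }
    public static void main(String[] args) {
        // Triangle: all degrees even, so a single chain (an Euler cycle)
        System.out.println(minChains(3, new int[][]{{0,1},{1,2},{2,0}}));  // 1
        // Star K1,3: four odd-degree nodes, so two chains (e.g., 1-0-2 and 3-0)
        System.out.println(minChains(4, new int[][]{{0,1},{0,2},{0,3}}));  // 2
    }
}
```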


David Ginat CS Group Science Education Department Tel-Aviv University Tel-Aviv, 69978, Israel Ginat@post.tau.ac.il

DOI: 10.1145/2339055.2339068 Copyright held by author.


Bits & Bytes


Demonstrating Random and Parallel Algorithms with Spin
Mordechai (Moti) Ben-Ari and Fatima Kaloti-Hallak
Recent versions of the Spin model checker support search diversity, where the depth-first search of the state space of a computation can be randomized. We show how to use this feature to demonstrate the efficacy of random and parallel algorithms when applied to SAT solving: finding a satisfying interpretation of a formula of propositional logic.

1  Introduction

In [2], the first author presented a survey of model checking, showing that it can be effective for teaching concurrency and nondeterminism. In this article, we extend this claim and demonstrate how the Spin model checker can easily be used to demonstrate the efficacy of random and parallel algorithms. We suggest that the reader review [2], in particular Section 8, before continuing.

The example used is that of a naive SAT solver: a program that searches for a satisfying interpretation for a formula of propositional logic. The SAT problem is central in theoretical computer science because it was the first problem to be proved NP-complete. The SAT problem is also of significant practical importance because many applications can be easily encoded in propositional logic. Although the SAT problem is NP-complete, formulas arising in practice can be efficiently solved using a variety of sophisticated algorithmic and programming optimizations. See [5] for an introduction to SAT solvers.

A model checker such as Spin [1] takes as input a nondeterministic finite automaton that models a nondeterministic (or concurrent) computation and a correctness specification written as assertions (or as a formula in temporal logic). It searches the state space of the execution of the automaton for a counterexample: a state that falsifies an assertion. The search is performed depth-first; since the automaton is nondeterministic, at each node there are several outgoing branches. Normally, the search tries these in a fixed order, but recent versions of Spin support trying the branches in random order. As you will see, the results can be surprising.

2  Implementing a Naive SAT Solver in Spin

The input to a SAT solver is a formula in conjunctive normal form: a conjunction of disjunctions called clauses. For example:

   (a ∨ b) ∧ (¬a ∨ ¬b) ∧ (a ∨ ¬c) ∧ (¬a ∨ c) ∧ (c ∨ ¬d) ∧ (¬c ∨ d) ∧ (b ∨ ¬d) ∧ (¬b ∨ d)

It is straightforward to implement a naive SAT solver for this set of clauses in Promela, the modeling language of Spin, using nondeterministic if-statements to choose an assignment:

   active proctype sat() {
     bool a, b, c, d, result;
     /* Select an assignment nondeterministically */
     if :: a = true :: a = false fi;
     if :: b = true :: b = false fi;
     if :: c = true :: c = false fi;
     if :: d = true :: d = false fi;
     /* Compute the truth value of the set of clauses */
     result = (a || b) && (!a || !b) && (a || !c) && (!a || c) &&
              (c || !d) && (!c || d) && (b || !d) && (!b || d);
     printf("Result = %d\n", result);
     assert(!result)
   }

When a verification of this program is performed in Spin, the model checker performs a depth-first search, checking all possible assignments of truth values to the atomic propositions. If a satisfying interpretation is found, the assertion is violated (result is true, so !result is false) and the verification terminates. Otherwise, the model checker backtracks to continue the search for a counterexample. Since this set of clauses is unsatisfiable, a verification will


terminate successfully after searching the entire state space. See Sidebar 1, which summarizes the relevant concepts.

Sidebar 1

Efficient nondeterministic algorithm (generate and check)
Generate an assignment: a=false, b=true, c=true, d=false. Evaluate the formula (this takes very little time) and check if result is true:

   result = (false ∨ true) ∧ ... ∧ (¬true ∨ false)
          = true ∧ ... ∧ false
          = false

By definition of a nondeterministic algorithm: if there exists a computation where result is true, then the formula is satisfiable.

Inefficient deterministic implementation (brute-force search)
For all of the 2^n assignments of true or false to the n atomic propositions, compute result. Use a fixed ordering (such as lexicographic ordering) to ensure that you check all possible assignments. If the formula is satisfiable, a satisfying assignment will eventually be found, but in the worst case the formula must be evaluated for all 2^n assignments.

3  Tseitin Clauses

For this small set of clauses, the verification terminates immediately, but the set is a member of a family of sets of clauses called Tseitin clauses, which are built from arbitrarily large connected undirected graphs. The details of the construction are beyond the scope of this note; the interested reader is referred to [3, Section 4.5]. The above set of clauses was generated from the complete bipartite graph K2,2. A program was written to generate Promela programs for Tseitin clauses corresponding to Kn,n for any n, and to generate variants of the programs, where a random literal is complemented in each variant, thereby making the set of clauses satisfiable.

For the unsatisfiable sets of clauses, the search must try all possible assignments of true or false to the atomic propositions, and this demonstrates the exponential nature of the naive algorithm for SAT. The verification of the programs for the clauses associated with K2,2, K3,3 and K4,4 terminated immediately, whereas for the clauses associated with K5,5 the verification took 48 seconds. An attempt to verify the clauses associated with K6,6 terminated for lack of memory.¹

¹ All times are rounded to the nearest second. We used a garden-variety Windows PC like the one a student might use: a low-end Intel i3 processor with two cores and 4 GB of memory. If you use more powerful computers, simply scale up the problem to use Kn,n for larger n.

4  SAT Solving with Search Diversity

Holzmann, Joshi and Groce [4] showed that time, not memory, is currently the limiting factor in the application of the Spin model checker to large problems. They suggest using search diversity and parallelism: running many verifications in parallel on a multicore machine, where each run differs in the way the search is conducted. As noted in [4], one need not actually run several verifications in parallel to become convinced that search diversity will work. It is sufficient to run a number of verifications equal to the number of processors that are assumed to be available and to examine the execution times.

We carried out an experiment with search diversity applied to the naive SAT solver. We configured the verifier to search the transitions from each state randomly and ran each verification with several seeds. The unsatisfiable Tseitin clauses give upper bounds on the resources needed, while the satisfiable variants are intended to represent practical problems that have a solution that we are looking for. Eight variants and eight verifications (with different seeds) for each variant were run for the Tseitin clauses corresponding to K5,5 and K6,6.

As noted above, the unsatisfiable Tseitin clauses for K5,5 took 48 seconds to verify. Out of the eight satisfiable variants, seven were verified in just a few seconds, but one variant showed more interesting behavior. The verification of this variant using the non-random algorithm took 37 seconds, while the random verifications finished in the following times: 11, 6, 6, 39, 35, 2, 12, 39. Suppose that you have a computer with four cores running the random algorithm with four different seeds. At worst, the running times would be 39, 35, 12, 39, so the program would output an answer in only 12 seconds, a third of the time of the non-random sequential algorithm. With a bit of luck, the answer would be received in 6 or even 2 seconds. With eight cores, of course, the answer would be received in 2 seconds. While these improvements might not seem impressive, a real problem could see its running time reduced from 37 hours (you would probably cancel the run before receiving the result ...) to a reasonable 2 hours!

The clauses for K6,6 demonstrate more realistic scenarios. Recall that the unsatisfiable clauses could not be verified, so we are not surprised that the results for the satisfiable variants are not uniformly positive. Nevertheless, they are encouraging in some cases, as shown in Table 1.

Table 1: Time for each of the eight variants under eight random seeds (∞ = out of memory)

The verifications of variants 3, 5 and 7 remain infeasible, but those for variants 2, 4, 6, 8 would terminate on a four-core processor because any choice of four out of the eight seeds includes at


least one that terminates in at most 21 seconds. The verification of variant 1 would succeed on an eight-core processor because for one seed out of the eight, the verification terminates immediately. Sidebar 2 expresses random search and parallel search diversity in terms of throwing dice.
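The four-core arithmetic in Section 4 amounts to taking the minimum over whichever seeds are run at once; a tiny sketch (my own illustration, using the seed times reported in the text):

```java
import java.util.Arrays;

// Parallel search diversity in miniature: with several seeds running at
// once, the answer arrives at the MINIMUM of their running times.
public class SwarmTimes {
    static int parallelTime(int[] seedTimes) {
        return Arrays.stream(seedTimes).min().getAsInt();
    }
    public static void main(String[] args) {
        // Worst four-seed choice named in the text: 39, 35, 12, 39 seconds
        System.out.println(parallelTime(new int[]{ 39, 35, 12, 39 }));          // 12
        // With all eight seeds (eight cores), the best seed wins
        System.out.println(parallelTime(new int[]{ 11, 6, 6, 39, 35, 2, 12, 39 }));  // 2
    }
}
```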

Sidebar 2

Random search diversity
Compute result for all the 2^n assignments taken in some random order. If you get lucky, the random search will find a satisfying assignment faster than a search in a fixed ordering! For example, if the fifth assignment out of six is satisfying, then the fixed-order search will perform 5 steps to find it, but if you throw a die to determine where to start the search, you might get lucky and start from 4 or even 5.

Parallel random search diversity
On a multicore computer, compute result in parallel for different random orders of the 2^n assignments. This increases your chance of getting lucky quickly. For example, if you throw four dice at the same time, the probability of getting a 5 is much higher (1 − (5/6)^4 ≈ 52%) than if you just throw one die (17%).

5  From Search Diversity to Parallelism

Even on a two-core processor, parallelism is easy to demonstrate by opening multiple command windows. Consider the variant of the program for the clauses associated with K5,5 that took 37 seconds to verify using the standard search method. It was run simultaneously in two windows: one with the seed that led to a 35-second execution and one with the seed that needed only about 12 seconds. Since they ran in parallel on the two cores, the faster verification finished in just over 12 seconds, while the slower verification continued to run. However, when two slow verifications were initiated in parallel with the fast verification, the latter's run time increased to almost 18 seconds, indicating that it was time-sharing the cores with the other programs.

6  Conclusion

Random algorithms are rather unintuitive for students brought up to think of algorithms as sequential deterministic procedures, or as a set of deterministic processes in the case of concurrency. Furthermore, while concurrency as the time-shared execution of high-level processes is also familiar, demonstrating speedup from parallelism is more difficult. We have shown how the support for search diversity in Spin makes it very easy to demonstrate both randomness and parallelism for solving a nondeterministic algorithm. The programs described in this paper are open-source and can be downloaded from: http://code.google.com/p/mlcs/. Ir

References
[1] Ben-Ari, M. Principles of the Spin Model Checker. Springer, London, 2008.
[2] Ben-Ari, M. A primer on model checking. ACM Inroads, 1(1):40-47, 2010.
[3] Ben-Ari, M. Mathematical Logic for Computer Science (Third Edition). Springer, London, 2012.
[4] Holzmann, G.J., Joshi, R., and Groce, A. Swarm verification techniques. IEEE Transactions on Software Engineering, 37(6):845-857, 2011.
[5] Malik, S. and Zhang, L. Boolean satisfiability: From theoretical hardness to practical success. Communications of the ACM, 52(8):76-82, 2009.

Mordechai (Moti) Ben-Ari

Fatima Kaloti-Hallak
Department of Science Teaching Weizmann Institute of Science, Rehovot 76100 Israel
fatima.hallak@weizmann.ac.il

Categories and Subject Descriptors: D.2.4 [Software/Program Verification (F.3.1)]: Model checking; D.1.3 [Concurrent Programming]: Parallel programming
General Terms: Algorithms
Keywords: model checking, Spin, random algorithm, multicore
DOI: 10.1145/2339055.2339069  2012 ACM 2153-2184/12/09 $15.00



standard articles

Dynamic Programming of the Towers

Timothy Rolfe

After a brief overview of the dynamic programming optimization for solving recurrence problems, this paper shows its application to one of the most famous recurrences, that of the Towers of Hanoi. Because of code complexity, the top-down implementation is initially slower than the recursive one, but is faster for sizes above 7. The dynamic programming implementations have initially linear behavior, moving eventually to exponential. At one point the top-down implementation requires only 1% of the time for the recursive implementation.

Brief Review of Dynamic Programming

Given a divide-and-conquer problem formulated in terms of combining slightly smaller subproblems for the final solution, one might be tempted to implement directly the recursive definition. Fibonacci numbers provide an example:

   F(0) = 0;  F(1) = 1;  for all n > 1, F(n) = F(n-1) + F(n-2)

This is spectacularly inefficient [3]. One can prove that the number of method calls required for this calculation is 2F(n+1) - 1.¹ Similarly, one can look at binomial coefficients, C(n,k). The formulation in terms of Pascal's Triangle has this recursive definition:

   For n > 0 and 0 <= k <= n:  C(n,0) = C(n,n) = 1
   For 0 < k < n:              C(n,k) = C(n-1,k) + C(n-1,k-1)

Again, this is spectacularly inefficient [1]. One can prove that the number of method calls required for this calculation is 2C(n,k) - 1.²

Since the cause of the inefficiency is multiple calculations of the same quantities, the solution is to ensure that each quantity is calculated only once, using something called dynamic programming. Top-down dynamic programming (also called memoization) allows retention of the recursive structure while computing each quantity only once [3]. This can be accomplished by having memory set aside to hold values for previous calculations. The cells initially contain a value that flags a particular cell as not having been computed (0 in the F(int i) method below). In C one can use static arrays, which retain their contents between function calls, but in Java one must use a class array outside of the Java method to get this behavior. For Fibonacci numbers, this can easily be handled by having a vector at the class level. Since a zero value flags an uncomputed cell, Fib(0) must be handled explicitly, as shown in the following code fragment.³

   static final int maxN = 92;
   static long knownF[] = new long[maxN+1];
   static long F(int i)
   {  if (i < 2) return i;
      // So knownF[k]==0 can flag an uncomputed cell
      if (knownF[i] == 0)
         knownF[i] = F(i-1) + F(i-2);
      return knownF[i];
   }

¹ http://penguin.ewu.edu/cscd320/Topic/Recursion/FibonacciRecurrence.html
² Calls(0,0) = 2*C(0,0) - 1 = 2*1 - 1 = 1                    Base case
  Calls(n,n) = 2*C(n,n) - 1 = 2*1 - 1 = 1                    Second base case
  Calls(n,k) = Calls(n-1,k) + Calls(n-1,k-1) + 1             Recurrence for 0 < k < n
             = 2*C(n-1,k) - 1 + 2*C(n-1,k-1) - 1 + 1         Substitute inductive hypothesis
             = 2*(C(n-1,k) + C(n-1,k-1)) - 2 + 1
             = 2*C(n,k) - 1                                  Binomial coefficient recurrence
³ [3] Program 5.11, revised based on the definition of Fibonacci numbers used here and 64-bit integers. See also http://penguin.ewu.edu/cscd320/Topic/Strategies/DynamicPgming/Fibonacci.java

This trades space for time. In return for spending linear space to hold the vector of knownF, one computes Fibonacci numbers not in exponential time but in linear time or faster, since once the vector is populated Fibonacci numbers are available in constant time. A similar approach can be used for the binomial coefficient problem,


but will not be given here.⁴ This approach has the desirable characteristic of on-demand computation: a particular subproblem is computed only if it is needed for a larger problem. Thus the table of computed binomial coefficients may have numerous cells that are never filled because they are never needed.
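The call-count claim for the naive Fibonacci recursion is easy to verify empirically; a sketch (my own illustration, with assumed class and method names):

```java
// Counting the method calls made by the naive recursive Fibonacci:
// the count should match the closed form 2*F(n+1) - 1.
public class NaiveFibCalls {
    static long calls;
    static long fib(int n) {
        calls++;                                  // count every method call
        return n < 2 ? n : fib(n - 1) + fib(n - 2);
    }
    static long callsFor(int n) {
        calls = 0;
        fib(n);
        return calls;
    }
    public static void main(String[] args) {
        // F(11) = 89, so computing fib(10) takes 2*89 - 1 = 177 calls
        System.out.println(callsFor(10));   // 177
    }
}
```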

Connecting to the Towers of Hanoi

Briefly, the Towers of Hanoi begins with three towers/pegs and n disks of varying diameter arranged so that they are initially all stacked on one tower, in decreasing size from bottom to top. They are to be transferred to another tower subject to two constraints: the disks are moved one at a time, and no disk is ever on top of a smaller one [2]. With a small shift to self-documenting variable names to correspond with later code segments, Sahni's implementation of the Towers of Hanoi becomes the following code:

   public static void towersOfHanoi(int n, int src, int dst, int tmp)
   {  // Move the top n disks from tower src to tower dst.
      // Use tower tmp for intermediate storage.
      if (n > 0)
      {  towersOfHanoi(n-1, src, tmp, dst);
         System.out.println("Move top disk from tower " + src +
                            " to top of tower " + dst);
         towersOfHanoi(n-1, tmp, dst, src);
      }
   }

This is a classical exponential problem requiring 2^n - 1 disk movements to transfer all disks from src to dst by way of tmp. So the answer is inescapably exponential to get some representation of the series of moves. With dynamic programming, however, one can trade exponential space for less than exponential time, at least for a region in which dealing with exponential space does not require exponential time. To solve the n-disk problem, one can have a memoization matrix of n+1 rows and 6 columns⁵ to contain subproblem solutions. The problem solutions will be represented by character strings.

Solution Representation

Since the storage of earlier results is critical in minimizing the space required, each move is represented by a single character.⁷ Every disk movement operation is treated as an ordered pair of tower designations, giving the source and destination towers, ranging from 01 up through 21. If these are considered to be base-3 numbers, the range of magnitudes runs from 1 through 7, and each can be stored as a single octal digit. The solution then becomes a string of octal digits representing all moves to accomplish the transfer of all disks from the initial source tower to the final destination tower.

Thus for the transfer of n disks from src to dst, using tmp for intermediate storage, the solution will be the transfer of (n-1) disks from src to tmp, using dst for intermediate storage, followed by the transfer of one disk from src to dst, and ending with the transfer of (n-1) disks from tmp to dst, using src for intermediate storage. This represents the solution for (n-1) from src to tmp using dst, concatenated with the single digit representing the move from src to dst, and then concatenated with the solution for (n-1) from tmp to dst using src. For instance, the solution for moving four disks from tower zero to tower two by way of tower one is represented thus, the middle move being the eighth digit (2): 125167125365125. Shown as digit pairs, this is 01,02,12,01,20,21,01,02,12,10,20,12,01,02,12. Seven moves uncover the bottom disk, which is moved from zero to two, and then seven moves cover it over. Figure 1 shows all these steps.⁶ The first move solves the n=1 problem from 0 to 1; the first three, the n=2 problem from 0 to 2; and the first seven, the n=3 problem from 0 to 1.

The memoization matrix then becomes a six-column matrix of strings. Each column represents a particular permutation of the available source, destination, and temp towers: 012, 021, up through 201 and 210. Each row represents the number of disks being moved. Thus row zero contains the base case: no disks being moved, which for computational convenience is represented by the empty string. Rows of higher index contain six cells for the six possible problems. If a cell contains a null reference, that problem has not yet been solved. For a given triple (a,b,c) as (src, dst, tmp), the solution is the (n-1) solution (a,c,b) to clear those disks from a onto c, followed by the single move from a to b, ending with the (n-1) solution (c,b,a) to move the cleared disks on top of the single disk that was just transferred. The Java code segment that follows shows this:

   // The class static matrix memo[nRow>n][6] contains solutions
   static String hanoi(int n, int src, int dst, int tmp)
   {  // Initialization code omitted . . .
      if (memo[n][index] == null)
      {  // Two-digit data movement treated as base-3: 01 through 21
         // Get the magnitude of the base-3 number, store as octal
         String addend = String.format("%1o", 3*src + dst);
         String result = hanoi(n-1, src, tmp, dst) + addend +
                         hanoi(n-1, tmp, dst, src);
         memo[n][index] = result;
      }
      return memo[n][index];
   }

⁴ See http://penguin.ewu.edu/cscd320/Topic/Strategies/DynamicPgming/Binom_Memo.java
⁵ Given movement of disks among towers 0, 1, and 2, there are six permutations of those digits.
⁶ These were generated by the program at http://penguin.ewu.edu/~trolfe/DynamicHanoi/Four_Disk_Hanoi.java. http://en.wikipedia.org/wiki/File:Tower_of_Hanoi_4.gif is a very nice animated gif that André Karwath added to the Wikipedia page on the Towers of Hanoi 22 March 2005. http://penguin.ewu.edu/cscd320/Topic/Recursion/HanoiBAS.exe provides a very primitive (character graphics) animation of the problem. It is specific to Microsoft Windows and is a 32-bit executable, not runnable in a 64-bit environment. Source code is in http://penguin.ewu.edu/cscd320/Topic/Recursion/HanoiBAS.bas; http://www.qb64.net/ provides a way to exercise BASIC code in a 64-bit Windows environment.
⁷ The full program is available in http://penguin.ewu.edu/~trolfe/DynamicHanoi/
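Since the article's listing omits the initialization and the computation of index, a runnable completion can be sketched as follows. The indexing scheme memo[n][3*src+dst] (a 9-wide row with two unused slots) and the solve wrapper are my own assumptions, not the author's code:

```java
// A runnable completion of the top-down memoized solver.  Each move is
// the octal digit of 3*src + dst; row 0 holds the empty-string base case.
public class TopDownHanoi {
    static String[][] memo;
    static String hanoi(int n, int src, int dst, int tmp) {
        int index = 3 * src + dst;                  // assumed indexing choice
        if (memo[n][index] == null) {
            String addend = String.format("%1o", 3 * src + dst);
            memo[n][index] = hanoi(n - 1, src, tmp, dst) + addend
                           + hanoi(n - 1, tmp, dst, src);
        }
        return memo[n][index];
    }
    static String solve(int n) {
        memo = new String[n + 1][9];
        java.util.Arrays.fill(memo[0], "");         // base case: no disks
        return hanoi(n, 0, 2, 1);
    }
    public static void main(String[] args) {
        System.out.println(solve(4));            // 125167125365125
        System.out.println(solve(4).length());   // 2^4 - 1 = 15 moves
    }
}
```

Running solve(4) reproduces the 15-digit string shown in the Solution Representation section.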


Figure 1: States in solving Hanoi(4, 0, 2, 1)

If one allows for a trailing \0 character to terminate a string, the solution for moving n disks requires 2^n characters. Generating the solution, however, requires combining three strings, two of them of size 2^(n-1). Java obscures the expense of combining strings that is much clearer in C: the space must be allocated from the heap (malloc/alloc/calloc), and then the contents must be copied into it (strcpy/strcat). For short strings, the expense of memory allocation dominates, giving behavior linear in the number of disks. Though memory copying is fast, for large enough strings that expense, linear in the size of the string, will dominate, giving exponential time for exponential space.

Other Requirements for the Top-Down Solution

The function calls are linear, thanks to memoization. By direct measurement, Calls(1) = 3, Calls(2) = 7, and then for n > 2, Calls(n) = 6n - 3.

Examination of the final contents of the memoization matrix shows that slightly less than half of it is used. Table 1 shows the cells of the matrix that are used when solving the problem of moving twenty disks from tower 0 to tower 2. Row 0 contains the base-case solutions, empty strings. The top-most row contains only the cell explicitly computed (021), which then generates two cells in the row below that (012 and 120). After that only three of the six cells are computed.

Table 1: Memoization Matrix Cells Used (table of results actually computed)

   Row    012   021   102   120   201   210
    20           X
    19     X                 X
    18           X     X                 X
    17     X                 X     X
    16           X     X                 X
    15     X                 X     X
    14           X     X                 X
    13     X                 X     X
    12           X     X                 X
    11     X                 X     X
    10           X     X                 X
     9     X                 X     X
     8           X     X                 X
     7     X                 X     X
     6           X     X                 X
     5     X                 X     X
     4           X     X                 X
     3     X                 X     X
     2           X     X                 X
     1     X                 X     X

Since the solution for k disks requires 2^k characters, one can explicitly compute the space required. The top-most row contains one 2^n string. Below that are two entries with 2^(n-1) strings, representing another 2^n characters. Below that is simply three times the summation of powers of two. The total space required is thus about 3·2^n.


Bottom-Up Dynamic Programming Solution

The bottom-up dynamic programming approach is to build successively higher levels of solutions starting with the level of the base case or the base cases. In the process, there is no need to retain lower levels once the higher levels that rely on them are computed. Sedgewick notes "In the case of Fibonacci numbers, we can even dispense with the array and keep track of just the previous two values (see Exercise 5.37)." [3, p. 219] Here is a specimen implementation.

   static long fib(int n)
   {  long v[] = { 0L, 1L, 1L };
      // Compute until fib(n) is in v[0]
      while (n-- > 0)           // Count down
      {  v[0] = v[1];           // Shift downward
         v[1] = v[2];
         v[2] = v[0] + v[1];
      }
      return v[0];
   }

Similarly, for the Towers of Hanoi one can retain just two rows from the memoization matrix: the previous case computed (initialized to empty strings) and the current case being computed. These are then swapped in working to the next higher number. An implementation follows:⁸

   static String botHanoi(int n, int s, int d, int t)
   {
      // 6 empty strings for 01 through 21; null for 00 and 11
      String[] prev = { null, "", "", "", null, "", "", "" },
               curr = new String[8],   // compute into this
               temp;                   // interchange prev and curr
      int i,                  // Outer loop on level of solution
          j,                  // Inner loop for the six problems at level i
          offset = 3*s + d;   // Position of final answer
      // Final result will be in prev
      for (i = 1; i <= n; i++)
      {  // Populate curr from prev
         for (j = 1; j < curr.length; j++)   // Omit 00
         {  int src = j/3, dst = j%3, tmp = 3 - src - dst;
            if (src == dst) continue;        // I.e., 11 is not allowed
            // h(i-1,src,tmp,dst) + src to dst + h(i-1,tmp,dst,src)
            curr[j] = prev[src*3+tmp] + (src*3+dst) + prev[tmp*3+dst];
         }
         // Swap curr and prev.
         temp = curr;   curr = prev;   prev = temp;
      }
      return prev[offset];
   }

Other Requirements for the Bottom-Up Solution

The garbage collector is quite busy in this implementation: all solutions are discarded except for the final two sets, six for the (n-1) problem and six for the n problem. The six for the (n-1) problem require 6·2^(n-1) characters, or 3·2^n, while the six for the n problem require 6·2^n more, for a total of 9·2^n characters. (The total space required for the top-down approach, discussed above, is about 3·2^n characters.) There is only one function call, since the solution is iterative.

Requirements for the Recursive Solution

By the nature of the problem, the recursive stack only goes as deep as the number of disks. The space requirement is simply 2^n for the length of the string representing the moves, but the garbage collector is busy recycling intermediate solutions. The number of function calls, of course, is exponential: Calls(0) = 1, Calls(n>0) = 2·Calls(n-1) + 1 = 2^(n+1) - 1.⁹

Experimental Results

How, then, do these three implementations stack up against each other? Three separate Java programs were developed to capture time statistics for the three different implementations discussed above (top-down, bottom-up, and recursive).¹⁰ Each size was forced to execute for at least two seconds to average over multiple runs for fast-executing sizes. To allow for extremely long strings, the Java Virtual Machine was initialized with both initial and maximum heap size as 1 GByte:

   java -Xms1024m -Xmx1024m <program>

The results are interesting. The dynamic programming implementations begin with a linear region before turning exponential. The programs were run on an unloaded computer in the Computer Science Department at Eastern Washington University under Linux.¹¹ Table 2 shows the results¹² of one set of executions for problems from size 2 through size 25. It is useful to have the explicit numbers, since the logarithmic scale in the graph exaggerates small numbers and minimizes large numbers. The time reported is in elapsed milliseconds, available through the Java method System.nanoTime().

⁸ The full program is available in http://penguin.ewu.edu/~trolfe/DynamicHanoi/
⁹ Calls(0) = 2¹ - 1 = 1                          Base case
  Calls(n) = 2 Calls(n-1) + 1                    Recurrence
           = 2 (2^n - 1) + 1                     Substitute inductive hypothesis
           = 2^(n+1) - 2 + 1 = 2^(n+1) - 1       QED
¹⁰ Available in http://penguin.ewu.edu/~trolfe/DynamicHanoi/ as TopHanoi.java, BotHanoi.java, and RecHanoi.java.
¹¹ Processor (from /proc/cpuinfo): Intel Xeon CPU 5160 @ 3.00GHz, two processors reported. Operating system (from /proc/version): Linux version 2.6.32-24-server (buildd@yellow) (gcc version 4.4.3 (Ubuntu 4.4.3-4ubuntu5)) #43-Ubuntu SMP Thu Sep 16 16:05:42 UTC 2010. Java version (from java -version): Java SE Runtime Environment (build 1.6.0_26-b03), Java HotSpot 64-Bit Server VM (build 20.1-b02, mixed mode).
¹² The Excel workbook: http://penguin.ewu.edu/~trolfe/DynamicHanoi/25Avg.xls
Table 2: Runs from 6 June 2012

Size   Top-Down    Bottom-up    Recursive
  2       0.021        0.001        0.003
  3       0.038        0.001        0.007
  4       0.048        0.002        0.013
  5       0.064        0.003        0.027
  6       0.079        0.004        0.055
  7       0.094        0.006        0.108
  8       0.110        0.010        0.215
  9       0.127        0.019        0.447
 10       0.147        0.036        0.873
 11       0.171        0.068        1.738
 12       0.206        0.130        3.697
 13       0.259        0.253        7.061
 14       0.352        0.497       14.061
 15       0.517        0.987       29.181
 16       0.837        1.983       57.461
 17       1.472        4.098      115.054
 18       2.826        9.270      254.244
 19       5.889       21.505      480.490
 20      13.866       50.037     1029.500
 21      31.728      120.463     2193.638
 22      74.903      280.264     4114.886
 23     168.563      832.855     8389.754
 24     360.890     1807.628    17008.208
 25    1254.996     4476.174    33808.957
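For reference, the Recursive column corresponds to the straightforward recursive solution. The following is a minimal sketch consistent with the 3*source + destination move encoding used by botHanoi; the author's actual RecHanoi.java (footnote 10) may differ in its details.

```java
public class RecHanoi {
    // Plain recursion: move n disks from s to d using t as the temporary peg.
    // Each move is encoded as the single digit 3*source + destination,
    // matching the encoding used by the bottom-up implementation.
    static String recHanoi(int n, int s, int d, int t) {
        if (n == 0) return "";
        return recHanoi(n - 1, s, t, d)   // n-1 disks: s -> t
             + (3 * s + d)                // move the largest disk: s -> d
             + recHanoi(n - 1, t, d, s);  // n-1 disks: t -> d
    }

    public static void main(String[] args) {
        System.out.println(recHanoi(3, 0, 2, 1)); // prints 2172352 (7 moves)
    }
}
```

For n disks the returned string contains 2^n - 1 single-digit moves, so its length (and the running time) doubles with each additional disk.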
Acknowledgments
These computations were performed on otherwise idle computers in the Computer Science Department at Eastern Washington University. I would like to thank Bojian Xu, Ph.D., Assistant Professor of Computer Science at Eastern Washington University, for his helpful comments as I developed this paper. I also wish to thank the anonymous reviewers of this paper for their helpful suggestions, and Prof. John Impagliazzo, the editor-in-chief of ACM Inroads, for his help in wrestling this paper into final form.
The top-down appears to be showing approximately linear behavior up through size 10, and then begins increasing, reaching purely exponential at size 18. The bottom-up has a linear region through size 7, and is purely exponential above size 13. The recursive implementation, as expected, is exponential throughout. Figure 2 shows these results graphically. The Ratio plot is the Top-Down time divided by the Recursive time, showing the speed-up with the dynamic programming implementation that performed best at larger sizes.
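The top-down implementation caches each (level, source, destination) subproblem the first time it is computed, so each distinct subproblem is built only once. The following is an illustrative sketch of that memoization idea, reusing the move encoding of botHanoi; it is a reconstruction, not necessarily the author's TopHanoi.java.

```java
import java.util.HashMap;
import java.util.Map;

public class TopHanoi {
    // Memo table keyed by (level, source, destination); the temporary peg
    // is implied (t = 3 - s - d), so each level has at most six subproblems.
    static Map<Long, String> memo = new HashMap<>();

    static String topHanoi(int n, int s, int d, int t) {
        if (n == 0) return "";
        long key = n * 9L + 3 * s + d;   // 3*s + d lies in 0..8
        String cached = memo.get(key);
        if (cached != null) return cached;
        String result = topHanoi(n - 1, s, t, d)
                      + (3 * s + d)      // move encoding as in botHanoi
                      + topHanoi(n - 1, t, d, s);
        memo.put(key, result);
        return result;
    }

    public static void main(String[] args) {
        System.out.println(topHanoi(3, 0, 2, 1)); // prints 2172352
    }
}
```

Only the string concatenations remain exponential in this variant, which is consistent with the long initial linear region seen in the Top-Down column of Table 2.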

Performance of C Implementations
These same algorithms can be implemented in the C language.13 In that environment, however, the programmer is responsible for memory management, while the Java language provides a garbage collector. On the other hand, as a compiled rather than interpreted language, C tends to generate faster-executing code. Figure 3 shows the times captured from the C programs equivalent to the Java programs that generated Figure 2.14 Compared with the Java implementations, the C implementations are blindingly fast.

Timothy Rolfe
Professor of Computer Science Emeritus Eastern Washington University Cheney, Washington 99004-2493 USA



http://penguin.ewu.edu/~trolfe Timothy.Rolfe@mail.ewu.edu
Categories and Subject Descriptors: D.2.8 Metrics: Performance measures; F.2.2 Nonnumerical Algorithms and Problems: Computations on discrete structures. General terms: Algorithms, Performance. Keywords: Dynamic Programming, Memoization, Towers of Hanoi, Optimization. DOI: 10.1145/2339055.2339070. © 2012 ACM 2153-2184/12/09 $15.00

13 The C environment on the computer used is gcc. gcc -version reports the following: gcc (Ubuntu 4.4.3-4ubuntu5.1) 4.4.3, Copyright © 2009 Free Software Foundation, Inc.
14 http://penguin.ewu.edu/~trolfe/DynamicHanoi/ provides access to the Excel workbook, 25Avg_C.xls, and to the three C programs: TopHanoi.c, BotHanoi.c, and RecHanoi.c. The data in 25Avg_C.xls show that compiling using gcc -O3 only improves the recursive implementation's performance.


Figure 2: Measured execution times in milliseconds for the three Java implementations. [Log-scale plot; series: Top-Down, Bottom-up, Recursive, and Ratio; horizontal axis: problem size up to 25.]

Figure 3: Measured execution times in milliseconds for the three C implementations. [Log-scale plot; series: Top-Down, Bottom-up, Recursive, and Ratio; horizontal axis: problem size up to 25.]


A Day One Computing for the Social Good Activity

Michael Goldweber

Success at broadening participation requires many different kinds of efforts. While most initiatives are focused at secondary schools, the undergraduate curriculum also has importance. Specifically, activities in the first computer science course (CS1) need to dispel common myths and misunderstandings. It is believed that students perceive computing as boring and irrelevant, and furthermore, that they select their major, at least partially, based on their desire to have a positive societal impact. We present the Reuniting Families Problem, a CS1 first-day activity that presents computer science as an interesting, algorithm-centered, group-work oriented discipline with a high degree of relevance that speaks to students' values and desires regarding societal impact.

INTRODUCTION

Broadening participation in computing is a difficult problem (one well beyond the scope of this paper to address comprehensively). A partial list of contributing impediments includes gender stereotyping, general misunderstanding of the discipline, job opportunity myths including outsourcing, perceived professional daily activities and responsibilities, lack of meaningful role models, relevance to students' lives, the uncoolness factor, and the supposed need for the "geek gene." One study reports that STEM oriented high school students describe (their perceptions of) the computing discipline(s) as boring, tedious, and irrelevant [14, 24]. Many outreach programs, aimed at the secondary school audience, attempt to combat these misconceptions. These include CS-Unplugged [1], CS4FN [7], LEGO League(s) [15], Digital Divas [6], Project Impact [17, 11], the Computer Science Inside Project [8], and the Bebras contest [9]. This partial enumeration does not include the design, development, and eventual (effective) utilization of visual programming tools such as Alice [20], Scratch [19], and Kodu [18].
Hence there are many factors related to the question of broadening participation. One additional factor is to what degree a given undergraduate curriculum either reinforces the impeding myths and misconceptions or works to dismantle them. While it is important to examine the curriculum as a whole, it is also vital to examine what one does in the first course (CS1) with regard to this issue. The computer science education (CSE) literature is full of different approaches to CS1. One particular initiative with regard to broadening participation is the media computation approach [13, 23], which endeavors to place computing in the context [16] of students' lives. Another approach, one that some might also define as contextualized, is Computing for the Social Good (CSG) [12]. CSG is an umbrella term meant to incorporate any activity, from small to large, that endeavors to convey and reinforce computing's social relevance and potential for positive societal impact. This approach taps into students' reported desires to pick a major which they believe will allow them to have a positive societal impact [4], as opposed to the misunderstood belief that computing is irrelevant. The observation that students use perceived positive societal impact in their choice of major is one proposed explanation for the 4x selection of social science majors over computing [4], in spite of the greater than 4x job opportunities in computing over those with degrees in the social sciences [5]. Furthermore, as demonstrated by the Glitz project [10], success in broadening participation may be improved when computing is shown to connect with students' values rather than their interests.
Continuing in this vein of examining the impact of an undergraduate curriculum in general, and CS1 in particular, with regard to both the dismantling of myths and the presentation of computing in a particular light, one might conclude that what one does on the first day of the first course (i.e., the first day activity) has serious repercussions with regard to successfully broadening participation. The CSE literature is also rich with regard to first day activities (e.g. [2, 3, 22]), though the focus has been primarily on core computing concepts (e.g. objects and OO design) rather than dispelling myths. We propose that computing educators consider a different kind of first day activity: one that appeals to students' values and illustrates computing's potential for positive societal impact. Towards that end, an example first day activity is described in detail. It is hoped that first day activities, such as the one proposed, will set the tone for an undergraduate program that not only presents computing as relevant to those drawn to puzzles, mathematical abstractions, games, or commerce/business, but as a relevant discipline with the potential to have a deep positive societal impact.
The remainder of this paper is organized as follows. Section 2 presents a brief summary of other first day approaches. Section 3 is a detailed description of the Reuniting Families Problem, the new CSG oriented first day activity being proposed. Finally, Section 4 presents the conclusions of this proposal.


First Day Activities

It is beyond the scope of this paper to provide an exhaustive taxonomy of first day activities. Nonetheless, it may be worthwhile to briefly enumerate a set of such activities the author is familiar with and/or has experimented with.1

• Traditional lecture/discussion: While this approach is familiar to all, for purposes of completeness, we indicate that such lectures typically explore the friction between students' beliefs and the formal definitions of such core concepts as: What is computer science actually about? What is an algorithm? The difference between a computer, computer science, and a computing agent. The rich history of algorithmics.
• Algorithm focused KLA: A Kinesthetic Learning Activity (KLA) is a physically engaging classroom exercise. First day KLAs include having one student act as the computing agent, while the remainder of the class designs and runs an algorithm to get the agent to successfully tie their untied shoe. Similarly, providing an algorithm for the agent to construct a peanut butter and jelly sandwich is also popular [22]. For those who wish to avoid the inevitable mess, directing the agent to perform multiplication (either standard or a la russe) will also work.
• Core concept focused KLA: A KLA might also be employed to illustrate the concepts of object oriented design [2, 3]. In this case, student agents, acting as objects, interact with each other based on object interfaces to accomplish some meaningful goal (e.g. a vending machine or a game of blackjack).

There are, of course, other potential activities, e.g. working through a tutorial for a given programming language or IDE. The above itemization merely serves to juxtapose what the author, and by extrapolation many of her peers, have done in the past with the activity described in the following section.

A New First Day Activity: The Reuniting Families Problem

For this first day activity, the class is asked to consider a (natural) disaster, e.g. hurricane, tsunami, earthquake, terrorist attack.2 Furthermore, the location is a smallish city. Regardless of the specifics of the disaster, the local uncovered soccer/football/rugby stadium is left intact, and the aid workers have directed all survivors to congregate at this single undamaged stadium. It is assumed that the stadium is sufficient to hold all the survivors. The students, working in groups (e.g. 3-4), must devise a protocol, i.e. an algorithm, for the aid workers to use to reunite the survivors of each nuclear family unit.
One implementation that has worked well is to give the class the remainder of the initial session, approximately one hour, to work together as a group. Groups also get to continue working together for the first m minutes of the second session. After that, each group must describe its algorithm to the class as a whole, and each group receives feedback from the class and the instructor before moving on to the next group.
Typically, given how little information is provided at the onset, many clarifying questions get posed. The recommended answers the instructor provides are designed to simplify the problem as much as possible. Questions that often get posed include:

• How many aid workers are there? Answer: However many you want.
• How many entrances are there to the stadium? Answer: However many you want.
• How many sections are there in the stadium? Answer: However many you want.
• What can be assumed about people's names? Answer: Using the Latin alphabet, each family unit has a distinct surname, which all surviving members know and can spell: if there are n nuclear family units, there are n unique surnames. Hence, there are no spouses with different last names, no blended families, and an individual's brother's family would have a distinct surname.
• Is there a working public address system? Answer: There is no power, but aid workers can have bullhorns.

1 For our purposes we disregard first days where the instructor reviews the syllabus and/or conducts a non-computing oriented ice-breaker. For the purposes of this paper, a first day activity is the initial activity used to convey something about computing.
2 One might elect to connect to a recent event familiar to the student body.


• Do survivors have cell phones? Answer: Yes, but the towers have been knocked out and/or all the batteries are depleted; hence the phones are of no use.
• If aid workers are told to record the names of each survivor, say upon entry to the stadium, and possibly their seat location, perhaps assigned when entering the stadium, is there a way to automate the sorting and/or searching of this list? Answer: No automated processes are allowed (e.g. scanning in survivor registration sheets, using OCR, and then sorting the list). If sorting and/or searching is to be done by aid workers, it must be done by humans using processes described by the group.
• Can signs be made? Answer: Yes.
• Can survivors, upon entry to the stadium, be directed to distinct sections of the stadium? Answer: Yes, the survivors are capable of following any set of reasonable instructions.

Not only does the above set of simplifications reduce the problem to one the students can solve, but it hopefully also illustrates important problem solving techniques: problem simplification (or reduction) and abstraction. Student groups are asked to be cognizant of how long it will take to run through their proposed solution with survivor counts of 100, 1,000, and 80,000: how well does the solution scale?
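The scaling question has a concrete answer for the grouping strategy that groups often converge on: routing each survivor by surname is a single pass over the survivor list, so the work grows linearly with the survivor count. Purely as an illustration for the instructor (the activity itself forbids automation, and the class name and sample names below are hypothetical), the idea can be sketched in code:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ReuniteSketch {
    // Illustrative only: the classroom exercise is carried out by hand,
    // but the linear protocol -- route every survivor to a location keyed
    // by surname -- is simply one grouping pass over the survivor list.
    static Map<String, List<String>> groupBySurname(List<String[]> survivors) {
        Map<String, List<String>> families = new HashMap<>();
        for (String[] s : survivors) {            // s = {surname, firstName}
            families.computeIfAbsent(s[0], k -> new ArrayList<>()).add(s[1]);
        }
        return families;                          // one list per family unit
    }

    public static void main(String[] args) {
        List<String[]> survivors = new ArrayList<>();
        survivors.add(new String[]{"Okafor", "Ada"});
        survivors.add(new String[]{"Silva", "Marco"});
        survivors.add(new String[]{"Okafor", "Chidi"});
        System.out.println(groupBySurname(survivors).get("Okafor")); // [Ada, Chidi]
    }
}
```

Whether done by aid workers with signs or by a hash map, the cost of one grouping pass scales linearly from 100 to 80,000 survivors, which is exactly the insight the scaling question is meant to provoke.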

To focus on algorithm efficiency, students are told that the survivors are sitting in the stadium under a very hot sun. Hence, not only are survivors anxious to be reunited with their surviving family members, but they also need to get out of the sun (into Red Cross tents) as soon as possible.

3.1 Utility of the Exercise
Hopefully, one can see that this first day activity not only shows the relevance of algorithmic problem solving, but also touches on many important computing concepts. Some of these include the rigor and formalism of algorithm design, what constitutes a computable operation, repetition, selection, and problem simplification. Students are also exposed to the important notion that computing/algorithm design is done in a group setting, as opposed to the myth of the solo programmer working in a windowless cubicle.
Furthermore, while describing one's algorithm out loud to one's peers lacks the rigor and formalism of computer programming, it still conveys that not only are algorithms designed to be communicated to others, but that hand-wavy logic is a pitfall to be avoided. Students also begin to develop their understanding of algorithm efficiency. Inevitably there is at least one group that proposes a linear algorithm. Hopefully there will be at least one divide-and-conquer algorithm. If lucky, one or more student groups will illustrate the effective use of parallelism.
Moving forward with this exercise, one might also require each student, working individually, to write up for the next session, to the best of their ability, either their favorite algorithm that they heard (which might still be their own) or a new one of their own creation. A follow-up activity might be to assign a short paper on Google's Person Finder, which was first used experimentally in Japan after the 2011 earthquake and tsunami. The exercise is finally concluded with an open-ended extra-credit or prize-based challenge: propose a solution that is more efficient than any the professor has conceived. As with most such open-ended challenges, only a percentage of a class takes up the challenge. Nevertheless, the individual conversations with those students engaged in the challenge are usually excellent. This is somewhat like a humanitarian-based David Ginat problem for introductory students. Hopefully, without being overbearing, this CSG exercise, while introducing many important core computing concepts, plants the important seed regarding the ubiquity of algorithms and the social relevance of algorithmic problem solving.

3.2 Assessment
No formal assessment of this activity has yet been undertaken. We nonetheless believe that this activity is at least isomorphic to traditional first day activities (Section 2): one might elect to either develop multiplication a la russe or a solution to the reuniting families problem. The reasons no formal assessment has yet been undertaken are:
• The author's home institution is too small to yield meaningful sample sizes.
• A CSG activity such as the Reuniting Families Problem is not intended to exist in a vacuum. If this experience were the only CSG activity throughout the undergraduate curriculum, or even just during the introductory course, it is assumed that this singular activity would have negligible effect on students' perception of the discipline.
• Incorporating a CSG perspective into a course or computing curriculum is a philosophical decision. After such a decision is made, assessment focuses on the question of how best to accomplish this goal. Given the scarcity of CSG activities, there are no other first day, introductory course activities to compare this exercise against.
Future plans include the execution of a meaningful study comparing student perceptions about computing after taking a CSG-infused CS1 course to a non-CSG version of the same course. Such a study would be of particular interest given that Rader et al. reported that students indicated a greater interest in programming games than in humanitarian based projects [21]. The primary negative result of that study was from CS seniors with very limited female representation.
Nevertheless, there is some informal assessment that can be presented. Post-exercise informal surveys yielded a great deal of surprise on the students' part:
• That computer science is not solely concerned with the programming of games and billing systems.
• That algorithmics is an ancient human activity.
• That computer science is not just relevant to commerce, and that the discipline has an important role to play in humanitarian pursuits.
• That computer science might just be more interesting than originally thought.

Conclusions
Efforts to broaden participation must not only affect undergraduate curricula as a whole, but must in particular be applied to CS1 in general and the first day activity in particular. The myth that computing is only relevant for the solo programmer working to develop the next killer app (e.g. Mark Zuckerberg) is a powerful one that unfortunately gets reinforced by the popular media. CSG activities are one way to counter the negative myths and illustrate the deep connection between algorithmic problem solving/computing and being able to have a positive societal impact. This paper presents a unique first day CSG group activity, the Reuniting Families Problem, which, while not sacrificing coverage of core computing concepts, has the potential for making a strong and lasting first impression regarding students' understanding of the importance and relevance of computing.

References
[1] Tim Bell. Computer Science Unplugged. http://csunplugged.org/.
[2] Joe Bergin, Mike Clancy, Don Slater, Michael Goldweber, and David B. Levine. Day one of the objects-first first course: What to do. In SIGCSE '07: Proceedings of the 38th SIGCSE Technical Symposium on Computer Science Education.
[3] Don Blaheta. Day one of the objects-first first course: What to do. In ITiCSE '09: Proceedings of the 14th Annual ACM SIGCSE Conference on Innovation and Technology in Computer Science Education.
[4] Michael Buckley, John Nordlinger, and Devika Subramanian. Socially relevant computing. In SIGCSE '08: Proceedings of the 39th SIGCSE Technical Symposium on Computer Science Education.
[5] Anthony P. Carnevale, Jeff Strohl, and Michelle Melton. What's it worth? The economic value of college majors. http://cew.georgetown.edu/whatsitworth/.
[6] Annemieke Craig and Julie Fisher. Digital Divas Club. http://digitaldivasclub.org/vic/.
[7] Paul Curzon, Peter McOwan, and Jonathan Black. Computer Science for Fun. http://www.cs4fn.org/.
[8] Quintin Cutts, Muffy Calder, and Peter Dickman. Computer science inside... bring computer science alive. http://csi.dcs.gla.ac.uk/.
[9] Valentina Dagiene. Bebras contest. http://www.bebras.org/en/welcome.
[10] Betsy DiSalvo and Amy Bruckman. From interests to values. Communications of the ACM, 54(8), 2011.
[11] Mary Anne L. Egan and Timothy Lederman. The impact of IMPACT: Assessing students' perceptions after a day of computer exploration. In ITiCSE '11: Proceedings of the 16th Annual Conference on Innovation and Technology in Computer Science Education.
[12] Mikey Goldweber, Renzo Davoli, Joyce Currie Little, Charles Riedesel, Henry Walker, Gerry Cross, and Brian R. Von Konsky. Enhancing the social issues components in our computing curriculum: Computing for the social good. ACM Inroads, 2:64-82, February 2011.
[13] Mark Guzdial. A media computation course for non-majors. In ITiCSE '03: Proceedings of the 8th Annual Conference on Innovation and Technology in Computer Science Education.
[14] Mark Guzdial. Teaching computing to everyone. Communications of the ACM, 52(5), 2009.
[15] Dean Kamen and Kjeld Kirk Kristiansen. FIRST LEGO League. http://www.firstlegoleague.org/.
[16] Jennifer S. Kay. Contextualized approaches to introductory computer science: The key to making computer science relevant or simply bait and switch? In SIGCSE '11: Proceedings of the 42nd SIGCSE Technical Symposium on Computer Science Education.
[17] Catherine Lang, Annemieke Craig, Jane Prey, Mary Anne L. Egan, and Reyyan Ayfer. Outreach programs to promote computer science and ICT to high school and middle school students. In ITiCSE '11: Proceedings of the 16th Annual Conference on Innovation and Technology in Computer Science Education.
[18] Microsoft Research. Kodu - Microsoft Research. http://research.microsoft.com/en-us/projects/kodu/.
[19] MIT Media Lab. Scratch home: Imagine, program, share. http://scratch.mit.edu/.
[20] Randy Pausch. Alice.org. http://www.alice.org/.
[21] Cyndi Rader, Doug Hakkarinen, Barbara M. Moskal, and Keith Hellman. Exploring the appeal of socially relevant computing: Are students interested in socially relevant problems? In SIGCSE '11: Proceedings of the 42nd ACM Technical Symposium on Computer Science Education, pages 423-428, New York, NY, USA, 2011. ACM.
[22] Sam Rebelsky. Food-first computer science: Starting the first course right with PB&J. In SIGCSE '07: Proceedings of the 38th SIGCSE Technical Symposium on Computer Science Education.
[23] Beth Simon, Päivi Kinnunen, Leo Porter, and Dov Zazkis. Experience report: CS1 for majors with media computation. In ITiCSE '10: Proceedings of the Fifteenth Annual Conference on Innovation and Technology in Computer Science Education.
[24] Sarita Yardi and Amy Bruckman. What is computing?: Bridging the gap between teenagers' perceptions and graduate students' experiences. In ICER '07: Proceedings of the 3rd International Workshop on Computing Education Research.

Michael Goldweber
Department of Computer Science and Mathematics Xavier University, 3800 Victory Parkway Cincinnati, Ohio 45207 USA
mikeyg@cs.xu.edu
Categories and Subject Descriptors: K.3.2 Computer and Information Science Education [Computer Science Education]. General terms: Design, Experimentation. Keywords: Kinesthetic Learning Activities, In-class Exercises, Socially Relevant Computing. DOI: 10.1145/2339055.2339071. © 2012 ACM 2153-2184/12/09 $15.00


comprehensive articles

A Freshman Level Course on Information Assurance: Can It Be Done? Here's How


Robin Gandhi Connie Jones William Mahoney

Offering a freshman level course in Information Assurance (IA) that is open to all majors in a University seems like a responsible thing to do. However, IA is considered an advanced technical topic, and its integration in undergraduate curriculums is primarily at the junior and senior level. Here we describe our experiences in designing and imparting a freshman level IA course. We discuss challenges and solutions for making the course appealing to a broad audience; strategies to increase enrollment; pedagogical techniques; and experiences from the past six semesters over which such a course has been successfully taught at the University of Nebraska at Omaha (UNO).


INTRODUCTION

The need for Information Assurance (IA) degree programs and their graduates is on the rise, with no signs of slowing down. While the need for IA specialists is apparent in the job market, little attention is being paid to general awareness of computer security issues among non-technical degree programs. The University of Nebraska at Omaha is designated by the National Security Agency as a Center of Academic Excellence in Information Assurance Education (CAE/IAE), as are many universities throughout the United States. One of the ramifications of this certification is the need to bring computer security and Information Assurance to the masses: those not directly involved in IA, or even in a technology-related field, but who use IT in their profession and/or personal lives. This posed a problem, since the existing IA courses were upper level offerings. Additionally, at UNO we offer Bachelor's and Master's degree programs in Information Assurance. A long-standing and undesirable feature of our past IA degree programs was that, until our undergraduate students became juniors, the IA faculty would have little to no interaction with them. Early interactions are important to keep students motivated in the degree programs, to provide research directions, and to retain them. At the freshman level, recruitment opportunities extend far beyond only the students in the computing disciplines.
The IA faculty at UNO are involved in the teaching of IS and CS courses at the lower levels of the curriculum. This includes CS II (Java based), Web Programming, Programming in C, and IT Ethics. In many cases, these courses were instrumental in fostering early interactions between IA faculty and students. However, these courses are still not early enough, as they are not offered in the first semester. These courses also do not have security topics as their primary focus, and as a result it becomes hard to gauge students' interest in and passion for the security field, as well as to provide them with sufficient IA background so that they can join and contribute early to security related research projects. For these reasons we created, in the spring semester of 2009, a new freshman level class called IASC 1100 Introduction to Information Security. For brevity, we will refer to this course as IASC 1100 throughout the paper. The class is geared directly towards both incoming freshman students in IA or related fields, as well as those students who want to learn about information security but come from any discipline within the University. To make the class even more accessible, desirable, and relevant for all degree programs at the University, students can use it to satisfy a General Education (GenEd) requirement. We expect this feature to attract a diverse student population with a broad spectrum of technological skills.
This paper reflects the reasoning behind the creation of the class, some of the results we have achieved, and feedback in the form of comments from students. Section two describes the (lack of) current information security courses called out in ACM/IEEE curricula standards for the early parts of a typical undergraduate program. In section three we detail our particular needs, as described above, to interact early with the students so that they become aware of IA issues. Some of the authors had forgotten the challenges in dealing with brand new students in a university setting, and these issues are described in section four. In section five we describe our approach to satisfying the global diversity GenEd requirements at our university, followed by conclusions and acknowledgements.

INFORMATION SECURITY IN UNDERGRADUATE CURRICULA

ACM, AIS, and IEEE curricula recommendations act as a benchmark for the body of knowledge to be disseminated in undergraduate (as well as some graduate) computing degree programs. In this section we discuss our findings regarding early exposure to IA topics in the context of computing curricula recommendations.

2.1  ACM/AIS/IEEE-CS Curricula Recommendations


CC 2005 [3] provides undergraduate curriculum guidelines for five defined sub-disciplines of computing: Computer Science (CS), Computer Engineering (CE), Information Systems (IS), Information Technology (IT), and Software Engineering (SE). Since all computing graduates cannot be proficient in all knowledge areas, CC 2005 recommends different weights for different sub-disciplines. With regard to this distribution, IT provides the highest coverage of the security implementation and management knowledge areas, whereas CS provides the highest coverage of the security issues and principles knowledge areas. The CS 2008 curriculum recommendations [1] now provide explicit focus on the integration of security issues across their knowledge areas. Security is a focus not only in the operating systems and networking knowledge areas, but also in programming, to write safe and secure software. The recommended CS3xx Introduction to Computer Security course in CS 2008 requires a foundation of CS 102, and a co-requisite of data structures and algorithms in CS 103 as defined in CS 2001 [2]. As a result, an in-depth treatment of the fundamental principles of information security is only accessible to computing students late in their degree programs. In the early semesters, security topics are discussed only in the narrow scope of programming languages.
The IT 2008 curricular recommendations [6] suggest two implementation strategies: 1) integration first, and 2) pillar first. However, in both cases Information Assurance and Security are only introduced after the IT pillars of programming (Programming Fundamentals), networking (Fundamentals of Networking), web systems (Fundamentals of Web Systems), databases (Fundamentals of Information Management), and human-computer interaction (IT Fundamentals of Human-Computer Interaction) are introduced. Again, this is a fairly late introduction.

2012 September Vol. 3 No. 3 acm Inroads 51

comprehensive articles
A Freshman Level Course on Information Assurance: Can It Be Done? Here's How
Similar situations exist with IS 2010 [5], SE 2004 [7] and CE 2004 [4] curricular recommendations, where security topics are either integrated into advanced courses or only available as advanced electives.

2.2 IA Non-existent in the Early Years

While a substantial set of security topics is being integrated into the computing core, much work remains to push these out to other disciplines (business, arts and sciences, public health, criminal justice, music and many more) while still keeping them accessible to all students. A quick web survey of undergraduate curricula across several of the nation's top IA schools indicates a general lack of IA courses in the early years. For example, schools such as the Georgia Institute of Technology, Carnegie Mellon University, the University of Texas at Dallas, and Purdue University offer their first IA course at the junior/senior level. The University of California, Davis offers IA courses only at the graduate level. The reasoning seems consistent with the ACM/IEEE/AIS curricular recommendations to defer integrative and advanced topics such as security until later in the computing curriculum.

Meanwhile, advances in technology have brought the need for trustworthy computing (computing systems that are inherently secure, available, and reliable) to most disciplines in higher education and most sectors of industry and government. It is not surprising that the issues of trustworthy computing have become ubiquitous at every level of computing education. Correspondingly, there is an urgent need for effective ways of addressing, teaching, and learning trustworthy computing concepts across all disciplines.

THE NEED FOR IA EDUCATION AT A FRESHMAN LEVEL

3.1 Early Exposure to Faculty

Our IA undergraduate degree was finalized and in effect in the fall semester of 2007. Typically we would get to know our students well only when they became juniors or seniors. There were three principal reasons for IA faculty-student interaction taking place only in the junior and senior years. First, we would encounter students regularly only when they entered an IA-specific class such as CIST 4366 Foundations of Information Assurance. This is a senior-level class, in part because of the number of prerequisites necessary (similar to CS3xx in CS 2008). Second, in addition to being a Center of Academic Excellence, we also participate in the National Science Foundation's Scholarship for Service (SFS) program [8], a two-year scholarship for students in the IA discipline. Since it is desirable to finish the scholarship during the semester of graduation, we typically interact with these SFS students the most during the final two years of their degree program. Finally, the concentration or minor in IA offered to CS and IS degree programs does not take effect until the junior year.

To foster undergraduate research and challenge the bright students who join UNO through the Scott Scholars program [18], a UNO scholarship, IA faculty want undergraduate students to assist in research and other activities such as capture-the-flag contests. The need for undergraduate involvement in research is further emphasized because UNO is a site for the NSF Research Experience for Undergraduates (REU) in Computer Science and Information Technology [15]. The late faculty-student interaction presents a noticeable problem here: in the past, we would get students up to speed on a research area, only to have them graduate. An obvious solution is to get them involved earlier in the process, which implies that we get to know them sooner. Thus we have one of our main motivational factors for the freshman-level class, and that is self-interest!

Our motivation for the IASC 1100 class involves several factors, including the desire to get to know students earlier, the opportunity to recruit students who are undecided about degree paths (or who are decided about degree paths, though we obviously need to be careful in that area!), and a wish to make students aware of the security pitfalls in the ubiquitous computing available today.

3.2 Recruitment Opportunity


A second driving factor in the creation of the class is recruiting. Many incoming freshmen do not yet know what area interests them. These students take a large number of classes in general education areas such as English or History, knowing that these credits are required for any undergraduate degree. The ramification for our IA program is that such students may decide that they like English and declare a major, never having heard of computer security as a discipline. While we like English, the world has plenty of English majors but does not yet have plenty of information security professionals.



This applies even more so to the students who enter the College of Information Science & Technology as undecided. They have made the decision to pursue a degree in the technology field but have not yet settled on a specific area. We found that when an undecided student within the college was asked whether he or she had considered IA as a major, the response would often be: "What is that?" Offering a freshman-level IA course gives those students a chance to explore this otherwise unknown major within their field of interest, and gives us a chance to recruit them to our particular degree program. Students who go through this class become ambassadors of the field and are in a better position to tell their undecided peers about it. These students sometimes contribute talks and demonstrations during high school field trips to the university and talk about the exciting new things they learned in the class. We thus also envision IASC 1100 as the class in which we at least have a fighting chance to recruit students, and to make them aware that there is a degree program for them if they find the subject interesting.

3.3 General Awareness of IA Issues

Security and privacy issues are so prevalent in information technology that it makes sense for all students to have at least a general awareness of them before they graduate. By creating an IA course that is open to all majors, we have attracted students from the following degree programs over the past five semesters: Bioinformatics, Computer Engineering, Computer Science, Information Assurance, Information Systems, IT Innovation, Music Performance, Psychology, Public Affairs and Community Service, and Studio Art. Irrespective of technology, a general aspect of this course that appeals to a broad audience is understanding the threat agents and their motives for launching an attack in cyberspace. Understanding the ways of the bad guys is essential to reasoning about the ways in which an information system can be attacked. Students examine cases of vulnerabilities, including those in authentication mechanisms, social networks, and hardware, in order to understand the modus operandi of a community of vandals, cyber mercenaries, and nation states rooted in particular social, cultural, economic, and political backgrounds. Students are trained in eliciting such scenarios with hands-on experience in security processes, tools, and technologies; developing policies and procedures; and presenting cost-effective solutions to minimize risk.

CHALLENGES AND OPPORTUNITIES WITH A FRESHMAN-LEVEL CLASS

The difference between a freshman class and, say, a graduate-level class is profound. Freshmen require a significant amount of hand holding, need to be reminded that attendance is sometimes important in the quest for a good grade, and seem to forever be handing in late assignments. Furthermore, students at this level come from mixed technological backgrounds: while some have taken Java, Visual Basic, or other programming languages in high school, hold networking certifications, or have experience at the local geek squad, others may have no computer background other than playing Portal 2 and Counter-Strike for long hours, or updating their Facebook page. To present the material effectively we have identified four areas of focus: pay attention to the differences in technical background just mentioned, be aware of the need to build student confidence so that they are not afraid of breaking something or of new technology while exploring security issues, remember that the topic needs to be interesting, and provide constant reminders concerning ethics.

4.1 Diverse Backgrounds

Some students just know that they will pursue a career path involving technology, and start preparing for it in high school through self-study, certification, or college courses. Generally, these students are excited to take up IA subject matter in their first semester. On the other hand, students with interests in non-technical disciplines generally enroll in the course out of a genuine interest in being able to secure their own computers and information, but lack the knowledge to do so. An often-cited motivating factor for taking IASC 1100 is a malware infection on a home or work computer. In a freshman IA course it is crucial to account for a large diversity in technical backgrounds, since no prerequisite knowledge can be assumed. At one end is a student pool that is amazed to see a command-line interface in Windows XP; at the other is a pool that has installed BackTrack distributions on their personal laptops. In most cases, however, basic familiarity with mainstream operating system functions such as file system navigation, web browsing, and document processing tools can be assumed.

4.2 Lab Exercises to Build Confidence

Going against the norm of introducing security courses later in the degree program has given us the opportunity to find the balance between theory and hands-on exercises needed to discuss security topics much sooner. The UNO infrastructure for IA education includes two Security Technology Education and Analysis Laboratories (STEALs), which are used exclusively for IA research and classes. These labs are self-contained and completely isolated from any public network. The lab computers are wiped clean and set up from scratch for each student session using disk-imaging ("ghosting") technology. These features isolate dangerous technology from public networks and limit the possibility of cross-contamination between student projects and assignments. In addition to the physical labs, the Virtual STEAL is a lab environment that hosts lab exercises on virtual machines accessed through thin clients by remote participants. IASC 1100 is also delivered synchronously, using Adobe Connect technology, to students at the University of Nebraska at Kearney campus; these students conduct lab exercises using the Virtual STEAL capabilities. Using these labs, students in IASC 1100 get hands-on experience with the following topics.


Web Vulnerabilities: SQL injection, Cross-site scripting and Cross-site request forgery (Week 3)
This lab is preceded by a lecture component that introduces HTML technology (GET and POST), JavaScript use cases, and enough SQL syntax to understand and write simple exploits for the most egregious web vulnerabilities. The focus is on understanding the consequences of improper or non-existent input sanitization in web applications. The lab exercise consists of executing SQL injection attacks on a vulnerable website belonging to a flower shop. Specific tasks guide the students in exploiting web vulnerabilities and recording private information such as customer names, addresses, and credit card numbers. Students also learn to use the Paros web proxy [16] to modify and examine HTTP traffic between a browser and a web server. The lecture portion and lab guidance provide sufficient help for students with less technical proficiency to complete, and at the same time enjoy, the lab. The lab assignment solicits feedback from the students, which confirms our observation. The only complaint students ever have is not having enough time to explore the web application vulnerabilities in greater depth!
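The flower-shop site itself is a UNO lab artifact, but the mechanism the lab teaches can be sketched in a few lines. The snippet below, using an in-memory SQLite table with invented data, contrasts a login-style lookup built by string concatenation with a parameterized query; the table and values are illustrative only.

```python
# Minimal sketch of the SQL injection idea explored in the lab (not the
# actual flower-shop application): a lookup built by string concatenation
# versus a parameterized query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, card TEXT)")
conn.execute("INSERT INTO customers VALUES ('alice', '4111-1111-1111-1111')")

def find_unsafe(name):
    # Vulnerable: user input is pasted directly into the SQL text.
    query = "SELECT card FROM customers WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def find_safe(name):
    # Parameterized: the driver treats the input strictly as data.
    return conn.execute("SELECT card FROM customers WHERE name = ?",
                        (name,)).fetchall()

# A classic injection string turns the WHERE clause into a tautology,
# dumping every card number instead of none.
print(find_unsafe("x' OR '1'='1"))   # every stored card leaks
print(find_safe("x' OR '1'='1"))     # the input stays data; no match
```

The same tautology trick, typed into the lab site's search box, is what reveals the shop's customer records.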

Malware: Fake anti-virus detection and removal (Yet to be administered).


This lab uses real scareware captured in the wild to provide hands-on experience with malware detection and removal. A social engineering scenario is simulated in a web browsing session to start the infection. Observations are made once the malware infects the operating system, and steps are then taken to remove it. This lab is being developed as a direct response to overwhelming demand from students for a malware lab; such interest was indicated in the feedback provided for the other labs.
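The detection half of such an exercise can be illustrated with the simplest form of signature scanning: flagging files whose hash matches a known-bad list. This toy sketch is not the lab's actual procedure (which uses interactive removal tools against live scareware); the file names and "signatures" here are invented.

```python
# Toy signature scan: flag files whose SHA-256 appears on a known-bad
# list. Illustrative only; real anti-malware tools do far more.
import hashlib
import os
import tempfile

KNOWN_BAD = set()   # in practice, a vendor-supplied signature database

def sha256(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def scan(directory):
    hits = []
    for root, _dirs, files in os.walk(directory):
        for name in files:
            path = os.path.join(root, name)
            if sha256(path) in KNOWN_BAD:
                hits.append(path)
    return hits

# Demo: plant one "infected" file and one clean file in a temp directory.
with tempfile.TemporaryDirectory() as d:
    bad = os.path.join(d, "totally-legit-av.exe")
    with open(bad, "wb") as f:
        f.write(b"fake scareware payload")
    with open(os.path.join(d, "notes.txt"), "wb") as f:
        f.write(b"homework")
    KNOWN_BAD.add(sha256(bad))
    print(scan(d))   # only the planted scareware file is flagged
```

Fake anti-virus products invert exactly this logic, reporting clean files as infected to frighten the user into paying.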

Network Vulnerabilities: Port scanning and traffic sniffing (Week 7)


Two weeks of lectures on basic networking topics and vulnerabilities are followed by this lab exercise. We limit discussion of the TCP/IP model to enough depth to cover ARP poisoning, DNS spoofing, and TCP SYN flood vulnerabilities. As part of the lab exercise, students use Nmap [14] to conduct a port scan of the lab domain controller and of a partner's computer. The scanning traffic is then examined in Wireshark [19] to make the observations required by the lab assignment. The influence of a firewall on scanning traffic is also examined. We have observed that students with non-technical backgrounds have the most trouble with decimal-to-binary and decimal-to-hexadecimal conversions; this basic gap leaves them clueless when examining network traffic, where one frequently has to make such conversions. Nonetheless, it is often an eye-opening experience for students when they discover that passwords and other information sent in clear text over the network can easily be eavesdropped with freely available tools such as Wireshark.
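The conversions that trip students up, and the clear-text observation that startles them, can both be worked through in a few lines. The byte values below are illustrative, not taken from a real capture.

```python
# The conversions students wrestle with when reading a packet capture,
# worked in Python. Values are illustrative, not from a real capture.
import base64

port = 0x01BB                       # two hex bytes from a TCP header
print(port)                         # 443, the HTTPS port

octets = [0xC0, 0xA8, 0x01, 0x0A]   # four bytes of an IPv4 address
print(".".join(str(o) for o in octets))   # 192.168.1.10

print(format(443, "016b"))          # the same port as 16 binary digits

# Why clear-text protocols are easy prey for a sniffer such as Wireshark:
# HTTP Basic authentication is just base64, which is encoding, not
# encryption; anyone who captures the header can reverse it.
header = "Basic " + base64.b64encode(b"alice:secret").decode()
print(base64.b64decode(header.split()[1]).decode())   # alice:secret
```

Once students have done a few of these by hand, the hexadecimal panes in Wireshark stop looking like noise.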

4.3 Need for Coolness and Student Participation


It may not come as a surprise that most topics in computing can be a bit, well, DRY! Students at the freshman level are not particularly outspoken or bold enough to ask questions. Thus, with a freshman class, things can go south quickly if the content is presented merely by lecturing through slide after slide. One thing that we do not want to do is drive people away from the IA major. We now present our experiences with some pedagogical techniques that have worked well in IASC 1100.

One strategy that has worked well for getting students involved and thinking about IA is a daily assignment we call the News Notes section of the class. At the start of each class (except the first day and exam days), the initial 10 to 15 minutes are devoted to recent information security articles in the news. For each class, students prepare a one-page assignment that covers one recent news article relevant to IA issues, one website that would be useful to an IA professional, and one potential exam question based on past lectures. For the chosen news article, students write a brief, incisive paragraph that summarizes the lessons learned. All students must be prepared to discuss their news article and answer leading questions from the class. Sometimes students are so excited to share their news story that they volunteer; on other occasions the instructor picks a student at random.

These articles serve to broaden students' horizons and bring them to the realization that information security is not just technical details. In fact, several aspects of this course are geared toward developing an understanding of cyberspace as a new medium that breaks all geographical boundaries. Security products such as firewalls also bring to light the difference between the free speech and open information exchange beliefs of the United States and other nations, and the regulated and filtered information content enforced in countries such as China and Iran.

Covert Channels: Steganography (Week 8)


Students have the most fun performing this lab. After a brief history of steganography and least-significant-bit hiding techniques, the lab progresses by instructing students to use the Steghide tool. Students examine and record the difference between clean and tainted files, both visually and with an MD5 signature tool. Finally, the lab assignment is to recover secret messages from within suspect files, and to embed a secret message of their own in an image file.
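The least-significant-bit idea the lecture introduces can be shown in miniature. This toy sketch is not Steghide's actual embedding algorithm; it hides one message bit in the lowest bit of each byte of a plain byte array standing in for pixel data, and then reproduces the lab's before/after MD5 comparison.

```python
# Toy illustration of least-significant-bit hiding on a raw byte array
# (a stand-in for pixel data); not Steghide's real algorithm.
import hashlib

def embed(cover: bytes, message: bytes) -> bytes:
    bits = "".join(format(b, "08b") for b in message)
    out = bytearray(cover)
    for i, bit in enumerate(bits):            # one message bit per cover byte
        out[i] = (out[i] & 0xFE) | int(bit)   # overwrite the lowest bit
    return bytes(out)

def extract(stego: bytes, length: int) -> bytes:
    bits = "".join(str(stego[i] & 1) for i in range(length * 8))
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

cover = bytes(range(64))                      # 64 bytes of "pixel" data
stego = embed(cover, b"uno")
print(extract(stego, 3))                      # b'uno'

# The lab's comparison: visually near-identical data, different MD5.
print(hashlib.md5(cover).hexdigest() == hashlib.md5(stego).hexdigest())
```

Because each cover byte changes by at most one, an image altered this way looks unchanged to the eye, which is exactly why students need the hash comparison.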

Secure Configuration: Windows XP hardening (Week 13)


This subject could occupy an entire semester's worth of material; in this lab, however, we focus on a limited set of OS features to balance technical difficulty with general applicability in a limited amount of time. The lab includes exploration of system attributes using GUI tools as well as the command-line interface, and covers hidden files, file permissions, file shares, differences in file systems, and security event auditing. In fall 2011 we introduced exercises with Windows 7 for this lab, along with a demonstration of the Microsoft Security Compliance Manager. We expect to transition fully to a Windows 7-based hardening exercise in fall 2012.



Through such open discussions, students come to the realization that IA is not just about the configuration of firewalls and phishing attacks. We hope that this will assist in recruiting students who previously assumed that technology is the only facet of the topic. While it may seem that the students do most of the work with News Notes, it takes skill and experience on the part of the instructor to facilitate discussion on any topic that may come up. Each article must be viewed as an opportunity to educate the class on IA topics using a real case study. These topics may or may not be part of the official syllabus, but the instructor must be prepared to discuss the issues and offer additional pointers or insights. Our students amaze us with the diversity, technical depth, and genuine curiosity reflected in these discussions. The discussions create an informal setting where questions can be raised and answered. Additionally, the opportunity for the instructor to relate a news note back to subjects previously discussed in class is an invaluable tool and helps reinforce the lectures with real-world examples found by the students; with this, they can see the relevance of the subject material presented in class.

Another strategy for making the class engaging is frequent guest lectures by other IA faculty at UNO and by local industry IA professionals. The course typically includes six to seven guest lectures on topics including phone phreaking, vulnerability discovery, cyber crime, cyber warfare, information system audit, incident response, physical security, policy and risk management, and disaster recovery. To account for the availability of guest lecturers, we leave some flexibility in the course schedule. This aspect of the course introduces the class to several IA faculty and the topics they will study in advanced classes, as well as to an industry practitioner's perspective. The class atmosphere lightens up quite a bit every time a guest lecture is scheduled.


4.4 Ethics and Teaching Hacking


One must be careful teaching something involving the H word. Principally the issue seems to be one of perception, but this perception is gradually changing. In 2003 the University of Calgary initiated a class called Computer Viruses and Malware, which promptly raised a fury among security firms such as Trend Micro. Some of the arguments against the course wielded analogies such as "you don't teach someone how to break into houses in order to protect their house" [10]. But just two years later, journal and conference articles began appearing that argued for teaching this material. The new thinking was that teaching university-level students how to hack is a legitimate means of identifying company network weaknesses and preventing malicious attacks, and can be an effective component of computer security programs [17]. By 2007, teaching hacking was becoming more mainstream; universities such as the City College of San Francisco added similar coursework [13] and even reported on it at DefCon. Still, many universities in the United States seem intent on teaching cyber security from a primarily defensive posture rather than an offensive one.

The general perception in our case might be that since this is an introductory course, concerns about teaching hacking do not apply. Certainly the skill levels needed to become a competent computer virus author are beyond the freshman level. What others may not see, however, is the component of the class where attack tools and techniques are demonstrated as part of a larger lab exercise. A good case in point is our exercise involving the popular Wireshark network analysis tool [19]. This tool can just as easily be used to debug a strangely behaving network as to monitor network traffic, including Facebook and other social networking sites, at the local coffee shop. Is such eavesdropping ethical? Some may view introductory material as not sufficiently advanced to warrant any kind of special treatment, but it does not take much skill to become a script kiddie, and some students may have joined the course with an attraction to hacking. We therefore make a significant effort in the course to convey the message that one learns security assessment tools and techniques only so that one can think like the bad guy, to discover and understand vulnerabilities, and not be the bad guy who exploits the vulnerabilities for personal gain. For these reasons the students in IASC 1100 are asked to sign an ethics statement in the very first class and are warned that any violation will meet with the strictest consequences.
These include the student failing the class and the UNO IA faculty fully cooperating with law enforcement in prosecuting the violation, even if the student took the course in the past. The ethics statement reads as follows:

"In this study, one may learn or gain access to methods of bypassing computer security measures, malicious uses for computers, how to disrupt normal operations of computers or networks, and/or other illegal, immoral or unethical uses of computers and networks. It is important, then, that one must realize the responsibility that will accompany such knowledge. The realization of such responsibility shall come from adherence to the laws and guidelines held by the university, state, country, and global computer user community. One will also be obligated to follow moral and ethical notions of honoring the privacy of others and their right to a secure and courteous computing environment. No information obtained through this study should be directly or indirectly applied to any attack (unauthorized access, circumventing of security measures, affecting normal operation, destruction/copying of data, etc.) on unauthorized public, private, or commercial computers or networks. I have read and understood and agree to adhere to the STEAL usage policy. Realizing my responsibilities, I promise to adhere to the above ethics statement. In the event of my failure to fulfill this promise I accept the consequences of my actions."

In addition to this measure, students are constantly reminded of responsible and ethical behavior throughout the course. Toward the end of the course we also conduct an active learning session on policy and ethics, in which students use case studies to discuss the right and wrong of particular actions.

4.5 Issues in Textbook Selection

While many good information security books exist, most are written either to address a very specific topic in IA or for a security professional who already has a background in the field. Since it was determined that the course would have very broad coverage of topics, but at an entry level, and keeping in mind the economics of the typical college student, textbook selection was difficult. While it is possible to find good books at an entry level, covering all the topics in IASC 1100 would require multiple textbooks, which would sometimes overlap in large portions of material; there is no one-size-fits-all answer to textbook selection. Additionally, needing to purchase multiple books would add prohibitively to the cost for students. In our assessment, the book that covers the majority of course topics, uses the majority, if not all, of its content (i.e., there are no large portions of the book that would go unused for the purposes of the course), and is very economical is the O'Reilly Computer Security Basics, Second Edition [11]. While this book does cover most of the subject areas in our course, it still omits some, which are then supplemented with magazine articles, lecture slides (in the case of networking basics) and/or other resources that address a particular topic. Sometimes the textbook primarily provides background for students not familiar with a topic, so that the instructor can concentrate on a more advanced discussion without having to provide all the underlying details. Such is the case with topics like wireless security, where the student can learn the basics from the book and the instructor can then focus on best practices.

EXPANDING INTO GENERAL EDUCATION

It is one thing to offer a course and another to get students to actually sign up for it. The fact of the matter is that if a course does not count toward a student's degree program, chances are that students are not going to take it. Course selection based purely on curiosity about a topic may work for advanced electives in the junior and senior years, but at the freshman level the chances are slim of attracting students to a course taken just out of interest. We knew that if we wanted to make the course relevant for non-technology majors, it needed to somehow count toward their degree programs. With pressure from the university administration to keep undergraduate programs at or around 120 credit hours, it is not easy to add courses that do not seem directly relevant to a degree program.
Upon inception, IASC 1100 was recognized as an elective or requirement only for the IA, IT Innovation and MIS degree programs in the College of Information Science and Technology. Fortunately, in spring 2010, a new approach to UNO's general education program was adopted that is student-centered, aligned with the student learning outcomes desired of all UNO graduates, and accessible. With the student learning outcomes for the general education areas clearly defined, we successfully mapped the course content of IASC 1100 to the Global Diversity learning outcomes. In particular, we satisfied the following student learning outcomes with respective arguments:

• Recognize the environmental and historical circumstances that produce different social and cultural systems
• Demonstrate specific knowledge of the cultural, historical, social, economic, and/or political aspects of one or more countries other than the United States
• Explain the interrelations among global economic, political, environmental and social systems
• Explain ways in which identity is developed and how it is transmitted within and by members of the group or groups

It was argued that a key aspect of this course is to understand threat agents and their motives (deep-rooted in social, cultural, economic and political issues in the global human network) for launching an attack in cyberspace [9]. This course emphasizes our current dependence on information technology and how its security in cyberspace (or lack thereof) is shaping the social, political, cultural



and economic landscape. The course examines several historical and contemporary events that have been shaped by the exploitation of information technology. Examples include the capture of the Enigma machine and the related code-breaking activities that significantly reduced the duration of World War II and the losses of the Allied forces. More recent examples include the relocation of the bronze Soviet soldier statue in Estonia and the cyberattacks on Estonian critical infrastructure that followed. In April 2001, following the crash landing of a U.S. spy plane, Chinese hacktivists defaced almost 1,000 U.S. websites and launched distributed denial-of-service attacks against the technology infrastructure of the White House and the Central Intelligence Agency; three hacktivist groups (the Hacker Union of China, China Eagle and the Green Army Corps) were suspected of launching these bold, highly sophisticated cyberattacks. WikiLeaks' publishing of U.S. diplomatic cables, and its support by a group of hacktivists named Anonymous, have significantly impacted our national security and international strategies. These events and the pervasiveness of information systems motivate the urgency of imparting cybersecurity education to a broad portion of the future workforce.

An important part of the course is gaining at least a partial understanding of the historical nature of the internet and why the cybersecurity world has evolved into what it is today. The internet was developed in a culture where security and sensitivity of information were not considered, and privacy was not a concern, since little personal data was computerized. Contrasts between the early days of the technology and the pervasive use of networks today give an important cultural and historical perspective on modern cyber issues. Appendix A includes a detailed mapping of course content to Global Diversity topics.
Nearly every educator faces this challenge: what needs to be left out of a course as new things get added, and how do we meet the many demands on a degree program? To meet the growing demand for cyber security professionals, many computing-related degree programs have built specialized programs. However, a vast majority of IT users today have no formal training in computing disciplines or information security. Broad security awareness is absolutely critical as computing technologies continue to affect the quality of our lives. To meet this high-priority need, our creative solution was to offer all university majors a choice for fulfilling the general education requirement for global diversity while still being exposed to IA topics. Such creative solutions will be specific to the programs where they are fielded. Regardless of the strategy, if we envision a secure cyberspace in which the security posture of every connected computer plays an important role, then everybody should have the opportunity to receive basic IA education. Courses such as IASC 1100 will be instrumental in broadening the reach of secure-computing knowledge to other, non-computing disciplines. Cyber security has also become a priority for our nation. This is demonstrated by initiatives such as the National Initiative for Cybersecurity Education (NICE) framework, which characterizes the broad range of knowledge, skills and abilities required of a cyber security workforce; by large increases in the number of scholarships offered in return for federal service; and by the increase in the number of cyber security degree programs in higher education.
STUDENT COMMENTS AND FEEDBACK

Student evaluation comments at the end of the course are a source of candid feedback and help identify places where we can improve. Here we provide a few comments received over the past semesters. These comments were provided as part of the course evaluation sheets turned in anonymously by students in the last week of the semester. Written comments in the evaluation sheet are limited to two sections: 1) "The best part of the course is..." and 2) "The worst part of the course is..." We first list comments provided for "The best part of the course is...":

 The labs were great to get hands on experience.
 Learning about new technology.
 Lots of good information.
 Hands on lab activities to apply knowledge gained in class.
 Interesting topic.
 The instructor helped me a lot. He was willing to do it. However the class materials were so hard and I felt like I needed a lot of knowledge of computer and computer terms. I picked this class because I thought it was more basic materials but not really that's what I thought of this class.
 The labs are really useful and in a field such as Information Assurance firsthand experience is often much better than just going through powerpoints and lectures. That being said the powerpoints and lectures were still good sources of material for learning the general principles of the subject.
 The labs; labs were instructive and practical, providing the hands-on knowledge of the theory we learned in class.
 Used guest speakers well. Implemented showing up to class to turn in assignments [Daily assignments].
 Encourages group discussion.
 The way it walked us through things that qualify as illegal, allowing us better hands-on training.

The comments provided for "The worst part of the course is..." include the following:

 This course covers essential security topics that everyone at school today should know. There should be a less intensive version available as a short class to every student.
 The lack of time for labs, but that wasn't really anything that could be changed for a class setting. Providing lab virtual images online or having content available to students on a flash drive or CDs would be a great addition to allow students to take the labs home and play with them more in-depth.
 None.
 Not much to improve.
 The only thing that I found bad about the course was trying to find a new site [security relevant] twice every week. The quality of the sites I was finding towards the end of the semester just weren't very good, although I feel the daily assignment itself is a good idea.

1 http://csrc.nist.gov/nice/framework/

2012 September Vol. 3 No. 3 acm Inroads 57

comprehensive articles
A Freshman Level Course on Information Assurance: Can It Be Done? Here's How
 Certain lecture topics and lab activities assumed prior basic knowledge not specified earlier in the course or as a prerequisite.
 A lot of work.

CONCLUSION AND FUTURE WORK

One area that will receive additional work involves the requirements for GenEd status within our university. We have been asked to include additional Global Diversity topics (see the mapping of course topics to Global Diversity topics in Appendix A), and we are in the process of implementing these requirements. We anticipate that this can be covered in part by the News Notes section of the class; guest speakers will also be asked to address these needs. The need for more IA courses at the freshman and sophomore level has been recognized, as IASC 1100 students ask how they can stay involved in IA without waiting until their junior year to take another IA-centric course. As always, the difficulty will be finding time for students to take yet one more course in an already very full degree program, but we will continue to work to provide more courses or smaller modules at a lower level to meet the needs and desires of these students. We are currently identifying opportunities to inject secure coding topics into introductory programming classes. Our results for the IASC 1100 class thus far have been promising from several perspectives: attracting students from technology and non-technology majors, maintaining strong enrollment numbers, producing new recruiting channels, and continuously improving course reviews. In Table 1, we summarize the majors of students who have enrolled in the class during the past six semesters, starting from the first offering of this course. Enrollments in the course have increased since 2009; however, most of the increase is in computing disciplines, as shown in Figure 1. We have also witnessed a small number of students (~8 students, roughly 6% of all students enrolled in IA programs) who have either added IA as a minor to their current degree programs or, in some cases, switched their major to IA based on their experiences in IASC 1100. While this number is currently small, increasing enrollments and the diversity of majors in the course suggest that it will grow. Our future work includes recruiting a more diverse student population, now that we can offer the opportunity to earn GenEd credits. Finally, the course has given faculty the ability to connect with IA majors early in their academic careers. This has led to getting to know the students in our IA program better and earlier, and to involving students in research sooner, so that they find fulfillment doing interesting work. Examples of research

Table 1: Majors enrolled in IASC 1100 since its inception


Major                                           | Spring 2009 | Fall 2009 | Spring 2010 | Fall 2010 | Spring 2011 | Fall 2011
Information Assurance (IA)                      |      7      |    10     |      7      |    18     |      7      |    18
Management Information Systems                  |      5      |     2     |      2      |     4     |      1      |     4
Computer Science                                |      3      |     2     |      5      |     3     |      2      |    10
Computer Science and IA                         |      0      |     1     |      0      |     0     |      3      |     3
IT Innovation                                   |      0      |     0     |      5      |     1     |      5      |     3
Computer Science and IT Innovation              |      1      |     0     |      0      |     0     |      0      |     0
Bioinformatics                                  |      1      |     0     |      0      |     0     |      0      |     0
Psychology                                      |      1      |     0     |      0      |     0     |      0      |     0
Engineering                                     |      1      |     0     |      0      |     0     |      1      |     0
Engineering and IA                              |      0      |     1     |      0      |     0     |      0      |     0
Business Administration                         |      0      |     2     |      0      |     0     |      0      |     1
College of Public Affairs and Community Service |      0      |     1     |      0      |     1     |      0      |     0
Arts and Sciences                               |      0      |     1     |      0      |     0     |      1      |     0
Criminal Justice                                |      0      |     0     |      0      |     0     |      0      |     1
Aviation                                        |      0      |     0     |      0      |     0     |      0      |     1
Architecture                                    |      0      |     0     |      0      |     0     |      0      |     1
Synchronous distance learning at UofNeb Kearney |      0      |     0     |      0      |     1     |      1      |     0
Undecided                                       |      2      |     1     |      2      |     0     |      0      |     3
Total Enrolled                                  |     21      |    21     |     21      |    28     |     21      |    45
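The computing-versus-noncomputing enrollment trend described around Figure 1 can be recomputed directly from the counts in Table 1. The sketch below is our own illustration, not code from the article; in particular, which majors count as "computing" is our assumption and may differ from the authors' classification.

```python
# Per-semester enrollment counts transcribed from Table 1
# (semester order: Sp09, Fa09, Sp10, Fa10, Sp11, Fa11).
counts = {
    "Information Assurance (IA)":           [7, 10, 7, 18, 7, 18],
    "Management Information Systems":       [5, 2, 2, 4, 1, 4],
    "Computer Science":                     [3, 2, 5, 3, 2, 10],
    "Computer Science and IA":              [0, 1, 0, 0, 3, 3],
    "IT Innovation":                        [0, 0, 5, 1, 5, 3],
    "Computer Science and IT Innovation":   [1, 0, 0, 0, 0, 0],
    "Bioinformatics":                       [1, 0, 0, 0, 0, 0],
    "Psychology":                           [1, 0, 0, 0, 0, 0],
    "Engineering":                          [1, 0, 0, 0, 1, 0],
    "Engineering and IA":                   [0, 1, 0, 0, 0, 0],
    "Business Administration":              [0, 2, 0, 0, 0, 1],
    "Public Affairs and Community Service": [0, 1, 0, 1, 0, 0],
    "Arts and Sciences":                    [0, 1, 0, 0, 1, 0],
    "Criminal Justice":                     [0, 0, 0, 0, 0, 1],
    "Aviation":                             [0, 0, 0, 0, 0, 1],
    "Architecture":                         [0, 0, 0, 0, 0, 1],
    "Distance learning (UN Kearney)":       [0, 0, 0, 1, 1, 0],
    "Undecided":                            [2, 1, 2, 0, 0, 3],
}

# Our assumption: these majors are "computing" disciplines.
computing = {"Information Assurance (IA)", "Management Information Systems",
             "Computer Science", "Computer Science and IA", "IT Innovation",
             "Computer Science and IT Innovation", "Bioinformatics"}

def totals(selected):
    """Sum per-semester counts over the selected majors."""
    return [sum(counts[m][i] for m in selected) for i in range(6)]

computing_totals = totals(computing)
other_totals = totals(set(counts) - computing)
overall = [c + o for c, o in zip(computing_totals, other_totals)]
```

Under this classification, the per-semester overall totals reproduce the "Total Enrolled" row of Table 1, and the computing counts grow much faster than the noncomputing ones, consistent with the trend the article reports.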



Figure 1. Enrollment trend for computing and noncomputing majors in IASC 1100

projects with previous IASC 1100 students and faculty include:
 Setup and maintenance of a control system honeypot. This honeypot simulates web-based systems used for controlling industrial processes.
 Investigation of SQL injection, XSS, and PDF file exploits.
 Development of a virus signature detection engine using two different algorithms for string matching.
It is gratifying to know that by increasing security awareness in students, even those who may never take another IA course, we are helping to spread security knowledge and, hopefully, make cyberspace a safer place for us all, and the job of our future IA professionals a little easier.
References
[1] Computer Science Curriculum 2008: An Interim Revision of CS 2001 Curriculum Guidelines for Undergraduate Degree Programs in Computer Science, Interim Review Task Force, ACM and IEEE Computer Society, 2008.
[2] Computing Curricula 2001 Computer Science, The Joint Task Force on Computing Curricula, IEEE Computer Society and ACM, Engel, G., and Roberts, E. (Eds.), 2001.
[3] Computing Curricula 2005: The Overview Report, The Joint Task Force on Computing Curricula, ACM, AIS, and IEEE-CS, Shackelford, R. (Ed.), 2005.
[4] Curriculum Guidelines for Undergraduate Degree Programs in Computer Engineering, The Joint Task Force on Computing Curricula, IEEE Computer Society and ACM, Soldan, D., Hughes, E. A., Impagliazzo, J., McGettrick, A., Nelson, V. P., Srimani, P. K., and Theys, M. D. (Eds.), 2004.
[5] Curriculum Guidelines for Undergraduate Degree Programs in Information Systems, Joint IS 2010 Curriculum Task Force, ACM and AIS, 2010.
[6] Curriculum Guidelines for Undergraduate Degree Programs in Information Technology, ACM and IEEE Computer Society, 2008.
[7] Curriculum Guidelines for Undergraduate Degree Programs in Software Engineering, The Joint Task Force on Computing Curricula, IEEE Computer Society and ACM, August 2004.
[8] Federal Cyber Service: Scholarship for Service (SFS); https://www.sfs.opm.gov/ (retrieved December 13, 2011).
[9] Gandhi, R. A., Sharma, A., Mahoney, W., Sousan, W., Zhu, Q., and Laplante, P., "Dimensions of Cyber-Attacks: Cultural, Social, Economic, and Political," IEEE Technology and Society Magazine, vol. 30, no. 1, pp. 28-38, Spring 2011.
[10] Gaudin, S., "College Hacking Course Kindles Fiery Debate," eSecurity Planet, 2003; http://www.esecurityplanet.com/trends/article.php/2217781/College-Hacking-Course-Kindles-Fiery-Debate.htm (retrieved December 13, 2011).
[11] Lehtinen, R., and Gangemi Sr., G. T., Computer Security Basics, Second Edition, O'Reilly Media, June 2006, ISBN-13: 978-0596006693.
[12] Lemos, R., "Teaching hacking helps students, professors say," The Register, 2007; http://www.theregister.co.uk/2007/08/07/teaching_students_hacking/ (retrieved December 13, 2011).
[13] Logan, P. Y., and Clarkson, A., "Teaching students to hack: curriculum issues in information security," SIGCSE '05: Proceedings of the 36th SIGCSE Technical Symposium on Computer Science Education, ACM, New York, 2005.
[14] Nmap security scanner; http://nmap.org/ (retrieved December 13, 2011).
[15] NSF Research Experience for Undergraduates (REU) in Computer Science and Information Technology; http://reu.ist.unomaha.edu/ (retrieved December 13, 2011).
[16] Paros, for web application security assessment; http://www.parosproxy.org/
[17] Pashel, B. A., "Teaching Students to Hack: Ethical Implications in Teaching Students to Hack at the University Level," InfoSecCD Conference '06, September 22-23, 2006, Kennesaw, GA, ACM, 2006.
[18] The Walter Scott, Jr. Scholarship; http://pki.nebraska.edu/new/admissions/scholarships-financial-aid.php (retrieved December 13, 2011).
[19] Wireshark network protocol analyzer; http://www.wireshark.org (retrieved December 13, 2011).

Robin Gandhi
University of Nebraska at Omaha College of IS&T, 6001 Dodge Street, PKI 177A Omaha, Nebraska 68182-0116 USA +1-402-554-3363
rgandhi@unomaha.edu

Connie Jones
University of Nebraska at Omaha College of IS&T, 6001 Dodge Street, PKI 283A Omaha, Nebraska 68182-0116 USA +1-402-554-3889
conniejones@unomaha.edu

William Mahoney
University of Nebraska at Omaha College of IS&T, 6001 Dodge Street, PKI 282F Omaha, Nebraska 68182-0116 USA +1-402-554-3975
wmahoney@unomaha.edu

Categories and Subject Descriptors: K.3.2 [Computing Milieux]: Computers and Education; Computer and Information Science Education. General Terms: Security, Legal Aspects. Keywords: Information Assurance, Global Diversity, Computer Security, University Education

DOI: 10.1145/2339055.2339072

2012 ACM 2153-2184/12/09 $15.00



Appendix A
IASC 1100 Course Outline
UNIVERSITY OF NEBRASKA AT OMAHA COURSE SYLLABUS/DESCRIPTION
Department and Course Number: IASC 1100
Course Title: Introduction to Information Security
Total Credits: 3
Date of Last Revision: Feb 22, 2011

1.0 Course Description:

1.1 Overview of content and purpose of the course (Catalog description). This course emphasizes our current dependence on information technology and how its security in cyberspace (or lack thereof) is shaping the global landscape. Several historical and contemporary global events that have been influenced by the exploitation of information technology motivate topics on cybercrime, malware, intrusion detection, and cryptography, among others, and on how to secure one's own data and computer system. Several aspects of this course are geared towards developing an understanding of cyberspace as a new medium that breaks all geographical boundaries, while highlighting noticeable influences on it from the social, political, economic, and cultural factors of a geographical region.

1.2 For whom course is intended. This course is intended for freshman Information Assurance (IA) majors who want to get an overview of the field, freshman or sophomore College of IS&T students who want to know more about IA, and non-College of IS&T students needing to fulfill 3-credit-hour course requirements for their major. It will provide a basic background in networking and insight into the field of information security for students who may be undecided in their major or want to gain some basic knowledge of the field. This course will offer an Honors Contract that will include preparation for passing the A+ certification test.

1.3 Prerequisites of the course (Courses). None
1.4 Prerequisites of the course (Topics). None
1.5 Unusual circumstances of the course. None

2.0 Objectives:
List of performance objectives stated in terms of the student educational outcomes.
2.1 To better understand the aspects of information security.
2.2 Understand ethical and legal aspects of information security.
2.3 Understand the social, political, cultural, and economic impact of information technology and the pressing need for its security in cyberspace.
2.4 Analyze cases in cyber warfare spanning diverse cultures and multinational issues.
2.5 Analyze vulnerabilities in hardware and software.
2.6 History of cryptography and its applications.
2.7 To learn about basic common network concepts and security issues.
2.8 To learn how to better protect one's own data and computer systems.

3.0 Content and Organization:


List of major topics to be covered in chronological sequence.

Course Topics
3.1: Introduction to Information Security (1 hour)
  3.1.1: What Is Information Security?
  3.1.2: Why is Information Security relevant?
  3.1.3: History of Information Security
  3.1.4: Foundational Concepts
3.2: Security Concepts (5 hours)
  3.2.1: Basic Threat Model
  3.2.2: Confidentiality and Privacy
  3.2.3: Integrity
  3.2.4: Availability
  3.2.5: Access Control
  3.2.6: Biometrics
  3.2.7: Assurance, Law and Ethics
3.3: Vulnerabilities (10 hours)
  3.3.1: Physical Security
  3.3.2: Software Design Flaws
  3.3.3: Social Engineering on social networks
  3.3.4: Passwords
  3.3.5: Malware
  3.3.6: Vulnerability Discovery
  3.3.7: Phone Phreaking
3.4: Network Security Basics (10 hours)
  3.4.1: Protocol Stack
  3.4.2: DNS
  3.4.3: HTTP
  3.4.4: E-mail
  3.4.5: Server-Client Relationship
  3.4.6: Protocol Encapsulation (NAT)
  3.4.7: IP Address Interpretation
  3.4.8: Binary and Hexadecimal Number Systems
  3.4.9: Local Host Tables
  3.4.10: LANs
  3.4.11: Network Threats
  3.4.12: ARP, DNS and TCP attacks
3.5: Wireless Security (1 hour)
  3.5.1: Mechanics of WiFi
  3.5.2: Hardening Access Points
  3.5.3: Eavesdropping Defenses

Relevant Diversity Topics
 3.1.3: Discuss key historical (social, political, economic, and cultural) events globally that shape current information security needs.

 3.2.1: Cultural and economic differences in different countries that lead to cybercrime and distinct hacker characteristics, e.g., correlation between math proficiency and computer hacking skills in countries, educational and cultural backgrounds.
 3.2.7: Privacy and security regulations in the US compared with other countries.

 3.3.3: Social acceptance of internet-mediated communications and the misuse of deep-rooted social trust in developed countries using social engineering attempts like phishing.
 3.3.5: Malware infection rates in different countries and their relation to economic and socio-cultural issues.
 3.3.6, 3.3.7: Ethics of reporting vulnerabilities and discovering them for research.
 3.4: International collaborations for Internet protocols and standards-based communications. Formation of the Internet and the assumption of trust among the participants.



Course Topics
3.6: Cyber War, Crime and Digital Forensics (2 hours)
  3.6.1: Case study of cyber attacks and cyber wars
  3.6.2: Comparing cyber warfare capabilities from different nations
  3.6.3: Intrusion Detection
  3.6.4: Gathering Evidence
  3.6.5: Recovery
3.7: Ethics and Legal Controls (2 hours)
  3.7.1: Government and Business Oversight
  3.7.2: Hacking for Good
3.8: Cryptography (5 hours)
  3.8.1: History and Background
  3.8.2: DES, 3-DES, AES
  3.8.3: PKI
  3.8.4: Authentication and Integrity
  3.8.5: PGP
3.9: Assurance and Risk Assessment (5 hours)
  3.9.1: The need for assurance
  3.9.2: Assurance throughout the lifecycle
  3.9.3: Risk Components
  3.9.4: Qualitative and Quantitative Risk Assessments
3.10: Policies and Procedures (5 hours)
  3.10.1: CMS Model
  3.10.2: Analyzing Costs and Risks
  3.10.3: Disaster Plan
  3.10.4: Administrative vs. Users
  3.10.5: Backups
  3.10.6: Security Audits

Relevant Diversity Topics
 3.6: Discuss the social and cultural impact on governance and operations in cyberspace.
 3.6.1: Case study of cyber attacks and cyber wars in different countries; the US as a target of cyber warfare.
 3.6.2: Comparing cyber warfare capabilities from different nations.
 3.6.4: Balancing intelligence needs with citizens' right to privacy in the US and Europe.
 3.7.2: In-depth investigation of cyber attack cases for different social, political, cultural, and economic causes (see Gandhi et al. [9]).
 3.8.1: Examination of the history of cryptography in various cultures and its impact on the course of wars, e.g., the German Enigma machine and its impact on World War II.
 3.8.2: Compare controls on the export of cryptography in the US to other countries.
 3.9: Compare assurance mechanisms in Germany, Canada, Europe, and the US and their amalgamation into the Common Criteria.
 3.10: Consideration of social and cultural norms in defining security policies.
 3.10.2: Acceptance and enforcement of security policies in a culturally diverse workforce.

4.0  Teaching Methodology:


4.1 Methods to be used. The course will be presented primarily in lecture form. However, students will be expected to participate in discussions of the various topics as they are studied. In addition to the study of the text, students must do homework as assigned and periodic laboratory exercises with write-ups of each exercise. Two tests will be given. A written paper with an oral presentation, as a semester project, will be required.
4.2 Student role in the course. The students will be involved through exams, homework, projects, laboratory exercises, and discussion with each other.
4.3 Contact hours. 3 hours per week.

5.0  Evaluation:
5.1 Type of student projects that will be the basis for evaluating student performance, specifying distinction between undergraduate and graduate, if applicable. (For laboratory projects, specify the number of weeks spent on each project.) Students will complete a research-oriented project in the form of a 5-10 page paper with a 10-minute PowerPoint presentation.
  5.1.1 Research-oriented project. The objective of a research-oriented project is to study and digest advanced technical literature, and report on it in a form that is easy to understand for other students in the class. Extensiveness, comprehensibility, and technical worthiness are major considerations.
5.2 Basis for determining the final grade (course requirements and grading standards), specifying distinction between undergraduate and graduate, if applicable. Two exams will be given during the course:
  25%  Exam 1
  25%  Exam 2
  20%  Semester project
  10%  Laboratory Assignments
  10%  Homework Assignments
  10%  Daily Written Assignments
Tentatively, exams are scheduled every seven weeks.

Daily Written Assignments. Students are expected to bring to class each day, except on test days, a written paragraph summary, in their own words, of a current event article dealing with information security. There are many online sources for daily information security news, such as slashdot.org and www.securityfocus.com. These are only two of many possible sites, and students should find others on their own by searching. These assignments will be discussed each day, so students need to be prepared to present their assignment. Additionally, students must include one question, with an answer, that would make a good test question based upon the previous class period's lecture or activity.
5.3 Grading scale and criteria.
  A+  97% - 100%
  A   93% - 96%
  A-  90% - 92%
  B+  87% - 89%
  B   83% - 86%
  B-  80% - 82%
  C+  77% - 79%
  C   73% - 76%  (minimum final passing grade for Engineering)
  C-  70% - 72%  (minimum final passing grade for IS&T students)
  D+  67% - 69%
  D   63% - 66%
  D-  60% - 62%
  F   0% - 59%
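As a worked illustration of how the weighted scheme in section 5.2 combines with the letter scale in section 5.3, the computation can be sketched as follows. This is our own sketch (the function names are hypothetical, not part of the syllabus):

```python
def final_percentage(exam1, exam2, project, labs, homework, daily):
    """Combine category scores (each 0-100) using the section 5.2 weights."""
    return (0.25 * exam1 + 0.25 * exam2 + 0.20 * project
            + 0.10 * labs + 0.10 * homework + 0.10 * daily)

def letter_grade(pct):
    """Map a final percentage to a letter using the section 5.3 scale."""
    cutoffs = [(97, "A+"), (93, "A"), (90, "A-"), (87, "B+"), (83, "B"),
               (80, "B-"), (77, "C+"), (73, "C"), (70, "C-"),
               (67, "D+"), (63, "D"), (60, "D-")]
    for cutoff, letter in cutoffs:
        if pct >= cutoff:
            return letter
    return "F"
```

For example, a student scoring 85 on both exams, 90 on the project, and 100 on labs, homework, and daily assignments ends up at 90.5%, which the scale maps to an A-.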

6.0 Resource Material


6.1 Textbooks and/or other required readings used in course. Rick Lehtinen, Deborah Russell, and G.T. Gangemi Sr., Computer Security Basics (Second Edition), O'Reilly, 2006.
6.2 Other suggested reading materials, if any.
  6.2.1 Ross Anderson, Security Engineering, Wiley, 2001.
  6.2.2 Bruce Schneier, Beyond Fear, Copernicus Books, 2003.
6.3 Other sources of information. Research publications may be distributed and studied to better understand the topics in question.



CS1 with Games and an Emphasis on TDD and Unit Testing: Piling a Trend Upon a Trend
Ville Isomöttönen and Vesa Lappalainen

This article studies how CS1 students responded to two recent trends in programming education: TDD-like testing, which we applied to procedural programming tasks, and game contextualization. Our main

conclusions are: (1) To make students realize the importance of test writing, we need to design more programming tasks where TDD-like tests are relevant. (2) Merely working with a simple game programming library can inhibit the learning of basic programming concepts. (3) While the game development course component motivated students, they also reported that teacher support and well-functioning course arrangements are very important.



INTRODUCTION

During the past five years, we have made two major changes to our introductory programming course in computer science (CS1). First, we have developed a tool with which we can introduce students to unit testing and test-driven development (TDD) [10]. Second, we have adopted a game development component. We have developed a programming library that allows beginner programmers to implement their own games [5]. We first used games in our K-12 outreach program, and subsequently introduced a similar contextualization to our CS1. Both of these changes follow recent trends in programming education. It is not difficult to see that computer science (CS) education is influenced by trends in the rapidly developing field of technology. However, reflection on why we, as educators, emphasized one aspect yesterday and emphasize another today is missing. This is the motivation of the present paper. It was just a while ago that we embraced testing in our introductory programming course, and now we are occupied with game contextualization. We need to learn where we stand with regard to these shifts in focus. This paper is a discursive reflection on these changes, conducted by applying content analysis to students' course feedback. Instead of presenting a controlled experimental study, the paper is a combination of content analysis and teacher reflection practiced over a period of several years.

Related Work

Unit testing and TDD have received a lot of educational attention as a preferred and integral part of programming. In several studies, TDD has been regarded as the most difficult XP practice to learn [8, 18, 21]. Many students believe in its benefits even if they would not use it voluntarily or after their educational training [17, 6]. Some authors have reported on the successful inclusion of TDD at the CS1 level [7], while others report that it imposes too high a technical or cognitive load [16, 8]. Close to our present interests are those sections of the literature that discuss the role of TDD and testing in close relation to teaching and learning. Janzen and Saiedian [7] speak of test-driven learning (TDL), where writing automated tests is associated with teaching by example. Their motivation is to achieve improved programming conventions among their students, meaning improved code quality. Edwards [3] suggests that writing tests in a test-first manner could change beginner programmers' habit of trial-and-error into a habit of reflection-in-action, which would improve their understanding of program behavior. Wellington et al. [22] report that blind testing, where students' assignments are evaluated by tests that are not available to the students, does not support
learning properly. Rather, the students need to be able to review ready-made tests, preferably writing their own tests, to benefit from TDD in problem solving. They also point out that designing tests first was a particularly suitable strategy for weaker students, as tests tend to be simpler to start when compared to the desired functionalities. Overall, the claims advanced for the educational benefits of TDD and unit testing in the CS1 context appear to amount to students improved understanding of program behavior. A current CS1 trend, perhaps a more topical one than TDD and unit testing, is game contextualization. Studies where games have had one role or another in CS1 have consistently reported positive experiences. Games can attract both males and females [12, 15], and students become confident in their own learning abilities with games [12]. Games appear to have a good fit with object-oriented programming, leading to a natural analysis under the OO approach [11]. Graphical games tend to match well with the constructivist view of learning, as the visual experimentation (that is involved) is likely to help in developing internal models of programming concepts [15]. Games provide meaningful study content as students can share their games within their own social network [19]. Some experiences indicate that the use of games has clearly improved retention numbers without sacrificing any technical depth of programming [12]. Improved retention does not always emerge [1], while improved student performance is quite often reported. The paper by Rajajavivarma [19] suggests some reasons for this improved performance. That is, with games students develop tenacity in their learning efforts, and consequently can overcome many known beginner challenges. For example, students unaffectedly focus on design questions and keep testing their products until they work properly. 
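Wellington et al.'s point that tests tend to be simpler to start than the desired functionality is easy to see in a minimal test-first sketch. The example below is our own neutral illustration in Python, not code from the cited studies or from the authors' C# course: the test states the desired behavior before the function exists, then just enough code is written to make it pass.

```python
# Step 1 (test first): specify the desired behavior as assertions
# before any implementation exists. Running this now would fail
# with a NameError, which is the "red" step of TDD.
def test_median():
    assert median([3, 1, 2]) == 2
    assert median([4, 1, 2, 3]) == 2.5

# Step 2 ("green"): write just enough code to make the test pass.
def median(values):
    """Return the middle value of a list of numbers."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

test_median()  # passes silently once the implementation is in place
```

Note how the test is shorter and simpler than the implementation, which is the property that makes test-first a gentle entry point for weaker students.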
The complexity of implementing attractive games is regarded as the key challenge of game development in introductory programming courses [4]. However, the literature indicates that this challenge has been mitigated by tooling the game development properly [12, 5], by letting the students start with a pre-programmed skeleton [14, 15], and by emphasizing games that are of tolerable size [9]. A games-related introductory course, of course, may necessitate curricular changes. For example, Leska and Rabung [11] point out that some of their usual CS1 content was not covered, including file I/O, exception handling, and try-catch blocks. We found no focused discussions with a joint emphasis on testing and game contextualization in the CS1 context.

The Course

Our CS1 is a six-credit course, which is equal to 160 hours of work. Lectures take up 48 hours, leaving the remaining time for programming exercises and a course assignment. The objective of the course is to learn procedural programming, which principally means learning to write a program using conditional and looping clauses, and methods as procedures. The current version of the course uses


the C# programming language and is therefore inevitably involved with .NET Framework objects and our game programming library objects. However, while students use objects, they are not particularly expected to design and implement their own objects. The course ends with an exam, a sufficient number of completed tasks during each week, and a course assignment, which at present is a game programmed by the student. Each of the weekly exercises consists of six basic-level programming tasks and, additionally, of bonus-level (may require extra studying) and guru-level (impossible without extra studying) tasks. By writing automated tests for any of these tasks, students are rewarded with extra points. Students can achieve the best grade without writing any tests. Ready-made tests are occasionally included as guidance at all task levels. TDD-style test writing is not included in the principal learning goals of the faculty's CS1 teaching. We introduced it as a potentially good learning opportunity and wanted to study how students responded to it at the introductory level. Automated tests are written with the ComTest tool, with which, using a very simple syntax, a programmer writes tests directly into the C# comments. The Java implementation of the tool is presented in [10]. The lectures are based on authentic programming, and the lecturer writes tests in a test-first manner throughout the course. The course assignment, in the form of game development, is based on the Jypeli programming library. This event-driven library is built on the Microsoft XNA Framework and utilizes the Physics2D.Net library. Similar to ComTest, it allows programmers to achieve a great deal with a very simple syntax: a simple game can be programmed without loops and if statements. The Jypeli library is described in [5]. Our CS1 instances during 2008-2010, which we also refer to in this paper, used Java. In 2008 and 2009, we included some graphical programming.
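For readers unfamiliar with comment-embedded tests, ComTest's idea is similar in spirit to Python's doctest module, shown below as a generic illustration. This is not ComTest's actual C# syntax; it only demonstrates the concept of expected interactions living next to the code and doubling as executable tests.

```python
import doctest

def cube(x):
    """Return x raised to the third power.

    The examples below live in the docstring and double as
    executable tests, much like ComTest's comment-embedded tests:

    >>> cube(2)
    8
    >>> cube(-3)
    -27
    """
    return x ** 3

# Run every embedded example as a test; any mismatch between the
# expected and actual output is reported as a failure.
doctest.testmod()
```

The appeal for beginners is the same in both tools: the test syntax is little more than "call, then expected result," so the barrier to writing a first test is very low.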
In 2010, we included more media-related exercises in the form of simple image and music manipulation. Finally, in 2011, the course was heavily based on game development and used C# with the Jypeli library. scope of the present paper. Instead, we focus on the aspects that the students raised when they were asked about the pros and cons of the course, which we studied with open-ended questions. Secondly, we focus on the students opinions specifically on test writing and the game development component, which we studied with both a Likert scale and open-ended questions. Answering the survey was not compulsory. The numbers of respondents vary with the different questions and are given along with the results. We refer to this data as the course feedback survey. We additionally refer to a survey by which we have examined what aspects in general students relate to programming both before CS1 (pre) and after CS1 (post). We should note that this survey does not relate to the course feedback survey above. We have collected this research data during the past three years, and the same survey has also been administered to more advanced students. In this paper, we refer to our autumn 2010 CS1 and autumn 2011 CS1, using the terms 2010 pre-survey, 2010 post-survey, 2011 pre-survey, and 2011 post-survey. The same teacher ran these two courses. The first included the emphasis on unit testing and TDD, and the latter added the game development component. Table 1 presents the survey questions and the numbers of respondents during the two course instances. We refer only to those sections of this data that can shed light on our emphasis on testing and game development. A full qualitative analysis of these pre- and post-survey data will be published as a separate study. The authors crosschecked all interpretations of the themes found in the qualitative data.
Table 1: Pre- and post-survey details

Questions (asked in all four surveys):
 What is programming?
 What is important in programming?
 What skills are needed in programming?
 What aspects of programming do you like/dislike?
 If you have previously taken programming courses, how has your view of programming been changed by these courses?
 What programming languages do you know/prefer?
 Describe your previous experience in programming.

Survey              Respondents
2010 pre-survey      89
2010 post-survey     58
2011 pre-survey     155
2011 post-survey    132


This section discusses our current CS1 based on a content analysis applied to the students' responses. The data we most often refer to is the course feedback data collected by the local student organization at the end of our game-themed course instance (2011). This survey included several general questions about course arrangements (lectures, lab sessions, assignments, etc.) that were not within the scope of the present paper.

Results

4.1 The Emphasis on TDD and Unit Level Testing


Table 2 displays the students' responses (course feedback survey) regarding their own test-writing activity. First, the responses varied widely.

64 acm Inroads 2012 September Vol. 3 No. 3

comprehensive articles

Table 2: Course feedback survey: "I wrote ComTest tests..." (1 = totally disagree, 5 = totally agree, N = 81)

Likert score    1    2    3    4    5
N              26    9    8   16   22

Altogether 55 of the 81 respondents (68%) had written tests (Likert scores 2-5). Of these 55 respondents, 38 (69%) had written many tests (Likert scores 4 and 5). Our analysis of the respondents' open-ended comments indicated that 35 of the 55 (64%) had found that writing tests helped them to solve problems or made them think more broadly and carefully about the code they were writing:

[Student A] "I solved the task on several occasions with the help of ComTest. A few times I encountered issues [using ComTest] that I would otherwise have missed."

[Student B] "The tests helped me to improve my understanding of the code. Although I had written working code, I had not fully understood it."

The themes we identified in the open-ended answers that relate to the Likert score 1 are displayed in Table 3. The main complaint reported by the students is that they found test writing to be useless (of little or no value). These students had tried to adopt test writing, but found it to add too much extra content or to be pointless relative to the work it required. Experiencing difficulties with the installation and deployment of a testing tool can also become a barrier to test writing; initial difficulties of this kind led students to ignore test writing for the remainder of the course. Yadin [23] reported similar observations; he found that stability issues with a visualization tool made some students stop using the tool, even at the cost of a lower grade.
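As a quick sanity check, the aggregate figures quoted above follow directly from the Table 2 counts. A minimal sketch (in Python rather than the course's C#):

```python
# Likert counts from Table 2 (1 = totally disagree ... 5 = totally agree)
counts = {1: 26, 2: 9, 3: 8, 4: 16, 5: 22}

total = sum(counts.values())                        # all respondents
wrote_tests = sum(counts[s] for s in (2, 3, 4, 5))  # wrote at least some tests
wrote_many = counts[4] + counts[5]                  # wrote many tests

print(total, wrote_tests, wrote_many)               # 81 55 38
print(round(100 * wrote_tests / total))             # 68 (% who wrote tests)
print(round(100 * wrote_many / wrote_tests))        # 69 (% of test writers who wrote many)
```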
Table 3: Course feedback survey: themes indicated by the students who did not really end up writing tests

Theme                         N
Uselessness                  11
Installation & deployment     4
Laziness                      1
No comments                  10

 Some students regarded ready-made tests as more helpful than test writing. Ready-made tests already assisted the students in thinking about the programming tasks broadly and did not frustrate them, as test writing with small tasks tended to do.
 Technical issues with tools should be avoided in order to engage beginner students.

A central question raised by these challenges is whether students consider test writing relevant to the content of CS1. We need to examine more closely what kinds of programming tasks are particularly useful for test writing, and thus ensure that the content is relevant for the students. This, we assume, could help more students to overcome the learning curve. Rather than requiring continuous test writing, we should find a good balance between ready-made tests and tests the students write themselves. While reinforcing a continuous test-writing routine could be an important educational goal for CS1, it is also important to engage beginners with relevant content. In the future, we will scaffold test writing with tasks where students complete a ready-made test and then write the required functionality. We also need to communicate the role of automated tests more clearly in relation to debugging. As noted in the study by Sanders [20], beginner students may have difficulties with test writing when they do not yet know what they can do by programming. This might explain why some of our students started to refer to debugging when they were asked about test writing. However, as many student comments indicate that writing tests contributed to their understanding of the problem and/or the code, we conclude that test writing can be helpful for beginners if its role is clearly communicated in relation to other programming tasks.
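The planned scaffolding, handing students a ready-made test and asking them to write the functionality that makes it pass, could look like the following sketch. The task, the function name, and the use of Python (rather than the course's C#/ComTest setup) are all hypothetical:

```python
# Hypothetical scaffolded exercise: the test below is handed out ready-made;
# the student's job is to implement count_vowels so that the test passes.

def test_count_vowels():
    # Ready-made test, provided by the teacher.
    assert count_vowels("game") == 2
    assert count_vowels("xyz") == 0
    assert count_vowels("") == 0

def count_vowels(text):
    # Student-written part: count the vowels in a string.
    return sum(1 for ch in text.lower() if ch in "aeiou")

test_count_vowels()
print("ready-made test passed")
```

Starting from a given test keeps the goal concrete while still exposing students to reading and running automated tests before writing their own.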
We also examined whether the students associated testing with programming when they were asked, in general terms, what aspects they relate to programming (see Table 1). Given our course arrangements, where the teacher continuously writes tests during the lectures and test writing is encouraged with bonus points in the weekly exercises, the students' end-of-course conceptions of programming include testing to some degree. While none of the students explicitly referred to testing in the 2010 pre-survey, 10/58 (17.2%) mentioned it in the post-survey the same year. During the 2011 offering, these numbers were 5/155 in the pre-survey and 22/132 (16.7%) in the post-survey. In 2011, the game development component was included. It seems, surprisingly, that the presence of game development has not really masked our emphasis on testing.

Overall, the pedagogical challenges that emerge from the students' open-ended comments, across the whole Likert scale, are the following:
 There appears to be a learning curve in test writing, at the end of which students are able to appreciate it more. Those who wrote many tests reported test writing to be beneficial: of the 35 respondents who referred to the benefits of test writing, 33 were in the group of 38 respondents who wrote many tests.
 Students perceived test writing to be irrelevant due to the small size of the programming tasks. This was a central theme not only among those who did not really write tests, but also among those (four students) who wrote many tests but did not really regard test writing as helpful.
 Students prefer debugging, or just running the program, to writing tests. This relates to the previous item, but it also appears that students have confused the role of automated tests with that of debugging.

4.2 The Game Development Component

4.2.1 Should We Use Games? Based on the 2011 pre-survey, we found that many students want to create by means of programming. These students spoke of the unlimited possibilities they see in programming, and of the possibility of creating something new with their own hands. Altogether, this theme was


present in 61 (39%) of the 155 respondents' answers to the question on what aspects of programming they like. Unsurprisingly, in light of this desire to create, we found approving comments on the game development component in the course feedback survey. Using a five-point Likert scale, the students were asked if they would have preferred a more traditional course with no game development. The numbers, displayed in Table 4, suggest that game development can match the motivations of today's CS students.
Table 4: Course feedback survey: "If I could choose now, I would rather take a traditional CS1 with Java, instead of this Jypeli-based course."

Likert score    1    2    3    4    5
N              28   18   26    6    3

The students' opinions on the game development component were also solicited with the open-ended item "Feel free to comment on the game theme of the course." Analysis of the students' positive responses showed that game development was regarded as:
 topical, joyful, inspiring, and motivating;
 engaging, in the sense that the students tended to spend a lot of extra time on their course work;
 effective, in the sense that programming concepts became concrete to students;
 approachable, in the sense that games are known by the students;
 enabling, in the sense that game development gives a context for advanced students to continue beyond course topics; and
 empowering, in the sense that the students notice they can create and complete something real and possibly useful.

However, the game development component was not welcomed unreservedly. As shown in Table 4, the second biggest group, 26/81 (32%), chose the midpoint on the Likert scale. We focus on the challenges presented by the game contextualization of the course, as these emerged from the students' answers to the open-ended

question above, in Section 4.2.2. Figure 1 illustrates the students' CS1 performance across the years 2008-2011. We find that the number of students who returned weekly exercises has increased annually. As described in Section 3, in 2008 and 2009 the course was implemented in Java with some graphical programming. In 2010, media-related exercises were added, using simple image and music manipulation. In 2011, the programming language was C# and the course was based on game development. By comparing the curves in Figure 1 with the students' positive responses on the game development component reported above (engaging, effective, etc.), we surmise that game development (or graphical programming with real-world relevance) has decreased dropouts. Based on the students' positive feedback and the performance shown in Figure 1, we conclude that the inclusion of the game development component has improved our course.

4.2.2 Challenges presented by the game development component

In the course feedback survey, we found no strong critique in the students' answers to the open-ended question about game development. The game theme of the course was consistently accepted as a student-friendly and topical teaching resource. However, quite a few students reported experiencing a conflict between games and their personal interests, or questioned the value of game development as a learning resource. Altogether, 20 students out of 62 respondents (32%) were critical of the inclusion of games in the course.1 This is wholly consistent with the students' Likert scale answers above: if we add those who gave 5 and 4, and half of those who gave 3, we obtain 22/81 or
1 The remaining answers (68%) were positive throughout, as described in Section 4.2.1.

Figure 1: CS1 students' performance across the years 2008-2011



27%. The major criticism (16 occurrences in the 20 answers) was that game development using the Jypeli library masks the learning of basic programming concepts. Students seem to see a connection between lectures and weekly exercises, but when it comes to working on the course assignment, using the specific (game) programming library, they fail to see a strong connection with the topics they have been lectured on. They are frustrated when they encounter library-related example code, as they are unable to comprehend all of it or to see the limits or possibilities of the library, and hence complain about spending time on figuring out the library. They say they would rather spend this time on deepening their understanding of basic programming concepts. We conclude from this that tooling game programming in the context of teaching CS1 may induce program comprehension-related issues.

1. Game development as a motivating factor (6/57, 11%)
2. Flexibility: the students appreciated the possibility of distance learning using videos of the lectures (9/57, 16%)
3. Support: the students appreciated the help and attention shown by the teaching staff (19/57, 33%)
4. Well-prepared course arrangements: the students appreciated the well-prepared materials, exercise sessions, etc. (17/57, 30%)
5. Illustrative and explicit teaching and scaffolding: the lecturer is dedicated to teaching; the grading of the difficulty of exercises is sufficiently gradual (20/57, 35%)
6. Practicality: the students preferred learning programming by programming (2/57, 4%)
7. A short, very positive comment (7/57, 12%)

While the use of a programming library can teach students to tolerate uncertainty, and give them routine with documented interfaces, it may mask some aspects of program behavior, impeding beginner programmers' learning. The teacher also observed this masking effect during the course. Interestingly, in our K-12 program the Jypeli library has clearly been an enabling element for the students and the whole course concept [5]. It must be the more pressing need to learn programming concepts that led to the frustration with figuring out the library reported by some of the CS1 students.

The second, much more rarely occurring theme (3 occurrences in the 20 answers) was experiencing the game component as a toy. These students would have preferred a more serious subject that they could associate with working life or with their background in science studies. The third theme (2 occurrences) concerned the students' preference for a particular technical environment. One student enjoyed the technical environment used in the course (C# and Visual Studio) and gave this as the reason for liking the game development component; the other did not like the game development task because the Jypeli library required the Windows environment.

Furthermore, by analyzing what the students reported when they were asked generally about the pros and cons of the course in the feedback survey, we noticed that the game development component was not the only major incentive for the students. Among the 57 students who answered the "pros" question, we found only 6 references to the game contextualization of the course. The themes we extracted from the students' answers are given in the numbered list above; note that many of the 57 students indicated more than one theme. We interpret items 2-5 to indicate that the students felt a genuine effort had been made towards them.
We assume that the vulnerability of young adults who are making the transition to university life [2] is likely to explain the students' emphasis on attention and support (item 3) and on well-functioning teaching and course arrangements (items 4 and 5). The necessity of a genuine psychological connection between a fosterer and a child is a well-known issue in developmental psychology; interestingly, the themes we raise here, in particular attention and support, indicate a similar aspect in the present case of adult education. A well-functioning course is likely to mean clarity and approachability to confused freshmen adapting to university life. In pondering student engagement, Leutenegger and Edgington [12] consider course and example content to be more primary questions than the selection of a programming paradigm. Based on our data, we conclude that support, attention, and functionality may also play a very important role in designing and teaching a course. The direct questions on the game component were in the last section of the electronic course feedback survey, where questions are answered progressively, meaning that answering the game-related questions should not explain the emphasis on the rather different aspects described above.

The students' answers to the "cons" question revealed a single dominant theme: workload. However, taking into account the otherwise positive feedback, we consider that, by and large, the students accept the course workload as an inevitable part of learning programming.


CS1 with Games and an Emphasis on TDD and Unit Testing: Piling a Trend Upon a Trend

Conclusions

We have discussed student responses to a CS1 course that incorporated two recent trends in programming education: game contextualization and an emphasis on TDD-like testing. First, our findings indicate that we need to design programming tasks relevant to test writing; in one sense, we need to increase real-life relevance. Second, we need to make the basic concepts of programming more explicit to students in the presence of the attractive game development component, in particular when using a programming library that itself necessitates active learning on the part of the students; in one sense, we need to monitor what drawbacks emerge from the inclusion of real-life relevance. In sum, we would say that while real-life relevance helps to engage beginner students, its side effects must be monitored and managed.

Interestingly, we noticed that the game development component was not the only major incentive for the students. We found that the students' experience of support and attention, and of a well-functioning course, dominated their overall course feedback. It might be that as we develop our courses we become active teachers-as-researchers, and that this may have a (let us hope positive) effect on students' responses and performance; this effect, however, is not really monitored in our concerns over dropouts and retention. In fact, it should be no surprise that many actions need to be taken to prevent dropout problems. We emphasize this by referring to Lewin's [13] conclusions on studying social problems. He had studied intergroup relations and concluded: "An attempt to improve intergroup relations has to face a wide variety of tasks. ... We are beginning to see that it is hopeless [our emphasis] to attack any one of these aspects of intergroup relations without considering the others."
Accordingly, based on our findings, we will seek ways to make the support structures of the course more explicit to students, on top of the usual considerations regarding paradigms, languages, tools, and content.
References
[1] J. D. Bayliss. The effects of games in CS1-3. Journal of Game Development, 2(2):7-17, 2007.
[2] R. Dyson and K. Renk. Freshmen adaptation to university life: Depressive symptoms, stress, and coping. Journal of Clinical Psychology, 62(10):1231-1244, 2006.
[3] S. H. Edwards. Using software testing to move students from trial-and-error to reflection-in-action. In Proceedings of the 35th SIGCSE Technical Symposium on Computer Science Education, SIGCSE '04, pages 26-30, New York, NY, 2004. ACM.
[4] R. Giguette. Pre-Games: Games designed to introduce CS1 and CS2 programming assignments. In Proceedings of the 34th SIGCSE Technical Symposium on Computer Science Education, SIGCSE '03, pages 288-292, New York, NY, 2003. ACM.
[5] V. Isomöttönen, A.-J. Lakanen, and V. Lappalainen. K-12 game programming course concept using textual programming. In Proceedings of the 42nd ACM Technical Symposium on Computer Science Education, SIGCSE '11, pages 459-464, New York, NY, 2011. ACM.
[6] D. Janzen and H. Saiedian. A leveled examination of test-driven development acceptance. In ICSE 2007: 29th International Conference on Software Engineering, pages 719-722, May 2007.
[7] D. S. Janzen and H. Saiedian. Test-driven learning: Intrinsic integration of testing into the CS/SE curriculum. In SIGCSE '06: Proceedings of the 37th SIGCSE Technical Symposium on Computer Science Education, pages 254-258, New York, NY, 2006. ACM.
[8] K. Keefe, J. Sheard, and M. Dick. Adopting XP practices for teaching object oriented programming. In ACE '06: Proceedings of the 8th Australasian Conference on Computing Education, pages 91-100, Darlinghurst, Australia, 2006. Australian Computer Society, Inc.
[9] S. Kurkovsky. Engaging students through mobile game development. SIGCSE Bulletin, 41:44-48, March 2009.
[10] V. Lappalainen, J. Itkonen, V. Isomöttönen, and S. Kollanus. ComTest: A tool to impart TDD and unit testing to introductory level programming. In ITiCSE '10: Proceedings of the Fifteenth Annual Conference on Innovation and Technology in Computer Science Education, pages 63-67, New York, NY, 2010. ACM.
[11] C. Leska and J. Rabung. Learning O-O concepts in CS I using game projects. In Proceedings of the 9th Annual SIGCSE Conference on Innovation and Technology in Computer Science Education, ITiCSE '04, page 237, New York, NY, 2004. ACM.
[12] S. Leutenegger and J. Edgington. A games first approach to teaching introductory programming. In Proceedings of the 38th SIGCSE Technical Symposium on Computer Science Education, SIGCSE '07, pages 115-118, New York, NY, 2007. ACM.
[13] K. Lewin. Action research and minority problems. Journal of Social Issues, 2(4):34-46, Nov. 1946.
[14] T. Lorenzen and W. Heilman. CS1 and CS2: Write computer games in Java! SIGCSE Bulletin, 34:99-100, December 2002.
[15] A. Luxton-Reilly and P. Denny. A simple framework for interactive games in CS1. SIGCSE Bulletin, 41:216-220, March 2009.

[16] W. Marrero and A. Settle. Testing first: Emphasizing testing in early programming courses. In ITiCSE '05: Proceedings of the 10th Annual SIGCSE Conference on Innovation and Technology in Computer Science Education, pages 4-8, New York, NY, 2005. ACM.
[17] G. Melnik and F. Maurer. A cross-program investigation of students' perceptions of agile methods. In ICSE '05: Proceedings of the 27th International Conference on Software Engineering, pages 481-488, New York, NY, 2005. ACM.
[18] M. M. Müller, J. Link, R. Sand, and G. Malpohl. Extreme programming in curriculum: Experiences from academia and industry. In Extreme Programming and Agile Processes in Software Engineering, volume 3092 of Lecture Notes in Computer Science, Berlin Heidelberg, 2004. Springer.
[19] R. Rajaravivarma. A games-based approach for teaching the introductory programming course. SIGCSE Bulletin, 37:98-102, December 2005.
[20] D. Sanders. Extreme programming: The student view. Computer Science Education, 12(3):235-250, 2002.
[21] C. Wellington. Managing a project course using extreme programming. In Frontiers in Education, FIE '05: Proceedings of the 35th Annual Conference, pages T3G-1, Oct. 2005.
[22] C. A. Wellington, T. H. Briggs, and C. D. Girard. Experiences using automated tests and test driven development in computer science I. In AGILE '07: Proceedings of AGILE 2007, pages 106-112, Washington, DC, 2007. IEEE Computer Society.
[23] A. Yadin. Reducing the dropout rate in an introductory programming course. ACM Inroads, 2(4):71-76, 2011.

Ville Isomöttönen and Vesa Lappalainen
Department of Mathematical Information Technology
University of Jyväskylä
FI-40014, Finland
ville.isomottonen@jyu.fi, vesal@jyu.fi

Categories and Subject Descriptors: K.3.2 [Computers and Education]: Computer and Information Science Education - Computer science education
General Terms: Experimentation, Human Factors
Keywords: CS1, game programming, TDD

DOI: 10.1145/2339055.2339073

2012 ACM 2153-2184/12/09 $15.00



SIGCSE Spotlight

Curt M. White
SIGCSE Trend Tracker

Hello, and welcome to SIGCSE Spotlight, an ongoing column that highlights and reflects upon the current trends within computing education and the SIGCSE community. In this quarter's issue we are going to (a) look at the custom textbooks that publishers can create for computing professors, (b) hear from Renée McCauley about the importance of the upcoming SIGCSE elections, and (c) look into the recent SIGCSE listserv topic of the relationship between the amount of study time and our computer science students.

At the recent SIGCSE Technical Symposium in Raleigh, I spoke with two different publishers (Pearson and Cengage) about custom textbooks. I initially approached Pearson and asked if they had a textbook I could use for a unique course that my department offers to our non-CS majors. The course involves databases, some statistics, a little probability, and some algorithms. To keep costs down, I didn't want students to purchase multiple books. Needless to say, they didn't have one; this is not surprising, as the course is very unusual. They instead offered the idea of a custom textbook. To create this custom text, they could take chapters, or even pieces of chapters, from any of the books in their large holdings (recall that Pearson is now the parent company for Scott Foresman, Prentice Hall, Addison-Wesley, Allyn and Bacon, Benjamin Cummings, and Longman). They could even accept chapters and/or materials that I had written. After all the chapters are brought together, they "scrub" the book: they repaginate and format all the chapters (including the ones I had written) to give the book a uniform look, and any references to "Chapter X" from an original book are altered accordingly. They can even create a table of contents and an index based on the new page numbers and content.
The book can then be offered in hardcover, softcover, wire-bound, three-hole-punch, or electronic form (for a number of platforms including iPad, Kno, Kindle, CafeScribe, Inkling, and iBooks). I asked how the pricing was set for the iBooks version; in other words, would the book really be offered for $14.99 as the press releases had indicated? Unfortunately, the $14.99 price is more typical of a grade school textbook than of a college-level textbook. College-level textbooks in electronic form, due to their lower sales volume, would continue to cost roughly one-half to two-thirds the cost of the paper version. After I spoke with Pearson, I spoke with Cengage and was told they offered a similar service. I suspect most, if not all, of the major publishing houses are now doing the same. If you are looking for a textbook and not finding exactly what you want, you might want to consider a custom book.

Now, let's hear from our SIGCSE chair, Renée McCauley.

* * *

Renée McCauley
SIGCSE Chair, 2010-2013

SIGCSE is an all-volunteer organization that depends on its members to organize conferences, run chapters and committees, edit publications, and do so much more. One way to contribute to SIGCSE is through service on the board that oversees all aspects of the organization. The SIGCSE Executive Committee and Advisory Board (collectively referred to as the SIGCSE board) consist of the immediate past chair and seven elected positions: chair, vice-chair, secretary, treasurer, and three at-large positions. Every three years, SIGCSE holds an election to fill these positions. The next election will be held in the spring of 2013. The SIGCSE bylaws (http://sigcse.org/about/bylaws) detail the responsibilities of the board members and how elections are conducted.

Curtain illustration: www.iStockphoto.com / Kateryna Potrokhova


If you would like to work to make SIGCSE better, I encourage you to consider running for a position on the SIGCSE board. I have found participation on the board to be extremely rewarding, and the time commitment to be reasonable. The board meets twice each year face-to-face and about four times a year by telephone. Terms last for three years. If you'd like more information on the commitment required, or if you think you'd like to serve, please contact Barbara Boucher Owens (owensb@southwestern.edu), who is the chair of the nominating committee. Nominations are being solicited now, as a slate of candidates must be submitted to ACM by January 2013. Please consider running for election. SIGCSE needs you!

In May of this year there was a flurry of activity on the SIGCSE listserv regarding an article in the Washington Post (http://www.washingtonpost.com/local/education/number-of-hours-students-study/2012/05/21/gIQA3viTgU_graphic.html?hpid=z3) stating that computer science majors (seniors) spend fewer hours studying than do many other majors. In particular, computer science majors ranked 17th. Architecture ranked first at 23.7 hours per week, followed by chemical engineering, physics, chemistry, art, nursing, and music. Computer science seniors apparently spent only 14.7 hours per week studying. Many of the follow-up email comments posed the following questions:

 What exactly was the study counting as studying? Homework? Writing code? Solving non-coding problems? Reading? Working on projects? Is there a difference? Maybe once an algorithm is understood, one does not need further time studying it?
 Did the study include those students earning Ds and Fs? Some feel that students who don't do well in computer science often continue on, while this is not likely in other subjects such as physics or chemistry. Maybe computer science students are a slightly different breed, and studying time has different meanings?
 Is there really a direct correlation between the amount of time spent studying and one's performance in a class? (Many seemed to think so.)
 Even if our computer science majors spend less time than some of the other listed majors, does that make our CS majors any less successful in the workplace?

Clearly this issue struck a chord. Finally, let me remind you that the upcoming ICER conference will be held September 10-12 at Auckland University of Technology in Auckland, New Zealand. Now is your chance to see beautiful New Zealand and attend a SIGCSE conference. There will be papers presented along with ICER Lightning Talks, as well as the Doctoral Consortium and a keynote speaker. For more information, visit the SIGCSE website.

u u u u u
As always, if you have any interesting/unusual/exciting/profound computing education-based ideas or results that you would like to share with the ACM community, send me an email at cwhite@cs.depaul.edu.
DOI: 10.1145/2339055.2339074 Copyright held by authors.

72 acm Inroads 2012 September Vol. 3 No.3

Attention: Undergraduate and Graduate Computing Students

There's an ACM Student Research Competition at a SIG conference of interest to you! It's hard to put the ACM Student Research Competition (SRC) experience into words, but we'll try

"Participating in the SRC gave me the opportunity to attend a computing conference that focused on my area of interest. I found it very rewarding to discuss my work with a community that shares my research interests."
Sarah Chasins, Swarthmore College, SPLASH 2011 SRC

"Presenting my research at the premier high performance computing conference (SC11) helped boost my presentation skills, and provided an opportunity to practice presenting my work to a wide range of audiences. The feedback and award I received was also positively reinforcing."
Sam Ade Jacobs, Texas A&M University, SC 2011 SRC

"The SRC gave me the opportunity to present my research in important conferences in my field while I am still in the early stage of my Ph.D. With the advice and questions I got from the poster session, I kept improving my research."
Xusheng Xiao, North Carolina State University, ICSE 2011

"SRC participation has made my work visible to the research community. Moreover, winning the SRC at SIGCSE with positive feedback has definitely brought confidence to help me extend my work."
Tanmoy Sarkar, Iowa State University, SIGCSE 2011

"SRC was a great opportunity to get feedback on my research from an international audience. As an undergrad, it was equally valuable to meet professors and graduate students working in my field of research."
Sanjana Prasain, University of Washington, ASSETS 2011 SRC

"Participating in the student research competition was a useful exercise in communicating the impact of my research to new audiences. It was also valuable to gain new, diverse perspectives to help frame a path for completing my Ph.D. work."
Kevin Buffardi, Virginia Tech, SIGCSE 2012 SRC

"From the competition, I learned how to communicate my research ideas effectively through different media. It was a fun experience and as a result I am more confident in my research ability."
Tiffany Inglis, University of Waterloo, Grace Hopper 2011

Check the SRC Submission Dates:

http://src.acm.org/submissions.html

u u u

Participants receive: $500 (USD) travel expenses
Winners receive: a handsome medal, a monetary award, and eligibility for the SRC Grand Finals
Grand Finals winners receive: a handsome certificate and a monetary award at the ACM Awards Banquet

Questions?
Contact Nanette Hernandez, ACM's SRC Coordinator: hernandez@hq.acm.org

PYTHON MIGRATION PATTERN

[CS 0/1] Best-reviewed Python textbook on Amazon
[CS 2, ALL PYTHON] Data structures book using Python
[CS 2, PYTHON TO C++] From Python to C++
Python Programming: An Introduction to Computer Science by John Zelle

Also available . . .

Problem Solving with Algorithms and Data Structures Using Python by Bradley Miller & David Ranum

Data Structures and Algorithms Using Python and C++ by David Reed & John Zelle

Coming soon . . .
Discrete Mathematics and Functional Programming by Thomas VanDrunen
. . . an innovative textbook that intertwines discrete structures with programming in the functional paradigm
REQUEST REVIEW COPIES/MORE INFORMATION

WWW.FBEEDLE.COM 1-800-322-2665
Formal Language: A Practical Introduction by Adam Webber
Modern Programming Languages: A Practical Introduction by Adam Webber
FRANKLIN, BEEDLE & ASSOCIATES PROMOTES THESE FINE BOOKS AND SEEKS INNOVATIVE PROJECTS FOR COMPUTER SCIENCE COURSES. PLEASE CONTACT US FOR MORE INFORMATION.

FRANKLIN, BEEDLE & ASSOCIATES INC. 22462 SW WASHINGTON STREET SHERWOOD, OREGON 97140 [INDEPENDENT PUBLISHERS SINCE 1985]

photo: Micha L. Rieser