Computer Science Seminar Series
The Computer Science Seminar Lecture Series is a collection of weekly lectures on topics at the forefront of contemporary computer science research, given by speakers expert in their fields of study. These speakers come from a variety of technical and geographic backgrounds, with many traveling from other universities across the globe to share their knowledge. The lectures were recorded with an HD video camera, edited with Apple Final Cut Pro X, and output as .mp4 video files economical to store and stream within the university's limited bandwidth and disk space.
Recent Submissions
- The Case for Multidisciplinary Computer Science. Burge, Jamika (2015-04-17). Multidisciplinary computer science approaches problem solving from a range of disciplines. Arguably, some of today’s most salient areas of technical research – social computing, data analytics (“big data”), and cyber security – are multidisciplinary in nature. Moreover, multidisciplinary computing has the unique quality of empowering technology users in ways that did not exist just ten years ago (think Google Glass and quantified-self applications). In this talk, I share a series of research projects that have contributed to the line of multidisciplinary computing research. I will also share lessons learned and possible directions for future research.
- Promoting Service Design as a Critical Lens within HCI. Zimmerman, John (2015-02-06). HCI has a history of adding critical lenses in reaction to the kinds of things it makes. It started with a narrow focus on usability and then added a user-centered design (UCD) lens in order to create tools that made people more effective at work. More recently it added a user experience (UX) lens in order to design products consumers desire. Today HCI promotes UCD and UX as core to what we do and who we are. Interestingly, work in both HCI research and practice involves new things that conflict with this identity and with the product-centric focus of UCD and UX. First, traditional brick-and-mortar services increasingly ask HCI teams to make customer-facing interfaces. This, along with the rapid growth in Software as a Service, means today’s HCI teams make more services than products. Second, work on social computing and on designing for social change frequently asks HCI teams to make systems that strongly influence or even radically change users’ behaviors in ways that have nothing to do with meeting their needs or desires. This work is often at odds with the core tenets of UCD and UX and with the idea that HCI plays the role of user advocate. I suggest that HCI needs to evolve by adding service design as a critical new lens. Service design offers several benefits. It employs a design process meant to result in a service. In addition, this process helps design teams envision systemic solutions that meet the needs of many stakeholders linked together in complex relationships, providing a better fit to the challenges found in social computing and in design for social innovation. In this talk I discuss how HCI has historically evolved to meet changing needs. I then discuss service design as a distinct design practice. Finally, I detail how service design helps address challenges in designing services, social computing systems, and systems intended to drive social change. Bio: John Zimmerman is an interaction designer and researcher with a joint appointment as an Associate Professor at Carnegie Mellon’s HCI Institute and School of Design. His research has four main themes: (i) how to drive innovation of public services using social computing; (ii) how changing system behavior can influence users’ perceptions of value for the system; (iii) research through design in HCI; and (iv) interaction with intelligent systems. Prior to joining Carnegie Mellon, John worked at Philips Research, investigating future interactive TV products and services.
- Virginia Tech Distinguished Lecture Series: Eric Lyon. Lyon, Eric (2015-04-24). Associate Professor, Music Technology and Composition. Eric Lyon is a composer and computer music researcher. Major areas of focus include computer chamber music, spatial orchestration, and articulated noise composition. Recent compositions include "Spirits", a 43-channel electroacoustic piano composition for the ZKM Kubus, "Noise Variations" for ensemble mise-en, and “The Book of Strange Positions” for the violin duo String Noise. Subject: Distinguished Lecture Series
- Virginia Tech Computer Science Faculty Interview Series: Dr. Kurt Luther. Luther, Kurt (2015-04-16). I’m an assistant professor of Computer Science, a member of the Center for Human-Computer Interaction, a fellow of the Institute for Creativity, Arts, and Technology, and a faculty affiliate in Human Centered Design, all at Virginia Tech. My teaching and research focus on human–computer interaction (HCI), social computing, and crowdsourcing. I build and study systems that leverage the Internet to connect geographically distributed people with diverse skills and backgrounds. These social technologies can help people be more creative, make new discoveries, and solve complex problems. My work has applications across many domains, including movie and game production, graphic design, and citizen science. I’m also interested in connections to the digital humanities, especially history. Before coming to Virginia Tech, I was a postdoc in the HCI Institute at Carnegie Mellon University. I received my Ph.D. in Human-Centered Computing from Georgia Tech, where I was a Foley Scholar, and my undergraduate degree is from Purdue University, where I studied computer graphics, art, and design. I’ve also worked at Microsoft Research, IBM Research, Newgrounds, and YouTube (Google). http://www.kurtluther.com/
- Meta-models of Confidentiality. Kafura, Dennis G. (2013-05-06). Professor Dennis Kafura's presentation on meta-models of confidentiality. This presentation focuses on information security, whose three key parts are confidentiality, integrity, and availability. Confidentiality is ensuring information is seen by the "right" people. Integrity is ensuring the "right" information is seen. Availability is ensuring information can be seen. Professor Kafura's webpage: http://people.cs.vt.edu/~kafura/.
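To make the triad concrete, here is a minimal sketch (my illustration, not from the talk) pairing each property with a standard mechanism: a one-time pad for confidentiality, a keyed MAC for integrity, and replication for availability. The message, keys, and replica scheme are hypothetical stand-ins for real systems.

```python
# Toy illustration (not from the talk) of confidentiality, integrity,
# and availability; the message, keys, and one-time-pad scheme here are
# hypothetical stand-ins for production mechanisms.
import hmac, hashlib, secrets

message = b"patient record 42"

# Confidentiality: only holders of the key can read the message.
# A one-time pad is the simplest correct cipher for a single message.
pad = secrets.token_bytes(len(message))
ciphertext = bytes(m ^ p for m, p in zip(message, pad))
assert bytes(c ^ p for c, p in zip(ciphertext, pad)) == message

# Integrity: a keyed MAC lets the receiver detect tampering,
# so the "right" information is what is actually seen.
mac_key = secrets.token_bytes(32)
tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
assert hmac.compare_digest(tag, hmac.new(mac_key, ciphertext, hashlib.sha256).digest())

# Availability: keep redundant copies so the information can still be
# seen if one store fails; here, trivially, two replicas.
replicas = [ciphertext, ciphertext]
```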
- Science at Extreme Scale: Challenges in Data Management, Analysis, and Visualization. Nowell, Lucy Terry (2013-05-06). Management, analysis and visualization of extreme-scale scientific data will undergo radical change during the coming decade. Coupled with changes in the hardware architecture of next-generation supercomputers, explosive growth in the volume of scientific data presents a host of challenges to researchers in computer science, mathematics and statistics, and application sciences. Failure to develop new data management, analysis and visualization technologies that operate effectively on the changing architecture will cripple scientific discovery and put national security at risk. Using examples from climate science, Dr. Lucy Nowell will explore the technical and scientific drivers and opportunities for data science research funded by the Advanced Scientific Computing Research program in the Department of Energy’s Office of Science. BIO: Dr. Lucy Nowell is a Computer Scientist and Program Manager for the Advanced Scientific Computing Research (ASCR) program office in the Department of Energy’s (DOE) Office of Science. While her primary focus is on scientific data management, analysis and visualization, her portfolio spans the spectrum of ASCR computer science interests, including supercomputer architecture, programming models, operating and runtime systems, and file systems and input/output research. Before moving to DOE in 2009, Dr. Nowell was a Chief Scientist in the Information Analytics Group at Pacific Northwest National Laboratory (PNNL). On detail from PNNL, she held a two-year assignment as a Program Director for the National Science Foundation’s Office of Cyberinfrastructure, where her program responsibilities included Sustainable Digital Data Preservation and Access Network Partners (DataNet), Community-based Data Interoperability Networks (INTEROP), Software Development for Cyberinfrastructure (SDCI) and Strategic Technologies for Cyberinfrastructure (STCI). At PNNL, her research centered on applying her knowledge of visual design, perceptual psychology, human-computer interaction, and information storage and retrieval to problems of understanding and navigating in very large information spaces, including digital libraries. She holds several patents in information visualization technologies. Dr. Nowell joined PNNL in August 1998 after a career as a professor at Lynchburg College in Virginia, where she taught a wide variety of courses in Computer Science and Theatre. She also headed the Theatre program and later chaired the Computer Science Department. While pursuing her Master of Science and Doctor of Philosophy degrees in Computer Science at Virginia Tech, she worked as a Research Scientist in the Digital Libraries Research Laboratory and also interned with the Information Access team at IBM's T. J. Watson Research Center in Hawthorne, NY. She also has a Master of Fine Arts degree in Drama from the University of New Orleans and the Master of Arts and Bachelor of Arts degrees in Theatre from the University of Alabama.
- Virginia Tech Student Presentations. Endert, Alex; Sathre, Paul (2013-05-06). Sathre: Parallel computing is everywhere, though we often don't realize it. It is no longer something found only in supercomputers; devices such as laptops, tablets, and even cell phones take advantage of parallel computing today. In this presentation Sathre gives an overview of how an accelerator (GPU) is used in comparison to a CPU. He also talks about translation from CUDA to OpenCL. Endert's Website: http://people.cs.vt.edu/aendert/Alex_Endert/Home.html
- Cloud Computing and People with Disabilities. Lewis, Clayton (2013-04-25). The rise of services supported "in the cloud", on the worldwide population of interconnected computers, is revolutionizing many businesses, while providing consumers with increased convenience at lower cost. What does this revolution in technology offer to people with disabilities? This talk will describe work being done in the USA and abroad to realize the Global Public Inclusive Infrastructure (http://GPII.net), using the cloud to make it much easier for people to access online content and services in a way that meets their individual needs and preferences. It will also outline further implications of the cloud for improvement of services for people with disabilities, through advances in "Big Data" analytics, in data sharing technology, and in social software. Bio: Clayton Lewis is Professor of Computer Science and Fellow of the Institute of Cognitive Science at the University of Colorado. He is well known for his work (with students and colleagues) on evaluation methods in user interface design, including the thinking aloud and cognitive walkthrough methods. His recent work on technology for people with cognitive disabilities has been presented to the US Access Board Technical Advisory Committee, CSUN, RESNA, ACM ASSETS, and other forums, and he has served as Scientist in Residence at the Coleman Institute for Cognitive Disabilities. He is a member of the CHI Academy, recognizing his contributions to Human Computer Interaction. He is currently on leave from the University, serving as a consultant on cloud computing for the National Institute on Disability and Rehabilitation Research. Clayton's Website: http://spot.colorado.edu/~clayton/
- Heterogeneous Parallel Computing. Feng, Wu-chun (2013-04-25). With processor core counts doubling every 18-24 months and penetrating all markets from high-end servers in supercomputers to desktops and laptops down to even mobile phones, we sit at the dawn of a world of ubiquitous parallelism, one where extracting performance via parallelism is paramount. That is, the "free lunch" to better performance, where programmers could rely on substantial increases in single-threaded performance to improve software, is over. The burden falls on developers to exploit parallel hardware for performance gains. But how do we lower the cost of extracting such parallelism, particularly in the face of the increasing heterogeneity of processing cores? To address this issue, this talk will present a vision for an ecosystem for delivering accessible and personalized supercomputing to the masses, one with a heterogeneity of (hardware) processing cores on a die or in a package, coupled with enabling software that tunes the parameters of the processing cores with respect to performance, power, and portability via a benchmark suite of computational dwarfs and applications. Bio: Wu Feng is an Elizabeth & James Turner Fellow and Associate Professor of Computer Science and Electrical & Computer Engineering at Virginia Tech, where he directs the Synergy Laboratory and serves as the VT site co-director of the NSF Center for High-Performance Reconfigurable Computing (CHREC). He is also an adjunct faculty in the Virginia Bioinformatics Institute at Virginia Tech and in the Dept. of Cancer Biology and Translational Science Institute at Wake Forest University. His research interests in efficient parallel computing sit at the synergistic intersection of computer architecture, systems software, middleware, and applications software and range from core computer science research to highly interdisciplinary research. Of recent note is a new supercomputing resource that will support the above research and research campus-wide: HokieSpeed. Dr. Feng received a B.S. in Electrical & Computer Engineering and Music (Honors) and an M.S. in Computer Engineering from Penn State University in 1988 and 1990, respectively. He earned a Ph.D. in Computer Science from the University of Illinois at Urbana-Champaign in 1996. His previous professional stints include IBM T.J. Watson Research Center, NASA Ames Research Center, Vosaic, University of Illinois at Urbana-Champaign, Purdue University, The Ohio State University, Orion Multisystems, and Los Alamos National Laboratory. He is a Distinguished Member of the ACM, Senior Member of the IEEE, and two-time designee of HPCwire's Top People to Watch List. Feng's Website: http://people.cs.vt.edu/~feng/
- Scaling Secure Computation. Evans, David (2013-04-25). Alice and Bob meet in a campus bar in 2017. Being typical Virginia Tech students, they both have their genomes stored on their mobile devices and, before expending any unnecessary effort in courtship rituals, they want to perform a genetic analysis to ensure that their potential offspring would have strong immune systems and not be at risk for any recessive diseases. But Alice doesn't want Bob to learn about her risk for Alzheimer's disease, and Bob is worried a future employer might misuse knowledge of his propensity to alcoholism. Two-party secure computation provides a way to solve this problem. It allows two parties to compute a function that depends on inputs from both parties, but reveals nothing except the output of the function. A general solution to this problem has been known since Yao's pioneering work on garbled circuits in the 1980s, but only recently has it become conceivable to use this approach in real systems. Our group has developed a framework for building efficient and scalable secure computations that achieves orders of magnitude performance improvements over the best previous systems. In this talk, I'll describe the techniques we use to design scalable and efficient secure computation protocols and report on some recent results in improving the security and performance of secure computing applications. Bio: David Evans is an Associate Professor of Computer Science at the University of Virginia. He won the Outstanding Faculty Award from the State Council of Higher Education for Virginia in 2009, an All-University Teaching Award in 2008, and was Program Co-Chair for the 2009 and 2010 IEEE Symposia on Security and Privacy. He is the author of an open introductory computing textbook (http://www.computingbook.org) and has taught open CS101 and Applied Cryptography courses for Udacity, enrolling over 250,000 in his CS101 course (https://www.udacity.com/course/cs101). He has SB, SM and PhD degrees in Computer Science from MIT. Evans's Website: http://www.cs.virginia.edu/~evans/
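To give a flavor of the garbled-circuit idea the abstract references, here is a minimal sketch of garbling a single AND gate, assuming a toy hash-based cipher and try-every-row decryption; it is not the optimized protocol from the speaker's framework.

```python
# A minimal sketch of Yao-style garbling for one AND gate, in the spirit
# of the two-party computation described above. This is a teaching toy
# (hash-based "encryption", try-all-rows decryption), not Evans's system.
import hashlib, secrets, random

def H(ka, kb):
    return hashlib.sha256(ka + kb).digest()

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

TAG = bytes(16)  # known suffix marking a successful decryption

# One random 16-byte label per wire value: labels[wire][bit]
labels = {w: (secrets.token_bytes(16), secrets.token_bytes(16)) for w in "abc"}

# Garble AND: for every input combination, encrypt the matching output
# label (plus TAG) under the two corresponding input labels.
table = [xor(labels["c"][x & y] + TAG, H(labels["a"][x], labels["b"][y]))
         for x in (0, 1) for y in (0, 1)]
random.shuffle(table)  # hide which row belongs to which input combination

def evaluate(ka, kb):
    """Holding one label per input wire, recover exactly one output label."""
    for row in table:
        plain = xor(row, H(ka, kb))
        if plain.endswith(TAG):
            return plain[:16]
    raise ValueError("no row decrypted")

# Alice's bit is 1, Bob's bit is 1; the labels themselves reveal neither bit.
out = evaluate(labels["a"][1], labels["b"][1])
assert out == labels["c"][1]  # the label encoding AND(1, 1) = 1
```

In a full protocol the evaluator obtains the label for their own input bit via oblivious transfer, so neither party ever learns the other's input.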
- Design-Driven Software Development. Consel, Charles (2013-04-25). Raising the level of abstraction beyond programming is a very active research topic involving a range of areas, including software engineering, programming languages and formal verification. The challenge is to allow design dimensions of a software system, both functional and non-functional, to be expressed in a high-level way, instead of being encoded with a programming language. Such design dimensions can then be leveraged to verify conformance properties and to generate programming support. Our research takes up this challenge with an approach inspired by programming languages, introducing a full-fledged language for designing software systems and processing design descriptions both for verification and code generation purposes. Our approach is also inspired by domain-specific languages in that it defines a conceptual framework to guide software development. Lastly, to make our approach practical to software developers, we introduce a methodology and a suite of tools covering the development life-cycle. This talk gives an overview of our approach and presents our main research results, illustrated by concrete examples. BIO: Charles Consel is a professor of Computer Science at University of Bordeaux. He served on the faculty of Yale University, Oregon Graduate Institute and the University of Rennes. He leads the Phoenix research group at INRIA. He has been designing and implementing Domain-Specific Languages (DSLs) for a variety of areas including device drivers, programmable routers, stream processing, and telephony services. These DSLs have been validated with real-sized applications and showed measurable benefits compared to applications written in general-purpose languages. His research interests include programming languages, software engineering, distributed systems and operating systems. Consel's Website: http://phoenix.inria.fr/charles-consel
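As a rough sketch of the design-language idea (my illustration, not Consel's actual tooling), the snippet below treats a small dict-based design description as input to both a conformance check and code generation; the component names and the spec format are hypothetical.

```python
# A minimal sketch (my illustration, not Consel's languages or tools) of
# the abstract's central idea: one high-level design description that is
# both checked for conformance and used to generate programming support.
design = {
    "components": {
        "sensor": {"provides": "temperature"},
        "logger": {"requires": "temperature"},
    },
}

def verify(spec):
    """Conformance property: every required service is provided by someone."""
    provided = {c["provides"] for c in spec["components"].values() if "provides" in c}
    for name, c in spec["components"].items():
        if "requires" in c and c["requires"] not in provided:
            raise ValueError(f"{name} requires unprovided service {c['requires']}")

def generate(spec):
    """Emit the skeleton code a developer would then fill in."""
    lines = []
    for name, c in spec["components"].items():
        lines.append(f"class {name.capitalize()}:")
        if "provides" in c:
            lines.append(f"    def get_{c['provides']}(self): ...  # to implement")
        if "requires" in c:
            lines.append(f"    def on_{c['requires']}(self, value): ...  # callback stub")
    return "\n".join(lines)

verify(design)           # fails fast if the design is inconsistent
print(generate(design))  # programming support derived from the design
```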
- Using Process Definition and Analysis Techniques to Reduce Errors and Improve Efficiency in the Delivery of Healthcare. Clarke, Lori; Osterweil, Leon (2013-04-25). As has been widely reported in the news lately, healthcare errors are a major cause of death and suffering, and healthcare inefficiencies result in escalating costs. In the University of Massachusetts Medical Safety Project, we are investigating whether process definition and analysis technologies can be used to help reduce healthcare errors and improve healthcare efficiency. Specifically, we are modeling healthcare processes using a process definition language and then analyzing these processes using model checking, fault-tree analysis, discrete event simulation, and other analysis techniques. Working with the UMASS School of Nursing and the Baystate Medical Center, we are undertaking in-depth case studies on error-prone and life-critical healthcare processes. In many ways, these processes are similar to complex, distributed systems with many interacting, concurrent threads and numerous exceptional conditions that must be handled carefully. This talk describes the technologies we are using, discusses case studies, and presents our observations and findings to date. Although presented in terms of the healthcare domain, the described approach could be applied to human-intensive processes in other domains to provide a technology-driven approach to process improvement. Bio: Lori A. Clarke is chair of the Department of Computer Science at the University of Massachusetts, Amherst, and co-director of the Laboratory for Advanced Software Engineering Research (LASER). She is a Fellow of the ACM and IEEE, and a board member of the Computing Research Association’s Committee on the Status of Women in Computing Research (CRA-W). She is a former vice chair of the Computing Research Association (CRA), co-chair of CRA-W, IEEE Publication Board member, associate editor of ACM TOPLAS and IEEE TSE, member of the CCR NSF advisory board, ACM SIGSOFT secretary/treasurer, vice-chair and chair, IEEE Distinguished Visitor, and ACM National Lecturer. She received the 2011 University of Massachusetts Outstanding Accomplishments in Research and Creative Activity Award, the 2009 College of Natural Sciences and Mathematics Outstanding Faculty Service Award, the 2004 University of Colorado, Boulder Distinguished Engineering Alumni Award, the 2002 SIGSOFT Distinguished Service Award, a 1993 University Faculty Fellowship, and the 1991 University of Massachusetts Distinguished Faculty Chancellor's Medal. She has written numerous papers, served on many program committees, and was program co-chair of the 14th and general chair of the 25th International Conference on Software Engineering. She has been a Principal Investigator on a number of NSF, DoD, and DARPA projects. Dr. Clarke's research is in the area of software engineering, primarily focusing on finite-state verification of concurrent systems and requirements engineering. Recently she has been investigating applying software engineering technologies to detect errors and vulnerabilities in complex processes in domains such as healthcare, scientific workflow, and digital government. She is also involved in several efforts to increase participation of underrepresented groups in computing research. Leon J.
Osterweil is a professor in the Department of Computer Science, co-director of the Laboratory for Advanced Software Engineering Research (LASER), and founding co-director of the Electronic Enterprise Institute, all at the University of Massachusetts Amherst, where he also served as Dean of the College of Natural Sciences and Mathematics from 2001 to 2005. Previously he had been a Professor in, and Chair of, Computer Science Departments at both the University of California, Irvine, and the University of Colorado, Boulder. Professor Osterweil was awarded the ACM SIGSOFT Outstanding Research Award for Lifetime Excellence in Research in 2003 and the ACM SIGSOFT Most Influential Educator Award in 2010. His ICSE 9 paper was honored as the Most Influential Paper of ICSE 9 in a ten-year retrospective. Prof. Osterweil is a Fellow of the Association for Computing Machinery. He is a member of the editorial boards of IEEE Transactions on Software Engineering, Software Process Improvement and Practice, Automated Software Engineering, and the International Journal of Software and Informatics. Prof. Osterweil chaired a National Academy of Sciences committee that studied strategies for improving electronic services provision for the US Social Security Administration, and is currently serving on an NAS committee investigating similar issues for the Centers for Medicare and Medicaid Services. He has presented keynote talks at a variety of meetings, including ICSE 9 (the Ninth International Conference on Software Engineering), where he introduced the concept of Process Programming. Prof. Osterweil has been the Program Committee Chair for such conferences as the 16th International Conference on Software Engineering, and was the General Chair of the Sixth ACM SIGSOFT Conference on the Foundations of Software Engineering and the 28th International Conference on Software Engineering (ICSE 2006). Professor Osterweil’s research focuses on the definition, analysis, and iterative improvement of processes. He led the project to develop the Little-JIL process definition language. His work has been supported by a variety of sources, most principally by numerous grants from both the National Science Foundation and the Defense Advanced Research Projects Agency. His research career is summarized in the book, The Engineering of Software, published in 2011 by Springer. Clarke's Website: http://laser.cs.umass.edu/people/clarke.html Osterweil's Website: http://laser.cs.umass.edu/people/ljo.html
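The flavor of the analysis can be sketched in a few lines. The toy below (my illustration; the project itself uses the Little-JIL process language and industrial-strength checkers) models a hypothetical medication-administration process as a transition system and breadth-first-searches it for a safety violation, which is the essence of explicit-state model checking.

```python
# A sketch in the spirit of the talk: a tiny (hypothetical) medication
# process as a transition system, exhaustively checked for a safety
# property the way a model checker would.
from collections import deque

# State: (step, order_verified). One transition models a nurse skipping
# verification, the kind of error the analysis should catch.
def successors(state):
    step, verified = state
    if step == "start":
        yield ("order_received", False)
    elif step == "order_received":
        yield ("verified", True)      # correct path
        yield ("administer", False)   # modeled skip-verification error
    elif step == "verified":
        yield ("administer", True)
    elif step == "administer":
        yield ("done", verified)

def check(initial, bad):
    """Breadth-first search of the state space for a property violation."""
    seen, frontier = {initial}, deque([(initial, [initial])])
    while frontier:
        state, path = frontier.popleft()
        if bad(state):
            return path  # counterexample trace
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None

# Safety property: the drug is never administered without a verified order.
trace = check(("start", False), lambda s: s[0] == "administer" and not s[1])
print("counterexample:", trace)
```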
- Towards a Materiality of Information (CS Seminar Lecture Series). Dourish, Paul (2012-03-30). In this talk, I'd like to sketch some very preliminary ideas that I'm beginning to shape into a research program for the next few years. They revolve around the materiality of digital information. In the humanities and social sciences, the last few years have seen a rise in interest in "materiality" -- an examination of the nature and consequences of the material forms of objects of social and cultural import. There are many different things that one might mean when talking of the materiality of digital information -- everything from why iPods have a different cultural cachet than Zunes (the domain of material culture) to how urban landscapes are reshaped by the material constraints of high-capacity network wiring or wireless access patterns (the domain of human geography). At the moment, my particular interest is in the consequences of the fact that information -- which we generally talk about as if it were ineffable and abstract -- is something that we encounter only ever in material form, and that our information practices (the things we know how to do, as information scientists) are inextricably entwined with these material forms, both substrates (media) and representations (conventional patterns). Bio: Paul Dourish is a Professor of Informatics in the Donald Bren School of Information and Computer Sciences at UC Irvine, with courtesy appointments in Computer Science and Anthropology. His research focuses primarily on understanding information technology as a site of social and cultural production; his work combines topics in human-computer interaction, ubiquitous computing, and science and technology studies. He has published over 100 scholarly articles, and was elected to the CHI Academy in 2008 in recognition of his contributions to Human-Computer Interaction. He is the author of two books: "Where the Action Is: The Foundations of Embodied Interaction" (MIT Press, 2001), which explores how phenomenological accounts of action can provide an alternative to traditional cognitive analysis for understanding the embodied experience of interactive and computational systems; and, with Genevieve Bell, "Divining a Digital Future: Mess and Mythology in Ubiquitous Computing" (MIT Press, 2011), which examines the social and cultural aspects of the ubiquitous computing research program.
- Dynamical Processes on Large Networks (CS Seminar Lecture Series). Prakash, B. Aditya (2012-03-23). How do contagions spread in population networks? Which group should we market to, for maximizing product penetration? Will a given YouTube video go viral? Who are the best people to vaccinate? What happens when two products compete? Any insights on these problems, involving dynamical processes on networks, promise great scientific as well as commercial value. In this talk, we present a multi-pronged attack on such research questions, which includes: (a) Theoretical results on the tipping-point behavior of fundamental models; (b) Scalable Algorithms for changing the behavior of these processes, like for immunization, marketing etc.; and (c) Empirical Studies on terabytes of data for developing more realistic information-diffusion models. The problems we focus on are central in surprisingly diverse areas: from cyber-security, epidemiology and public health, viral marketing to spreading of hashtags on Twitter and propagation of memes on blogs. B. Aditya Prakash (http://www.cs.cmu.edu/~badityap) is a Ph.D. student in the Computer Science Department at Carnegie Mellon University. He got his B.Tech (in CS) from the Indian Institute of Technology (IIT) - Bombay. He has published 14 refereed papers in major venues and holds two U.S. patents. His interests include Data Mining, Applied Machine Learning and Databases, with emphasis on large real-world networks and time-series. Some of the inter-disciplinary questions he investigates deal with identifying the precise role of networks in diffusion of contagion (like viruses, products, ideas). The mission of his research is to enable us to understand and eventually influence such processes for our benefit.
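One of the tipping-point results alluded to in (a) can be stated concretely: for an SIS-style ("flu-like") cascade on an arbitrary graph, Prakash and colleagues showed that whether an outbreak dies out is governed by the largest eigenvalue of the adjacency matrix. The sketch below illustrates that check on a made-up random graph and compares it against a quick simulation; the graph and rate parameters are invented for illustration.

```python
# Tipping-point sketch: an SIS epidemic on a graph dies out when the
# effective strength s = lambda_1 * beta / delta is below 1, where
# lambda_1 is the largest adjacency eigenvalue (Prakash et al.).
# The graph and parameters below are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 200
A = (rng.random((n, n)) < 0.03).astype(float)  # random graph
A = np.maximum(A, A.T)                         # symmetrize
np.fill_diagonal(A, 0)

beta, delta = 0.05, 0.4                        # infection / recovery rates
lam1 = np.linalg.eigvalsh(A).max()             # largest adjacency eigenvalue
s = lam1 * beta / delta
print(f"effective strength s = {s:.2f}:",
      "above threshold (epidemic)" if s > 1 else "below threshold (dies out)")

# Quick simulation to compare against the eigenvalue prediction.
infected = rng.random(n) < 0.05
for _ in range(200):
    pressure = A @ infected                    # infected neighbors per node
    catch = rng.random(n) < 1 - (1 - beta) ** pressure
    recover = rng.random(n) < delta
    infected = (infected & ~recover) | (~infected & catch)
print("final infected fraction:", infected.mean())
```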
- Symantec's WINE System for Repeatable, Data-Intensive Experiments in Cyber Security (CS Seminar Lecture Series). Dumitraş, Tudor (2012-03-16).
- Practical Resource Assignment in Dynamic Wireless Networks (CS Seminar Lecture Series). MacKenzie, Allen B. (2012-01-27). Efforts to create modern wireless networks have occasionally suffered from approaches that seek to replace static resource allocation schemes with fully dynamic schemes, failing to adequately account for the benefits associated with stable, predictable resource allocation, such as reduced communication overhead and computational complexity. In this talk, I describe ongoing research on channel assignment in multihop, multitransceiver wireless networks that demonstrates that many of the advantages of dynamic assignment are available via a hybrid approach that builds a static network topology and then enhances it dynamically in response to network traffic. Then, I will briefly describe future work that seeks to apply a broadly similar approach to spectrum assignment. In the first portion of the talk, I describe a proposed channel assignment scheme for cognitive radio networks that balances the need for topology adaptation to maximize flow rate and the need for a stable baseline topology to support network connectivity. We focus on networks in which nodes are equipped with multiple radios or transceivers, each of which can be assigned to a channel. First, we assign channels independently of traffic, to achieve basic network connectivity and support light loads such as control traffic, and, second, we dynamically assign channels to the remaining transceivers in response to traffic demand. We formulate the problem as a two-stage mixed integer linear program (MILP) and show that with this two-stage approach we can achieve performance comparable to a fully dynamic channel assignment scheme while preserving a static, connected topology. I describe ongoing work to implement this strategy via distributed channel assignment algorithms. In the second portion of the talk, I will describe a similar problem faced in the realm of spectrum assignment. Classical, static approaches to spectrum allocation are extremely inefficient, but provide a stable environment for wireless systems. Dynamic spectrum access (DSA) has been a popular research topic in the last five years, but deployment of DSA systems has been slowed by difficult technical challenges at multiple layers of the protocol stack and delayed adoption by spectrum regulators. I will briefly describe future research which will investigate hybrid approaches with the potential to offer both stability and improved efficiency. Bio: Allen B. MacKenzie received his bachelor's degree in Electrical Engineering and Mathematics from Vanderbilt University in 1999. In 2003 he earned his Ph.D. in electrical engineering at Cornell University and joined the faculty of the Bradley Department of Electrical and Computer Engineering at Virginia Tech, where he is now an associate professor. Prof. MacKenzie's research focuses on wireless communications systems and networks. His current research interests include cognitive radio and cognitive network algorithms, architectures, and protocols and the analysis of such systems and networks using game theory. His past and current research sponsors include the National Science Foundation, the Defense Advanced Research Projects Agency, and the National Institute of Justice. Prof. MacKenzie is an associate editor of the IEEE Transactions on Communications and the IEEE Transactions on Mobile Computing.
He also serves on the technical program committee of several international conferences in the areas of communications and networking, and is a regular reviewer for journals in these areas. Prof. MacKenzie is a senior member of the IEEE and a member of the ASEE and the ACM. In 2006, he received the Dean's Award for Outstanding New Assistant Professor in the College of Engineering at Virginia Tech. He is the author of more than 45 refereed conference and journal papers and the co-author of the book Game Theory for Wireless Engineers.
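A drastically simplified sketch of the two-stage idea (mine, replacing the talk's MILP with a greedy heuristic for illustration; the topology, traffic demands, channel set, and radio counts are all invented): stage one assigns a static common channel for baseline connectivity, and stage two adds traffic-driven channels where transceivers remain free.

```python
# Toy two-stage channel assignment in the spirit of the talk (my
# simplification of the MILP described): stage 1 keeps the topology
# connected; stage 2 adds capacity along high-traffic links.
links = [("A", "B"), ("B", "C"), ("C", "D"), ("A", "C")]
traffic = {("A", "B"): 5.0, ("B", "C"): 1.0, ("C", "D"): 3.0, ("A", "C"): 4.0}
channels = [1, 2, 3]
radios_per_node = 2

# Stage 1: static, traffic-independent. Channel 1 everywhere guarantees
# baseline connectivity and carries control traffic; each node devotes
# one radio to it.
assignment = {link: [channels[0]] for link in links}
used = {node: 1 for link in links for node in link}

# Stage 2: dynamic. Walk links by descending demand and add the
# least-loaded extra channel where both endpoints have a free transceiver.
load = {c: 0.0 for c in channels}
load[channels[0]] = sum(traffic.values())
for link in sorted(links, key=traffic.get, reverse=True):
    u, v = link
    if used[u] < radios_per_node and used[v] < radios_per_node:
        best = min(channels[1:], key=load.get)
        assignment[link].append(best)
        load[best] += traffic[link]
        used[u] += 1
        used[v] += 1

print(assignment)  # baseline channel plus traffic-driven extras per link
```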
- Runtime Systems: Taming the High Performance Computing Beast (CS Seminar Lecture Series). Ribbens, Calvin J. (2012-02-03). High performance computing (HPC) is an area of computer science and engineering that has always evolved rapidly---sometimes leading and sometimes riding succeeding waves of technical innovation. While HPC application developers and users have continued to benefit from the increasing power of these high-end resources, the increasing complexity of HPC execution environments will require more and more reliance on runtime systems. Parallelism, load-balancing, power, fault-tolerance, and hardware heterogeneity are just a few of the emerging dominant issues that require runtime solutions. In this talk I will briefly describe some of the motivations and trends in runtime systems for HPC. I will then describe two recent projects we have worked on at Virginia Tech. The first, ReSHAPE, is a runtime system that allows the number of nodes assigned to a job running on a cluster to be changed at run time. Experimental results from a prototype implementation of ReSHAPE illustrate the potential of "malleable" jobs for improving overall cluster utilization and reducing turn-around time for individual jobs. The second project, Samhita, is a distributed shared memory (DSM) execution environment, which allows programs based on the widely used Pthreads library for shared memory thread parallelism to be easily ported to a distributed memory (cluster) platform. Samhita not only allows a wide range of parallel codes to be ported to a new context, but its design reduces the problem of DSM to a cache management problem, with corresponding opportunities for exploiting locality at runtime. Bio: Cal Ribbens is Associate Professor and Associate Department Head for Undergraduate Studies in the Department of Computer Science at Virginia Tech. He received a B.S. in Mathematics from Calvin College (1981) and a Ph.D. in Computer Sciences from Purdue University (1986). His research interests include parallel computation, numerical algorithms, mathematical software, and tools and environments for high performance computing.
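To see why malleability helps utilization, here is a toy simulation (my illustration, not ReSHAPE's actual scheduling policy; the job names and the perfectly divisible speedup model are hypothetical) in which resizable jobs expand into free nodes and shrink back when new work arrives.

```python
# Toy illustration of "malleable" jobs: when nodes free up, a resizable
# job expands to use them; when a new job arrives, it shrinks back toward
# its minimum. Not ReSHAPE's actual policy; the speedup model is ideal.
class MalleableJob:
    def __init__(self, name, work, min_nodes):
        self.name, self.work, self.min_nodes = name, work, min_nodes
        self.nodes = min_nodes

    def step(self):
        self.work = max(0.0, self.work - self.nodes)  # 1 unit/node/tick

def rebalance(jobs, total_nodes):
    """Give every job its minimum, then spread spare nodes round-robin."""
    if not jobs:
        return
    for j in jobs:
        j.nodes = j.min_nodes
    spare = total_nodes - sum(j.min_nodes for j in jobs)
    for i in range(spare):
        jobs[i % len(jobs)].nodes += 1

cluster, jobs = 8, [MalleableJob("sim", 40, 2)]
rebalance(jobs, cluster)              # alone, "sim" expands to all 8 nodes
for tick in range(10):
    if tick == 2:                     # a second job arrives; both resize
        jobs.append(MalleableJob("render", 20, 2))
        rebalance(jobs, cluster)
    for j in list(jobs):
        j.step()
        if j.work == 0:
            jobs.remove(j)
            rebalance(jobs, cluster)  # finished job's nodes are reclaimed
    print(tick, {j.name: j.nodes for j in jobs})
```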
- Code as a Metaphor for Computational Thinking (CS Seminar Lecture Series). Astrachan, Owen (2012-02-24). From an educational standpoint Computer Science has embraced the phrase 'Computational Thinking' as part of defining what our students should do. The National Academies and the National Research Council call for standards based on Computational Thinking. The National Science Foundation has required that Computational Thinking be addressed in many grants and programs. What is Computational Thinking? It may be that we cannot define it precisely, but just as Supreme Court Justice Potter Stewart said of pornography, we "know it when we see it". In this talk I will use code as a metaphor for explaining efforts to make sure that computational thinking is infusing education in K-12, colleges, and universities. I will talk about the code of software and the code of law-and-protocols and how they can be viewed and used together in courses, programs, and projects both at local and national levels. I will explain using concrete examples and stories why this metaphor can be empowering both to us and to our students. BIO: Owen Astrachan is the Director of Undergraduate Studies in Computer Science and Professor of the Practice at Duke where he has taught for more than twenty years. He taught mathematics and computer science in high school for seven years and earned an AB in mathematics from Dartmouth and MAT, MS, and PhD degrees from Duke. Professor Astrachan builds curricula and approaches to teaching computer science. This includes an NSF-sponsored, apprentice-learning approach between Duke, Appalachian State, and North Carolina Central and an NSF CAREER Award to incorporate Design Patterns in courses. He was involved early in AP Computer Science: as teacher, as member of the development committee, and as the Chief Reader. He is the PI on the CS Principles project to create a broader, more accessible AP course in computer science. In 1995 he received Duke's Robert B. Cox Distinguished Teaching in Science Award, in 1998 he received the Outstanding Instructor Award while on sabbatical at the University of British Columbia, in 2002 he received Duke's Richard K. Lublin award for "ability to engender genuine intellectual excitement, ability to engender curiosity, knowledge of field and ability to communicate that knowledge", and in 2007 he was an inaugural recipient of the NSF/CISE Distinguished Education Fellow award.
- Machine Learning in the Bandit Setting: Algorithms, Evaluation, and Case Studies (CS Seminar Lecture Series). Li, Lihong (2012-02-10). Much of machine-learning research is about discovering patterns---building intelligent agents that learn to predict the future accurately from historical data. While this paradigm has been extremely successful in numerous applications, complex real-world problems such as content recommendation on the Internet often require the agents to learn to act optimally through autonomous interaction with the world they live in, a problem known as reinforcement learning. Using a news recommendation module on Yahoo!'s front page as a running example, the majority of the talk focuses on the special case of contextual bandits that have gained substantial interest recently due to their broad applications. We will highlight a fundamental challenge known as the exploration/exploitation tradeoff, present a few newly developed algorithms with strong theoretical guarantees, and demonstrate their empirical effectiveness for personalizing content recommendation at Yahoo!. At the end of the talk, we will also summarize (briefly) our earlier work on provably data-efficient algorithms for more general reinforcement-learning problems modeled as Markov decision processes. Bio: Lihong Li is a Research Scientist in the Machine Learning group at Yahoo! Research. He obtained a PhD degree in Computer Science from Rutgers University, advised by Michael Littman. Before that, he obtained a MSc degree from the University of Alberta, advised by Vadim Bulitko and Russell Greiner, and a BE from Tsinghua University. In the summers of 2006-2008, he enjoyed interning at Google, Yahoo! Research, and AT&T Shannon Labs, respectively. His main research interests are in machine learning with interaction, including reinforcement learning, multi-armed bandits, online learning, active learning, and their numerous applications on the Internet. He is the winner of an ICML'08 Best Student Paper Award, a WSDM'11 Best Paper Award, and an AISTATS'11 Notable Paper Award.
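The best-known algorithm from this line of work, LinUCB (Li et al., WWW 2010), fits in a few lines: each article (arm) gets a linear reward model, and the agent picks the arm whose estimated click rate plus a confidence bonus is highest, which balances exploration and exploitation. The sketch below runs it against a simulated click model, since the real Yahoo! data is of course not reproduced here.

```python
# A compact sketch of LinUCB, the contextual-bandit algorithm behind the
# Yahoo! front-page example in this talk. The simulated articles and
# logistic click model are invented for illustration.
import numpy as np

class LinUCB:
    def __init__(self, n_arms, d, alpha=1.0):
        self.alpha = alpha
        self.A = [np.eye(d) for _ in range(n_arms)]   # per-arm ridge Gram
        self.b = [np.zeros(d) for _ in range(n_arms)]

    def choose(self, x):
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b
            # mean estimate + confidence bonus: explore uncertain arms
            scores.append(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))
        return int(np.argmax(scores))

    def update(self, arm, x, reward):
        self.A[arm] += np.outer(x, x)
        self.b[arm] += reward * x

rng = np.random.default_rng(1)
d, n_arms = 5, 4
true_theta = rng.normal(size=(n_arms, d))             # hidden click models
bandit, clicks = LinUCB(n_arms, d, alpha=0.5), 0
for t in range(2000):
    x = rng.normal(size=d)                            # user/context features
    arm = bandit.choose(x)
    reward = float(rng.random() < 1 / (1 + np.exp(-true_theta[arm] @ x)))
    bandit.update(arm, x, reward)
    clicks += reward
print("click-through rate:", clicks / 2000)
```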
- Computational Challenges in Space Research (CS Seminar Lecture Series). Baker, Joseph B. H. (2012-04-20). The Center for Space Science and Engineering Research (Space@VT) in the College of Engineering is a relatively new center, initiated in summer 2007. At present, Space@VT comprises twelve faculty members in the ECE and AOE departments and approximately 30 graduate students and postdoctoral associates. Space@VT research and education activities are focused on developing improved scientific understanding of the near-Earth space environment and expanding its technological exploitation for societal needs. In this presentation I will provide an overview of Space@VT research activities with a particular emphasis on those aspects that touch on computational issues. The intent is to expand the conversation with CS faculty beyond the ad-hoc collaborations that are currently ongoing and, hopefully, generate new collaborations. Some of the themes that will be covered in the presentation include: (1) data mining the archive of space physics datasets for enhanced scientific productivity, (2) the necessity for development of new compression algorithms for data downlinks and attitude control on small university-built satellites (i.e. CubeSats), and (3) high performance numerical simulations of the near-Earth space plasma environment. Bio: Joseph Baker is an Assistant Professor in the Bradley Department of Electrical and Computer Engineering at Virginia Tech, and a member of the Center for Space Science and Engineering Research (Space@VT). Dr. Baker's current research uses data from the Super Dual Auroral Radar Network (SuperDARN) in conjunction with other ground- and space-based datasets to investigate electromagnetic coupling in the near-Earth space environment between the solar wind, the magnetosphere, and the ionosphere (or "space weather"). Prior to joining Virginia Tech in 2008, Dr. Baker was a Senior Staff Scientist at the Johns Hopkins University Applied Physics Laboratory. He received his Ph.D. in Atmospheric and Space Sciences from the University of Michigan in 2001, and his B.Sc. in Physics from the University of New England (Australia) in 1994. In 2011, Dr. Baker was named the Steven O. Lane Junior Faculty Fellow by the Virginia Tech Board of Visitors and an Outstanding New Assistant Professor in the College of Engineering. Dr. Baker is a member of the American Geophysical Union (AGU) and serves on its Education Award Committee. He is also a 2012 NSF CAREER award recipient.
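Theme (2) can be illustrated with a toy: delta-plus-run-length coding of a slowly varying telemetry stream, the simplest instance of the downlink compression problem described. The data and format below are invented; actual flight software would more likely use a standard such as CCSDS lossless compression.

```python
# Toy downlink compression for a slowly varying sensor stream: delta-encode
# the samples, then run-length encode the small residuals. Data and format
# are invented for illustration.
def delta_rle_encode(samples):
    deltas = [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]
    out, i = [], 0
    while i < len(deltas):
        j = i
        while j < len(deltas) and deltas[j] == deltas[i]:
            j += 1
        out.append((deltas[i], j - i))   # (delta value, run length)
        i = j
    return out

def delta_rle_decode(pairs):
    deltas = [v for v, n in pairs for _ in range(n)]
    samples, total = [], 0
    for d in deltas:
        total += d
        samples.append(total)
    return samples

telemetry = [100, 100, 100, 101, 101, 101, 101, 102, 102, 102]
coded = delta_rle_encode(telemetry)
assert delta_rle_decode(coded) == telemetry
print(coded)  # [(100, 1), (0, 2), (1, 1), (0, 3), (1, 1), (0, 2)]
```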