Commit a bunch of talks.

This commit is contained in:
Marc Burns 2012-02-13 00:35:12 -05:00
parent c4bdee913b
commit 488934d515
1 changed file with 271 additions and 4 deletions


@@ -11,12 +11,279 @@
CD or DVD should you so choose.
<ul class="media">
<!-- encode failed???
<mediaitem title="How Browsers Work">
<abstract>
<p>
Veteran Mozilla engineer Ehsan Akhgari presents a talk on the internals of web browsers.
The material ranges from the fundamentals of content rendering to the latest innovations in browser design.
</p>
<p>
Web browsers have evolved. From their humble beginnings as simple HTML
rendering engines, they have grown into rich application
platforms. This talk will start with the fundamentals: how a browser
creates an on-screen representation of the resources downloaded from
the network. (Boring, right? But we have to start somewhere.) From
there we'll get into the really exciting stuff: the latest innovations
in Web browsers and how those innovations enable — even encourage —
developers to build more complex applications than ever before. You'll
see real-world examples of people building technologies on top of
these "simple rendering engines" that seemed impossible a short time
ago.
</p>
</abstract>
<presentor>Ehsan Akhgari</presentor>
<thumbnail file="how-browsers-work-thumb-small.jpg" />
<mediafile file="how-browsers-work.mp4" type="Talk (x264)" />
<mediafile file="how-browsers-work.mpg" type="Talk (MPG)" />
<flvfile file="how-browsers-work.flv" />
<other>
<p>
Ehsan Akhgari has contributed to the Mozilla project for more than 5
years. He has worked on various parts of Firefox, including the user
interface and the rendering engine. He originally implemented Private
Browsing in Firefox. Right now he's focusing on the editor component
in the Firefox engine.
</p>
</other>
</mediaitem>
-->
<mediaitem title="General Purpose Computing on Graphics Cards">
<abstract>
<p>
GPGPU (general purpose graphics processing unit) computing is an
expanding area of interest, with applications in physics, chemistry,
applied math, finance, and other fields. nVidia has created an
architecture named CUDA to allow programmers to use graphics cards
without having to write PTX assembly or understand OpenGL. CUDA is
designed to allow for high-performance parallel computation controlled
from the CPU while granting the user fine control over the behaviour
and performance of the device.
</p>
<p>
In this talk, I'll discuss the basics of nVidia's CUDA architecture
(with most emphasis on the CUDA C extensions), the GPGPU programming
environment, optimizing code written for the graphics card, algorithms
with noteworthy performance on GPU, libraries and tools available to
the GPGPU programmer, and some applications to condensed matter
physics. No physics background required!
</p>
</abstract>
<presentor>Katie Hyatt</presentor>
<thumbnail file="kshyatt-gpgpu-thumb-small.jpg" />
<mediafile file="kshyatt-gpgpu.mp4" type="Talk (x264)" />
<mediafile file="kshyatt-gpgpu-480p.mp4" type="Talk (x246 480p)" />
<flvfile file="kshyatt-gpgpu-480p.mp4" />
</mediaitem>
<!-- encoding problems
<mediaitem title="Analysis of randomized algorithms via the probabilistic method">
<abstract>
<p>
The probabilistic method is an extremely powerful tool in combinatorics that can be
used to prove many surprising results. The idea is the following: to prove that an
object with a certain property exists, we define a distribution of possible objects
and show that, among objects in the distribution, the property holds with
non-zero probability. The key is that by using the tools and techniques of
probability theory, we can vastly simplify proofs that would otherwise require very
complicated combinatorial arguments.
</p>
<p>
As a technique, the probabilistic method developed rapidly during the latter half of
the 20th century due to the efforts of mathematicians like Paul Erdős and increasing
interest in the role of randomness in theoretical computer science. In essence, the
probabilistic method allows us to determine how good a randomized algorithm's output
is likely to be. Possible applications range from graph property testing to
computational geometry, circuit complexity theory, game theory, and even statistical
physics.
</p>
<p>
In this talk, we will give a few examples that illustrate the basic method and show
how it can be used to prove the existence of objects with desirable combinatorial
properties as well as produce them in expected polynomial time via randomized
algorithms. Our main goal will be to present a very slick proof from 1995 due to
Spencer on the performance of a randomized greedy algorithm for a set-packing
problem. Spencer, for seemingly no reason, introduces a time variable into his
greedy algorithm and treats set-packing as a Poisson process. Then, like magic,
he is able to show that his greedy algorithm is very likely to produce a good
result using basic properties of expected value.
</p>
<p>
Properties of Poisson and Binomial distributions will be applied, but I'll remind
everyone of the needed background for the benefit of those who might be a bit rusty.
Stat 230 will be more than enough. Big O notation will be used, but not excessively.
</p>
</abstract>
<presentor>Elyot Grant</presentor>
<thumbnail
<mediafile
<flvfile
</mediaitem>
not finished encoding
<mediaitem title="Machine learning vs human learning - will scientists become obsolete?">
<abstract><p></p></abstract>
<presentor>Dr. Shai Ben-David</presentor>
<thumbnail
<mediafile
<flvfile
</mediaitem>
-->
<mediaitem title="How to build a brain: From single neurons to cognition">
<abstract>
<p>
Theoretical neuroscience is a new discipline focused on constructing
mathematical models of brain function. It has made significant
headway in understanding aspects of the neural code. However,
past work has largely focused on small numbers of neurons, and
so the underlying representations are often simple. In this
talk I demonstrate how the ideas underlying these simple forms of
representation can underwrite a representational hierarchy that
scales to support sophisticated, structure-sensitive
representations. I will present a general architecture, the semantic
pointer architecture (SPA), which is built on this hierarchy
and allows the manipulation, processing, and learning of structured
representations in neurally realistic models. I demonstrate the
architecture on Raven's Progressive Matrices (RPM), a test of
general fluid intelligence.
</p>
</abstract>
<presentor>Dr. Chris Eliasmith</presentor>
<thumbnail file="how-to-build-a-brain-thumb-small.jpg" />
<mediafile file="how-to-build-a-brain.mp4" type="Talk (x264)" />
<mediafile file="how-to-build-a-brain.mpg" type="Talk (MPG)" />
<flvfile file="how-to-build-a-brain.flv" />
</mediaitem>
<mediaitem title="BareMetal OS">
<abstract>
<p>
BareMetal is a new 64-bit OS for x86-64 based computers. The OS is written entirely in Assembly, while applications can be written in Assembly or C/C++. High Performance Computing is the main target application.
</p>
</abstract>
<presentor>Ian Seyler, Return to Infinity</presentor>
<thumbnail file="bare-metal-os-thumb-small.jpg" />
<mediafile file="bare-metal-os.mp4" type="Talk (x264)" />
<mediafile file="bare-metal-os.mpg" type="Talk (MPG)" />
<flvfile file="bare-metal-os.flv" />
</mediaitem>
<mediaitem title="A Brief Introduction to Video Encoding">
<abstract>
<p>
In this talk, I will go over the concepts used in video encoding (such as motion estimation/compensation, inter- and intra-frame prediction, quantization, and entropy encoding), and then demonstrate these concepts and algorithms in use in the MPEG-2 and H.264 video codecs. In addition, some clever optimization tricks using SIMD/vectorization will be covered, time permitting.
</p>
<p>
With the recent introduction of digital TV and the widespread success
of video sharing websites such as YouTube, it is clear that the task
of lossily compressing video with good quality has become important.
Similarly, the complex algorithms involved require heavy optimization
in order to run fast, another important requirement for any video codec
that aims to be widely used and adopted.
</p>
</abstract>
<presentor>Peter Barfuss</presentor>
<thumbnail file="pbarfuss-video-encoding-thumb-small.jpg" />
<mediafile file="pbarfuss-video-encoding.mp4" type="Talk (x264)" />
<mediafile file="pbarfuss-video-encoding.mpg" type="Talk (MPG)" />
<flvfile file="pbarfuss-video-encoding.flv" />
</mediaitem>
<mediaitem title="Cooking for Geeks">
<abstract>
<p>
The CSC is happy to be hosting Jeff Potter, author of "Cooking for Geeks", for a presentation on the finer arts of food science.
Jeff's book has been featured on NPR and the BBC, and his presentations have wowed audiences of hackers &amp; foodies alike.
We're delighted to have Jeff joining us for a hands-on demonstration.
</p>
<p>
But you don't have to take our word for it... here's what Jeff has to say:
</p>
<p>
Hi! I'm Jeff Potter, author of Cooking for Geeks (O'Reilly Media, 2010), and I'm doing a "D.I.Y. Book Tour" to talk
about my just-released book. I'll talk about the food science behind what makes things yummy, giving you a quick
primer on how to go into the kitchen and have a fun time turning out a good meal.
Depending upon the space, I'll also bring along some equipment or food that we can experiment with, and give you a chance to play with stuff and pester me with questions.
</p>
</abstract>
<presentor>Jeff Potter</presentor>
<thumbnail file="cooking-for-geeks-thumb-small.jpg" />
<mediafile file="cooking-for-geeks.mp4" type="Talk (x264)" />
<mediafile file="cooking-for-geeks.mpg" type="Talk (MPG)" />
<flvfile file="cooking-for-geeks.flv" />
</mediaitem>
<mediaitem title="The Art of the Propagator">
<abstract>
<p>
We develop a programming model built on the idea that the basic computational elements are autonomous machines interconnected by shared cells through which they communicate. Each machine continuously examines the cells it is interested in, and adds information to some based on deductions it can make from information from the others. This model makes it easy to smoothly combine expression-oriented and constraint-based programming; it also easily accommodates implicit incremental distributed search in ordinary programs.
This work builds on the original research of Guy Lewis Steele Jr. and was developed more recently with the help of Chris Hanson.
</p>
</abstract>
<presentor>Gerald Jay Sussman</presentor>
<thumbnail file="sussman-propagator-thumb-small.jpg" />
<mediafile file="sussman-propagator.mkv" type="Talk (MKV)" />
<mediafile file="sussman-propagator-slides.pdf" type="Slides (PDF)" />
</mediaitem>
<mediaitem title="Why Programming is a Good Medium for Expressing Poorly Understood and Sloppily Formulated Ideas">
<abstract>
<p>
I have stolen my title from the title of a paper given by Marvin Minsky in the 1960s, because it most effectively expresses what I will try to convey in this talk.
</p>
<p>
We have been programming universal computers for about 50 years. Programming provides us with new tools to express ourselves. We now have intellectual tools to describe "how to" as well as "what is". This is a profound transformation: it is a revolution in the way we think and in the way we express what we think.
</p>
<p>
For example, one often hears a student or teacher complain that the student knows the "theory" of some subject but cannot effectively solve problems. We should not be surprised: the student has no formal way to learn technique. We expect the student to learn to solve problems by an inefficient process: the student watches the teacher solve a few problems, hoping to abstract the general procedures from the teacher's behavior on particular examples. The student is never given any instructions on how to abstract from examples, nor is the student given any language for expressing what has been learned. It is hard to learn what one cannot express. But now we can express it!
</p>
<p>
Expressing methodology in a computer language forces it to be unambiguous and computationally effective. The task of formulating a method as a computer-executable program and debugging that program is a powerful exercise in the learning process. The programmer expresses his/her poorly understood or sloppily formulated idea in a precise way, so that it becomes clear what is poorly understood or sloppily formulated. Also, once formalized procedurally, a mathematical idea becomes a tool that can be used directly to compute results.
</p>
<p>
I will defend this viewpoint with examples and demonstrations from electrical engineering and from classical mechanics.
</p>
</abstract>
<presentor>Gerald Jay Sussman</presentor>
<thumbnail file="sussman-why-programming-thumb-small.jpg" />
<mediafile file="sussman-why-programming.mkv" type="Talk (MKV)" />
<mediafile file="sussman-why-programming-slides.pdf" type="Slides (PDF)" />
</mediaitem>
<mediaitem title="A brief history of CS curriculum at UW">
<abstract><p>
Prabhakar Ragde presents a brief history of the CS curriculum at the University of Waterloo.
His talk explains major changes in the CS curriculum from 1970 to 2008.
</p>
<p>
I'll survey the evolution of our computer science curriculum over the
past thirty-five years to try to convey the reasons (not always entirely
rational) behind our current mix of courses and their division into core
and optional. After some remarks about constraints and opportunities in
the near future, I'll open the floor to discussion, and hope to hear
some candid comments about the state of CS at UW and how it might be
improved.
</p>
<p>
About the speaker:
</p>
<p>
Prabhakar Ragde is a Professor in the School of Computer Science at UW.
He was Associate Chair for Curricula during the period that saw the
creation of the Bioinformatics and Software Engineering programs, the
creation of the BCS degree, and the strengthening of the BMath/CS degree.
</p>
</abstract>
<presentor>Prabhakar Ragde</presentor>
<thumbnail file="prabhakar-history-of-uw-cs-thumb-small.jpg" />
<mediafile file="prabhakar-history-of-uw-cs.avi" type="Talk (XviD)" />