Roles and Responsibilities of an Oracle DBA

Posted by Hollywood Updates

Responsibilities of DBA
If you want to become an Oracle DBA, you should first understand what an Oracle DBA's job is. The basic roles of the DBA are fairly consistent among different companies, but these duties might be expanded based on the size of the company and the experience of the DBA. In fact, in many companies the DBA is considered the main resource for DBMS experience and knowledge.

Let’s look at these roles and responsibilities and determine what skills are necessary to fulfill these duties. Here the roles and responsibilities are divided into two categories: basic duties and additional duties. The dividing line between these is not clear; there is significant overlap.
Basic Duties of the DBA

Here are some of the basic roles of the Oracle DBA. This is not an all-inclusive list. Depending on your installation and staff, your duties might not include all of these, or might include many more items. This section is simply intended as a general guide.
1) Installation of new software:

It is primarily the job of the DBA to install new versions of Oracle software, application software, and other software related to DBMS administration. It is important that the DBA or other IS staff members test this new software before it is moved into a production environment.
2) Configuration of hardware and software:

In many cases the system software can only be accessed by the system administrator. In this case, the DBA must work closely with the system administrator to perform software installations and to configure the DBMS.
3)Security administration:

One of the main duties of the DBA is to monitor and administer DBMS security. This involves adding and removing users, administering quotas, auditing, and checking for security problems.
4)Performance Tuning and Monitoring:

The DBA must continually monitor system performance and be prepared to retune the system as necessary. Even a well-tuned system must be constantly monitored and adjusted. Sometimes this involves changing tuning parameters; at other times it involves rebuilding an index or restructuring a table.
5)Backup and recovery:

Perhaps the most important responsibility of the DBA is protecting the data in the system. To do this effectively, you must develop a sound backup and recovery strategy and make sure it is carried out. A DBA's chief responsibility is to maintain the integrity of the database. It is important that the backup and recovery process be periodically tested.
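To make the idea of a verifiable backup concrete, here is a minimal Python sketch of a backup step that checks its own copy with a checksum. It is an illustration only, not an Oracle utility; the file layout and the checksum-based verification are assumptions made for the example (a real Oracle DBA would use tools such as RMAN and actual restore tests).

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute a SHA-256 checksum so the copy can be compared with the source."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_and_verify(source: Path, backup_dir: Path) -> bool:
    """Copy a file into the backup directory and confirm the copy is intact."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    target = backup_dir / source.name
    shutil.copy2(source, target)
    # A backup is only useful if it can be restored; comparing checksums
    # is a minimal stand-in for a periodic restore test.
    return sha256_of(source) == sha256_of(target)
```

A checksum match only shows that the copy is intact; periodically restoring the backup into a scratch environment remains the real test.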
6)Routine scheduled maintenance:

It is the job of the DBA to schedule routine DBMS maintenance and carry it out. This maintenance is regularly carried out in the early hours of the morning or on weekends, when it causes the least inconvenience to the user community.
Additional Duties of the DBA

Some of the more advanced duties of the Oracle DBA might include the following:

1)Data analysis:

The DBA will frequently be called on to analyze the data stored in the database and to make recommendations relating to the performance and efficiency of that data storage. This might relate to more effective use of indexes or to the use of some feature such as the Parallel Query option.
2)Database design:

The DBA is often involved at the preliminary database-design stages. Through the involvement of the DBA, many problems that might otherwise arise can be eliminated. The DBA knows the DBMS and system, can point out potential problems, and can help the development team with special performance considerations.
3)Data modeling and optimization:

By modeling the data, it is possible to optimize the system layout to take best advantage of your I/O subsystem.
4)Assisting developers with SQL and stored procedure development:

The DBA should be prepared to be a resource for developers and users. The DBA is often called on to help with SQL problems as well as to design and write stored procedures.
5)Enterprise standards and naming conventions:

Because many different groups might perform different roles in developing and deploying applications, it is often the DBA who is called on to help define enterprise standards and naming conventions, as well as to ensure that new applications conform to these standards.
6)Development of production migration procedures:

Because the DBA is responsible for the availability and reliability of the DBMS and the applications using that DBMS, it is up to the DBA to develop and maintain procedures for rolling out new applications and DBMS software. This involves evaluating new software or patches as well as testing them. It is up to the DBA to guarantee the stability and robustness of the system.
7)Evaluation of new software:

The DBA might be called on to evaluate new software and make recommendations based on that evaluation. This might be related to a software purchase or the rollout of a new version of software. This evaluation must be done in the context of the stability of the system. It is your responsibility to maintain system stability and reliability.

Introduction


Computer! An amazing machine! We are living in the computer age today, and most of our day-to-day activities cannot be accomplished without using computers. Sometimes knowingly and sometimes unknowingly we use computers. Whether we have to withdraw money from the ATM (Automated Teller Machine, jokingly expanded as Any Time Money), publish a newsletter, drive a motorbike, design a building or even a new dress, or go to a grocery shop and buy anything from cookies to tyres for our car: all involve computers in one way or the other.
We are living in the computer age, and gradually the computer has become such a dire necessity of life that it is difficult to imagine life without it.
The computer is affecting every sphere of our life. Be it government, business, education, legal practice, entertainment, defense or home, the computer has become an indispensable and multipurpose tool.

Need for Computer Literacy


Computers have shaken up the world. They have made us dependent upon them. We expect them to be present at every place: be it the reservation counter, microwave cooking or even driving a car. Now that computers have moved into our society so rapidly, one needs at least the basic computer skills to pursue one's career goals and to function effectively and efficiently. We can say that computer literacy is the need of today and the voice of tomorrow if one is to survive in the fast-changing world of computers.

For most people, computer literacy is restricted to using the keyboard for typing a document or making use of the computer for calculations. But this is not enough. One must know the fundamental concepts of what computers are made of and how they work. Take an example: you drive a motorbike and you take it for servicing. One way is to tell the service boy to service the motorbike and that you will collect it in the evening. The other way is to tell him that, apart from the servicing, he should change the engine oil and clean the carburetor too. This way the serviceman will be more careful while doing the job, and you too will be able to solve at least some of the small problems when a mechanic is not available. Lack of such knowledge can cause mistakes while using the computer as well. Sometimes this lack of knowledge causes a fear of computers in people. This is termed CYBERPHOBIA.

Computer Definition



For most people, a computer is a machine used for calculations or computation, but actually it is much more than that. Precisely, "A computer is a device for performing arithmetic and logical operations", or "A computer is a device or flexible machine to process data and convert it into information".
In the above two definitions, three terms are tricky and need some explanation. Let us start with logical operations: these are the type of operations in which decisions are involved. Data is simply a raw fact or figure collected, whereas information is what one gets after processing the data. Information is always useful and meaningful for the user. Let us consider an example in which the marks of various subjects are collected for a particular group of students. These marks independently (data) are of no use as such for the class teacher, but once she or he adds the marks of all the students and calculates their respective percentages, this becomes information, and it serves her or him in answering queries like: Who stood first in the class? How many students have got distinctions? What is the overall performance of the class? Finally, "A program is a set of instructions telling the computer what to do."
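The marks example can be sketched in Python. The student names, the marks, and the 75% distinction threshold are invented for illustration; the point is that the raw marks are data, while the derived percentages, topper and class average are information.

```python
# Marks (raw data) for three hypothetical students across three subjects.
marks = {
    "Asha":  [72, 88, 91],
    "Bilal": [45, 60, 52],
    "Chen":  [80, 75, 79],
}

def percentage(scores, max_per_subject=100):
    """Turn a list of raw marks into a percentage of the maximum possible."""
    return 100 * sum(scores) / (max_per_subject * len(scores))

# Processing the raw marks turns them into information the teacher can use.
percentages = {name: percentage(s) for name, s in marks.items()}
topper = max(percentages, key=percentages.get)
distinctions = [n for n, p in percentages.items() if p >= 75]  # threshold assumed
class_average = sum(percentages.values()) / len(percentages)
```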

Basic Anatomy of Computers


Broadly, a computer can be said to be made up of hardware and software. The computer hardware (the actual machine) is designed in such a way that it does whatever the software (computer programs) tells it to do.
There are 4 basic operations which a computer performs, irrespective of the program which is running on it. They are classified as:

1- Input:-

This is for the purpose of inserting or feeding data into the computer by means of an input device like a keyboard.

2- Processing:-

Some kind of processing is done in the computer to transform the data in some way.

3- Output:-

The computer produces output on a device, such as a printer or monitor, that shows the result of the processing operation.

4- Storage:-

The computer stores the results of processing operations for future use on a storage device like a hard disk or floppy disk.
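The four basic operations can be sketched together in a few lines of Python. The squaring step and the in-memory list standing in for a disk are assumptions made purely for illustration:

```python
storage = []  # stands in for a disk: results kept for future use

def compute(raw_values):
    """Input -> processing -> output, with the result also stored.

    The raw_values parameter stands in for keyboard input, and the
    returned value stands in for output on a monitor or printer.
    """
    processed = [v * v for v in raw_values]   # processing: transform the data
    storage.append(processed)                 # storage: keep for future use
    return processed                          # output: hand the result back
```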


The Computer Generation


In recent years, the computer industry has grown at a phenomenal pace. In a short span of 35 years or so, computers have improved tremendously. In the last decade the speed of computers has increased, and the cost per unit of calculation has gone down by 500 times. The storage capacity is increasing so fast that now it seems nothing is impossible to store: large amounts of data can be stored in very small devices.



First Generation of Computers (1942-1955):


Until 1951, electronic computers were the exclusive possession of scientists and the military; until then nobody had tried to use them for business purposes. The idea of marketing them was conceived by Mauchly and Eckert, creators of the ENIAC. As the US Census Bureau was already using punched cards, it became the pioneer in the US by buying this computer for the first time in 1951. The company created by Mauchly and Eckert became the UNIVAC division of Sperry Rand Corporation (later known as UNISYS).


Computers belonging to this generation had the following characteristics:

1. Comparatively large in size as compared to present day computers.
2. Generated a lot of heat; they were not consistent and reliable, as the valves tended to fail frequently.
3. Low capacity internal storage.
4. Individual, non-related models.
5. Processors operated in the milliseconds speed range.
6. Internal storage consisted of magnetic drums and delay lines.






Second Generation (1955-1964):

First generation computers were very unreliable, mainly because the vacuum tubes kept burning out. Users had to be prepared all the time with dozens of extra tubes to replace them. The computers of this generation were characterized by the use of solid state devices (transistors) instead of vacuum tubes. Transistorized circuits were smaller, generated little heat, were less expensive and consumed less power than vacuum tube circuits, and were much greater in processing capacity.

Since transistors had a faster switching action, this generation of computers was significantly faster than the first. The use of magnetic cores as the primary internal storage medium and the introduction of the removable magnetic disc pack were other major developments of the second generation, although magnetic tapes were still commonly used. These computers had built-in error detecting devices, and more efficient means were developed to input data into and retrieve it from the computer.



Some of the popular models in this generation of computer systems were the IBM-1401, IBM-1620, BURROUGHS B-200 SERIES and HONEYWELL H-400. These computers were used for business applications.



Third Generation of Computers (1964-1975)

A revolution in computer development took place with the development of the integrated circuit (IC) on a single silicon chip. In 1958, Jack St. Clair Kilby and Robert Noyce invented the first IC. An IC incorporates a number of transistors and electronic circuits on a single wafer or chip of silicon. ICs are called chips because of the way they are made. They are also called semiconductors, as they combine layers of materials that have varying capacities to conduct electricity.

This ushered in the third generation of computer systems in 1964. Integrated circuits considerably enhanced processing capability. The technique of placing 12 or more logic gates on a single chip developed into a well-defined technology and was refined to the point where hundreds or more gates could be placed on a chip of silicon and incorporated as functional logic blocks in an overall system.


Computers of this generation have the following characteristics:


1. Smaller in size as compared to second generation computers.
2. Higher capacity internal storage.
3. Remote communication facilities.
4. Multiprogramming facilities.
5. Reduced cost of access storage.
6. Processors, which operate in nanosecond speed range.
7. Use of high level languages such as COBOL.
8. Wide range of optional peripherals.




Fourth Generation of Computers (1975-1989)


The 1970s marked the beginning of a new generation of computers, produced by computer giants like IBM, ICL, NCR and Burroughs. From the design viewpoint, the new generation provided increased input-output capability, longer component life and greater system reliability. From the functional viewpoint, powerful new languages were developed to broaden the use of multiprogramming and multiprocessing, and there was a major shift from batch processing to online, remote, interactive processing.


The development of the microprocessor chip, which contains an entire Central Processing Unit (CPU) on a single silicon chip, led to the mushroom growth of inexpensive computers. Microprocessors are not computers by themselves, but they can perform all the functions of the arithmetic logic unit and control unit of a CPU. When these microprocessors are connected with memory and input-output devices, they become microcomputers.

The use of very large scale integration (VLSI) has made the fourth generation (micro) computers very compact, much less expensive, faster, more reliable and of much greater data processing capacity than equivalent third generation computers.

Some computers belonging to the fourth generation are the DEC-10, STAR-1000, PDP-11 and APPLE series personal computers.




Fifth Generation Computers (1989-Present)

Until the fourth generation of computers, the major stress was on improving the hardware, from valves to transistors and then to integrated circuits, which resulted in miniaturization and faster speeds. However, the lack of thinking power has forced scientists to work further toward fifth generation computers.

The concept of "Artificial Intelligence" is being used in these computers, and the Japanese call them "Knowledge Processors". Automatic programming, computational logic, pattern recognition and control of robots, processes which need skill and intelligence, are examples of Artificial Intelligence. These computers, when developed, will be able to execute billions of instructions per second and will have unimaginable storage capacities. The present day high level languages will become obsolete on these machines, and new computer languages and related software will be needed.

Computers of this generation have the following characteristics:

1. Easy-to-use computers with high intelligence and natural human input and output mechanisms;

2. Reliable and efficient software development through new languages, new computer architectures and systems software which overcome previous problems;

3. Improved overall functions and performance, aimed at making computers smaller, lighter, faster, of greater capacity, and more flexible and reliable.

Processor and Memory


Introduction-

The processing unit in a computer interprets the instructions given in a program and carries them out. Processors are designed to interpret a specified number of instruction codes. Each instruction code is a string of binary digits. All processors have input/output instructions, arithmetic instructions, and instructions to manipulate characters. The number and type of instructions available differ from processor to processor.

A memory or store is required in a computer to hold programs and the data processed by programs. A memory is made up of a large number of cells, with each cell capable of storing one bit. The cells may be organized as a set of addressable words, each word storing a sequence of bits and accessed by giving the address of the word. This organization, called a Random Access Memory (RAM), is used as the main memory of computers. Another organization arranges cells in a linear sequence to form a serial access memory.

The Central Processing unit


The Central Processing Unit is the brain of the computer system. The input and output devices may vary for different applications, but there is only one CPU for a particular computer. The specifications of a computer are basically characterized by its Central Processing Unit.

The CPU processes the data it receives as input (either through input devices or through the memory). As mentioned earlier, the CPU receives the data in the form of binary bits, which it can understand.

1. The CPU can perform arithmetic calculations such as addition, subtraction, etc.
2. The CPU can perform logical decisions.
3. The CPU, with the help of other devices, can perform data transmission.
4. The CPU can perform manipulation tasks such as word processing.
5. After performing the required task, the CPU may place results in memory or send results to the output device, according to the instructions given to it.



The central processing unit can be further divided into:


1. Arithmetic Logic Unit(ALU)-

As the name may indicate the arithmetic logic unit performs all arithmetic and logic calculations on the data it receives.


2. Arithmetic Calculations-

The arithmetic calculations may be addition, subtraction, multiplication, division, exponentiation etc.

3. Logical Calculation-

Logical calculations are basically decision-making statements. For example, A>B decides whether A is greater than B or not; if A is greater than B, the statement is true and a logical '1' would be generated, otherwise a logical '0' would be generated. Some logical decisions determine the further routing of the program.
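This logical calculation can be shown in a couple of lines of Python, together with a hypothetical routing function illustrating how such a decision can steer a program:

```python
def greater(a, b):
    # The comparison a > b is a logical operation: it yields a truth
    # value, expressed here as logical 1 or logical 0.
    return 1 if a > b else 0

def route(a, b):
    # Logical decisions often control the further routing of a program.
    if greater(a, b):
        return "take the A-branch"
    return "take the B-branch"
```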

Control Unit –

The control unit controls the entire operation of the computer and the CPU. It controls all the other devices connected to the CPU: input devices, output devices, auxiliary memory and so on. Hence, the control unit acts as the nerve centre of the computer.
The control unit, upon receiving an instruction, decides what is to be done with it. That is, whether it is to be sent to the ALU for further processing, or to the output devices, or to the memory, etc. In other words, the control unit coordinates and controls all hardware operations.

The control unit has an electronic clock that transmits electronic pulses at equal intervals of time, and it issues instructions to other devices based upon these pulses. Suppose there are three instructions to be performed, and let the first instruction take three clock pulses to complete; when the fourth clock pulse is received, the control unit starts processing the second instruction, and so on.

Registers –

The CPU consists of a set of registers which are used for various operations during the execution of instructions. The CPU needs registers for storing instructions as well as for storage and manipulation of temporary results. The instruction cycle has three steps:

1. Fetch: To bring the instructions from main memory into the instruction register.

2. Decode: Decoding means interpretation of the instruction, to decide which course of action is to be taken for execution of the instruction and what sequence of control signals must be generated for it.



3. Execute: The instruction is executed after the fetching of operands is complete. The control unit is responsible for sequencing the steps necessary to complete the execution of the instruction.
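The fetch-decode-execute cycle can be illustrated with a toy accumulator machine in Python. The three-opcode instruction set and the tuple format are invented for the sketch; real processors decode binary instruction codes, not tuples:

```python
def run_program(program, memory):
    """Simulate a tiny accumulator machine over a list of (opcode, operand)."""
    acc = 0   # accumulator register for temporary results
    pc = 0    # program counter: address of the next instruction
    while pc < len(program):
        instruction = program[pc]        # fetch the instruction
        opcode, operand = instruction    # decode it into opcode and operand
        pc += 1
        if opcode == "LOAD":             # execute the decoded action
            acc = memory[operand]
        elif opcode == "ADD":
            acc += memory[operand]
        elif opcode == "STORE":
            memory[operand] = acc
        else:
            raise ValueError(f"unknown opcode {opcode!r}")
    return memory
```

For example, a three-instruction program can load a value, add a second, and store the sum back into memory.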

Keyboard Devices

Most input data is entered into the computer by using a keyboard. This input method is similar to typing on a typewriter.

Most typewriter and computer keyboards are QWERTY keyboards. The alphabetic keys are arranged so that the upper-left row of letters begins with the six letters QWERTY. Designers of other keyboards claim that their keyboards are easier to learn than the QWERTY keyboard.

Computer keyboards include keys that are designed to perform specific tasks. These special keys include function keys, directional keys and special-purpose keys such as Alt, Ctrl, Enter, Ins, and Esc. These keys enable the user to perform complex tasks easily when using the application.

Some keyboards have as many as 112 keys, with three new keys designed to simplify working with Windows 98. Two of these keys, next to the Alt key, bring up the Start menu. The third key, next to the right Ctrl key, brings up a menu of functions that are frequently accessed in whichever application is currently being used.


Point and Draw Devices: Many people use pointing devices instead of keyboards whenever possible. Pointing devices minimize the amount of typing (and consequently, the number of errors). The many pointing devices available include the mouse, trackball, light pen, digitizing tablet, touch screen and pen-based systems.




The Mouse and Track Ball: The mouse is a palm-size device with a ball built into the bottom. The mouse is usually connected to the computer by a cable (computer wires are frequently called cables) and may have from one to four buttons (usually two). The mouse may be mechanical or optical and comes in many shapes and sizes. When you move the mouse over a smooth surface, the ball rolls, and the pointer on the display screen moves in the same direction. The Apple Macintosh, with its graphical user interface, made the mouse popular. Now, most microcomputer systems, regardless of the manufacturer, use a mouse. With the mouse, you can draw, select options from a menu, and modify or move text. You issue commands by pointing with the pointer and clicking a button, which makes operating a microcomputer easier for beginning users.



Touchpad:

The touch pad is a stationary pointing device that people find less tiring to use than a mouse or a trackball. The movement of a finger across a small touch surface is translated into cursor movement on the computer screen. The touch-sensitive surface may be just 1.5 to 2 inches square, so the finger does not have to move much. Its size makes it most suitable for notebooks or laptops.

A device released in 1995 enables the user to move the cursor using an infrared pen. The pen is cordless and works when it is as far as fifteen feet from the screen. Although the mouse is still the most popular pointing device, these innovations may change that in the future.



Joysticks:
A joystick is a pointing device often used for playing games. The joystick has a gearshift-like lever that is used to move the pointer on the screen. On most joysticks, a button on the top is used to select options. In industry and manufacturing, joysticks are used to control robots. Flight simulators and other training simulators also use joysticks.


Touch-Sensitive Screens:

Perhaps the easiest way to enter data is with the touch of a finger. Touch screens enable the user to select an option by pressing a specific part of the screen. Touch screens are commonly used in grocery stores, fast-food restaurants and information kiosks.

Data Scanning Devices

Optical Recognition System:

Optical Recognition Systems provide another means of minimizing keyed input by capturing data at the source. These systems enable the computer to "read" data by scanning printed text for recognizable patterns.

The banking industry developed one of the earliest scanning systems in the 1950s for processing cheques. The Magnetic Ink Character Recognition (MICR) system is still used throughout the banking industry. The bank, branch, account number and cheque number are encoded on the cheque before it is sent to the customer. After the customer has used the cheque and it comes back to the bank, all that needs to be entered manually is the amount. MICR has not been adopted by other industries because its character set has only fourteen symbols.



Bar Code Reader:

Of all the scanning devices, you are probably most familiar with BAR CODE READERS. Many retail and grocery stores use some form of bar code reader to determine the item being sold and to retrieve the item price from a computer system. The code reader may be a handheld unit or it may be embedded in a countertop. The bar code reader reads the Universal Product Code (UPC), a pattern of bars printed on merchandise. The UPC gained wide acceptance because of the accuracy and speed of the system. Today, bar codes are used to update inventory and ensure correct pricing. Federal Express employees can usually tell a customer within a matter of minutes the location of any package.
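Part of the UPC's accuracy comes from a built-in check digit that lets the system detect misreads. A small Python sketch of the standard UPC-A check-digit rule (digits in odd positions weighted 3, even positions weighted 1, padded up to a multiple of 10):

```python
def upc_check_digit(first_eleven: str) -> int:
    """Compute the 12th (check) digit of a UPC-A code from the first eleven."""
    digits = [int(c) for c in first_eleven]
    odd = sum(digits[0::2])    # 1st, 3rd, ... digits (0-based even indices)
    even = sum(digits[1::2])   # 2nd, 4th, ... digits
    # The check digit brings the weighted sum up to a multiple of 10.
    return (10 - (3 * odd + even) % 10) % 10

def upc_is_valid(code: str) -> bool:
    """A scanned 12-digit UPC is accepted only if its check digit matches."""
    return len(code) == 12 and upc_check_digit(code[:11]) == int(code[11])
```

A scanner that misreads a single bar almost always produces a code that fails this check, so the register can reject the scan instead of charging the wrong price.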


Optical Mark Reader:

From taking exams, you are familiar with Mark Sense Character Recognition systems. Every time you take a test with a "fill in the bubble" Scantron form and use a #2 lead pencil, you are creating input suitable for an OPTICAL MARK READER (OMR). A #2 lead pencil works best because of the number of magnetic particles in that weight of lead. The OMR senses the magnetized marks, enabling the reader to determine which responses are marked.

Optical Scanners:

Optical scanners can scan typed documents, pictures, graphics or even handwriting into a computer. Photographs scanned into a microcomputer appear clearly on the screen and can be displayed whenever desired. The copy that the computer stores never yellows with age. Early scanners could recognize only text printed in a special OPTICAL CHARACTER RECOGNITION (OCR) typeface. A scanner converts the image that it sees into numeric digits before storing it in the computer. This conversion process is known as DIGITIZING.

Depending on the volume and type of material to be scanned, you can use a drum scanner, flatbed scanner, sheet-fed scanner or even a small handheld scanner. Small handheld and sheet-fed scanners (priced at about $150) are used most frequently with microcomputers; however, only 5 per cent of all microcomputer systems are equipped with scanners. Manufacturers responded to user reluctance by releasing in 1995 a number of new, small paper scanners priced between $500 and $700. Most of these new devices sit between the keyboard and the monitor and can interface with a fax machine, send e-mail, and store documents on disk for archival purposes.

Digitizer:

Digitizer is used to create drawings and pictures using a digitizer tablet, by a process called digitizing. Digitizing is a process by which graphic representations are converted into digital form. A digitizer consists of 3 main parts: a flat surface called the tablet, a small handheld mouse-like device called the puck, and a special pen-like device called the stylus. The puck is used to input existing drawings into the computer. The stylus is used to trace existing drawings placed on the tablet; the user makes contact with the tablet using the stylus. As the stylus is connected to the tablet by a wire, the traced image is stored in RAM and displayed on the monitor.


Electronic Card Reader:

Before discussing electronic card readers, let us discuss electronic credit cards. Electronic credit cards make it possible to charge online payments to one's credit card account. Card details can be encrypted using the SSL protocol in the buyer's computer, which is available in standard browsers. A number of input devices are commonly used in association with cash transactions. The most common are ATMs and POS terminals.

1. ATM:

Automated Teller Machines are interactive input/output devices that enable people to make bank transactions from remote locations. ATMs utilize screen input as well as magnetic card readers.

2. POS:
Point of Sale (POS) terminals are computerized cash registers that also often incorporate touch screen technology and bar-code scanners. These devices allow the input of numerous data, such as item sold, price, method of payment, name or Zip code of the buyer, and so on. Some inputs are automated; others may be entered by the operator.

Vision Input Systems:

Vision input systems are the latest input devices; they can recognize the vision/image which appears in the range of their lens. They seem to be very useful and are becoming popular in different Government departments, like licensing and passport authorities, where personal identification is required.

Printer


Printer is the most important output device, which is used to print information on paper. Printers are essential for getting output of any computer based application.

Types of Printers:

There are many types of printers, which are classified on various criteria. Printers can be broadly categorized into the following two types:


a. Impact Printers:

The printers that print the characters by striking against the ribbon and onto the paper are called Impact Printers. These printers are of two types:


1. Character Printers. 2. Line Printers.




b. Non-Impact Printers:

The printers that print the characters without striking against the ribbon and onto the paper are called Non-Impact Printers. These printers print a complete page at a time, and are therefore also called Page Printers.

1. Ink Jet Printers. 2. Thermal Printers.

Plotter


Plotter is an important output device used to print high quality graphics and drawings. Although graphics can be printed on printers, the resolution of such printing is limited on printers. Plotters are generally used for printing/drawing graphical images such as charts, drawings and maps for engineering and scientific applications.

Some important types of plotters are:

a.) Flat Bed Plotters:
These plotters print the graphical images by moving the pen on stationary flat surface material. They produce very accurate drawings.

b.) Drum Plotters:
These plotters print graphical images by moving both the pen and the drum that carries the paper. They do not produce drawings as accurate as those printed by flat bed plotters.

c.)Inkjet Plotters: These plotters use inkjet in place of pens. They are faster than flat bed plotters and can print multi-colored large drawings.

Overview of Computer Architecture


Your computer system consists of thousands of individual components that work in harmony to process data. Each of these components has its own job to perform, and each has its own performance characteristics.

The brainpower of the system is the Central Processing Unit (CPU), which processes all the calculations and instructions that run on the computer. The job of the rest of the system is to keep the CPU busy with instructions to process. A well-tuned system runs at maximum performance if the CPU or CPUs are busy 100% of the time.

So how does the system keep the CPUs busy? In general, the system consists of different layers, or tiers, of progressively slower components. Because faster components are typically the most expensive, you must perform a balancing act between speed and cost efficiency.

CPU and Cache


The CPU and the CPU’s cache are the fastest components of the system. The cache is high-speed memory used to store recently used data and instructions so that it can provide quick access if this data is used again within a short time. Most CPU hardware designs have a cache built into the CPU chip. This internal cache is known as a Level 1 (or L1) cache. Typically, an L1 cache is quite small (8-16 KB).

When a certain piece of data is wanted, the hardware looks first in the L1 cache. If the data is there, it is processed immediately. If the data is not available in the L1 cache, the hardware looks in the L2 cache, which is external to the CPU chip but located close to it. The L2 cache is connected to the CPU chip(s) on the same side of the memory bus as the CPU. To get to main memory, the data must travel over the memory bus, which slows the speed of the memory access.
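The lookup order described above (L1 first, then L2, then main memory) can be sketched as a toy simulation. The cycle counts are illustrative assumptions, not measurements of any real processor:

```python
# Toy sketch of the cache lookup hierarchy. The latencies (in CPU cycles)
# are assumed, illustrative values, not figures for any particular chip.

L1_LATENCY, L2_LATENCY, MEMORY_LATENCY = 1, 10, 100

def lookup(address, l1, l2, memory):
    """Return (value, cycles spent) for a read of `address`."""
    if address in l1:                  # hit in the on-chip L1 cache
        return l1[address], L1_LATENCY
    if address in l2:                  # hit in the nearby L2 cache
        l1[address] = l2[address]      # promote the data into L1
        return l2[address], L1_LATENCY + L2_LATENCY
    value = memory[address]            # miss: cross the memory bus
    l2[address] = value                # fill both cache levels on the way back
    l1[address] = value
    return value, L1_LATENCY + L2_LATENCY + MEMORY_LATENCY

memory = {0x10: "data"}
l1, l2 = {}, {}
_, cold = lookup(0x10, l1, l2, memory)   # first access must go to memory
_, warm = lookup(0x10, l1, l2, memory)   # repeat access hits L1
print(cold, warm)                        # 111 1
```

The two printed numbers show why reuse matters: the first access pays the full trip over the memory bus, while the repeat access is satisfied by the L1 cache in a single cycle.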


CPU Design


Most instruction processing occurs in the CPU. Although certain intelligent devices, such as disk controllers, can process some instructions, the instructions these devices can handle are limited to the control of data moving to and from the devices. The design of the CPU and its clock speed determine how quickly these instructions are executed.
The CPU usually falls into one of two groups of processors:

Complex Instruction Set Computer (CISC) or
Reduced Instruction Set Computer (RISC).
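The stylistic difference between the two can be sketched with a toy example. The mnemonics below are hypothetical, not a real instruction set: a CISC machine might add two values in memory with a single complex memory-to-memory instruction, while a RISC machine expresses the same work as several simple instructions in which only loads and stores touch memory.

```python
# Hypothetical mini-ISA to contrast instruction styles; not real machine code.

memory = {"a": 2, "b": 3, "dst": 0}
regs = {}

# CISC style: one complex memory-to-memory instruction does all the work.
cisc_program = [("ADDM", "a", "b", "dst")]

# RISC style: simple instructions; only LOAD/STORE access memory,
# and the ADD operates purely on registers.
risc_program = [
    ("LOAD",  "r1", "a"),
    ("LOAD",  "r2", "b"),
    ("ADD",   "r3", "r1", "r2"),
    ("STORE", "r3", "dst"),
]

def run(program):
    for op, *args in program:
        if op == "ADDM":               # complex: read both operands, write result
            x, y, dst = args
            memory[dst] = memory[x] + memory[y]
        elif op == "LOAD":             # simple: memory -> register
            reg, addr = args
            regs[reg] = memory[addr]
        elif op == "ADD":              # simple: register -> register
            dst, x, y = args
            regs[dst] = regs[x] + regs[y]
        elif op == "STORE":            # simple: register -> memory
            reg, addr = args
            memory[addr] = regs[reg]

run(cisc_program)
print(memory["dst"], len(cisc_program))   # 5 1
run(risc_program)
print(memory["dst"], len(risc_program))   # 5 4
```

Both programs compute the same result; the CISC version needs fewer, more complex instructions, while the RISC version needs more, simpler ones, which is the trade-off the two design philosophies make.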

Computer Science Lectures


Greetings everyone! A new month and a new post on free science online. This month I have a bunch of computer science video lectures.

Video lectures include: basics of computation theory, intro to computer science, data structures, compiler optimization, intro to computers and internet, intro to clojure, and some videos from EECS colloquium at Case Western Reserve University.


Higher Computing (University of New South Wales, by Richard Buckland, COMP1917)


Higher Computing Video Lectures


Course description:
This is the introductory course for computer science at UNSW. This course consists of three strands: programming, systems, and general computer-science literacy. The programming strand is further divided into two parts. For the first half of the course we cover small scale programming; in the second half we look at how to effectively use teams to produce more substantial software. In the systems strand we will look at how computers work, concentrating on microprocessors, memory, and machine code. In the literacy strand we will look at topics drawn from: computing history, algorithms, WWW programming, ethics and law, cryptography and security, and other topics of general interest. The strands will be covered in an intermingled fashion.

Course topics:
Higher Computing. Inside a computer. Machine Code. Simple C Program. Clarity (C programming #2). Solving Problems. Side Effects. A simple recursive function. The Amazing Alan Turing. The Turing Test. Frames. Arrays. Pass by reference. Game design. Everything you need to know about pointers. Sudoku solver. Stack Frames. eXtreme Programming. VS programming. Programming in the Large. Stress. Random Numbers. The Trouble with Concrete Types. Abstract Data Types in C. Blackadder and Baldrick. ADT. Steganography (hidden messages). Don't give up. File I/O. Linked lists. Experimenting with CMOS. Complexity & Trees. Errors, Risks, Snarks, Boojums. Taste of Graphics. Sample Tree Code: loop detection. Ethics. Hamming Error Correcting Code. Professionalism. What makes a good programmer? Learning and Teaching Computing. Coding samples.


Introduction to Computer Science (Harvard, professor David J. Malan)


Intro to Comp. Sci. Video Course


Course description:
Introduction to Computer Science I is a first course in computer science at Harvard College for concentrators and non-concentrators alike. More than just teach you how to program, this course teaches you how to think more methodically and how to solve problems more effectively. As such, its lessons are applicable well beyond the boundaries of computer science itself. That the course does teach you how to program, though, is perhaps its most empowering return. With this skill comes the ability to solve real-world problems in ways and at speeds beyond the abilities of most humans.

Course topics:
How Computers Work, Binary. Introduction to Programming and Scratch. Threads and Programs with Multiple Scripts. Binary Numbers, Programming Languages, Working in Linux, and Programming in C. Secure File Transfer, Variable Types, and Arithmetic Operators. Standard Input Functions, Boolean Expressions, and Loops. Cryptography, Bugs, Integer Casting, and Functions. Local and Global Variables, the Stack, Return Values, and Arrays. Strings as Arrays, Command-Line Arguments, and more Cryptography. Run Times and Algorithms, Recursion. Sorting: Bubble Sort, Selection Sort, and Merge Sort. Hardware, Processors, and Implications for Software. Greedy Algorithms, Software Design and Debugging. Pointers. Pointers and Arrays, Dynamic Memory Allocation. Pointer Arithmetic, Structures, File I/O. Linked Lists. Inserting and Deleting Elements in Linked Lists, Doubly-Linked Lists. Hash Tables, Dealing with Collisions. Pointers to Pointers, Binary Search Tree, Tries, Heaps. Heapsort, Jeopardy. Huffman Coding Theory. Bitwise Operators, Underneath the Hood - From Code to Executable File. Dangerous Functions, Secure Code. The Internet and Webpages - HTTP and XHTML. Introduction to PHP. User Input, Setting up a Login Page, SQL. Threats. Introduction to LISP. Brief Introduction to System Programming and Machine Organization. Conclusions.


Data Structures (Berkeley, professor Paul Hilfinger)


Data Structures Video Lectures


Course description:
Fundamental dynamic data structures, including linear lists, queues, trees, and other linked structures; arrays strings, and hash tables. Storage management. Elementary principles of software engineering. Abstract data types. Algorithms for sorting and searching. Introduction to the Java programming language.

Course topics:
Developing a Simple Program. More on Simple Programs. Values and Containers. Simple Pointer Manipulation. Arrays and Objects. Object-Oriented Mechanisms. Interfaces and Abstract Classes. Abstract Methods and Classes, Continued. Examples of Interfaces. Misc. Support for Abstraction; Exceptions. Numbers. Algorithmic Analysis. Collections Overview. Paradox of Voting. Resource Curse. Getting a View - Sublists. Data Structures Exam Review. Trees. Trees, Searching. Generic Programming. Priority Queues, Range Queries. Hashing. Sorting. Balanced Search Structures. Pseudo-Random Sequences. Backtracking Search, Game Trees. Enumeration Types, Threads, and Concurrency. Graphs, Introduction. Graphs, Minimal Spanning Trees, Union-find. Dynamic Programming. Storage Management. Storage Management, Continued, Reflection. Data Structures Course Summary.


Compiler Optimization


Compiler Optimization Video Lectures


Course description:
This course introduces students to modern techniques in efficient implementation of programming languages. Modern processors and systems are designed based on the assumption that a compiler will be able to effectively exploit architectural resources. This course will examine in detail techniques to exploit instruction level parallelism, memory hierarchy and higher level parallelism. It will examine classic static analysis approaches to these problems and introduce newer feedback directed and dynamic approaches to optimisation.

Course topics:
Scalar Optimisation - Redundant Expressions. Scalar Optimisation - Dataflow Framework and SSA. Code Generation. Instruction Scheduling. Register Allocation. Dependence Analysis. Program Transformations. Vectorisation. Parallelisation. Adaptive and Profile Directed Compilation. Iterative + Dynamic Compilation. Dynamic Compilation. Machine Learning based Compilation.


Understanding Computers and the Internet


Understanding Computers and the Internet Videos


Video course description:
This course is all about understanding: understanding what's going on inside your computer when you flip on the switch, why tech support has you constantly rebooting your computer, how everything you do on the Internet can be watched by others, and how your computer can become infected with a worm just by turning it on. In this course we demystify computers and the Internet, along with their jargon, so that students understand not only what they can do with each but also how it all works and why. Students leave this course armed with a new vocabulary and equipped for further exploration of computers and the Internet. Topics include hardware, software, the Internet, multimedia, security, website development, programming, and dotcoms. This course is designed both for those with little, if any, computer experience and for those who use a computer every day.

Course topics:
Hardware - Computation. Overview. Bits and bytes. ASCII. Processors. Motherboards: buses, connectors, ports, slots, and sockets. Memory: ROM, RAM, and cache. Secondary storage: floppy disks, hard disks (PATA and SATA), CDs, and DVDs. Virtual Memory. Expansion buses and cards: AGP, ISA, PCI, PCI Express, and SCSI. I/O devices. Peripherals. How to shop for a computer. History. The Internet - Networks: clients and servers, peer-to-peer, LANs and WLANs, the Internet, and domains. Email: addresses; IMAP, POP and SMTP; netiquette; spam; emoticons; snail mail; and listservs. SSH. The World Wide Web: URLs and HTTP. Blogs. Instant messaging. SFTP. Usenet. Network topologies. The Internet: backbones, TCP/IP, DHCP, and DNS. NAT. Ethernet: NICs, cabling, switches, routers, and access points. Wireless: IR, RF, Bluetooth, and WiFi. ISPs. Modems: dialup, cable, and DSL. Multimedia - Graphics: file formats, bitmaps and vectors, and compression. Audio: file formats and compression. Video (and audio): file formats and compression. Streaming. Security - Threats to privacy: cookies, forms, logs, and data recovery. Security risks: packet sniffing, passwords, phishing, hacking, viruses and worms, spyware, and zombies. Piracy: WaReZ and cracking. Defenses: scrubbing, firewalls, proxy servers, VPNs, cryptography, virus scanners, product registration and activation. Website Development - Webservers: structure, permissions, and implementations. Static webpages: XHTML, well-formedness, and validity. Dynamic webpages: SSIs, DHTML, AJAX, CGI, ASPs, and JSPs. Programming - Pseudocode. Constructs: instructions, variables, conditions, branches, and loops. Languages: interpreted and compiled. Scratch.

Free Computer Science Video Lecture Courses


Here is a list of video lectures in computer science I have collected over the years.
This list is only two-thirds of all the links I have in my bookmarks; I will go through the rest of the links later. Check back.

For formal computer science education here is an overview of a bachelor degree in computer science.


Web Applications

Video Lectures at ArsDigita University
Mirror at ArsDigita
High Speed Mirror at Internet Archive
Course website

Teaches basics of designing a dynamic web site with a database back end, including scripting languages, cookies, SQL, and HTML, with the goal of building such a site as the main (group) project. Emphasizes the computer-human interface and the graphical display of information.


Structure and Interpretation of Computer Programs

Video lectures at MIT
Structure and Interpretation of Computer Programs has been MIT's introductory pre-professional computer science subject since 1981. It emphasizes the role of computer languages as vehicles for expressing knowledge and it presents basic principles of abstraction and modularity, together with essential techniques for designing and implementing computer languages. This course has had a worldwide impact on computer science curricula over the past two decades.


Structure and Interpretation of Computer Programs (a different course)

Video lectures at ArsDigita University
Mirror at ArsDigita
High Speed Mirror at Internet Archive
Course website

An introduction to programming and the power of abstraction, using Abelson and Sussman's classic textbook of the same name. Key concepts include: building abstractions, computational processes, higher-order procedures, compound data, data abstractions, controlling interactions, generic operations, self-describing data, message passing, streams and infinite data structures, meta-linguistic abstraction, interpretation of programming languages, machine model, compilation, and embedded languages.


Structure and Interpretation of Computer Programs (a different course)

Video Lectures: CS61A (Berkeley)
Course website

The CS 61 series is an introduction to computer science, with particular emphasis on software and on machines from a programmer's point of view. This first course concentrates mostly on the idea of abstraction, allowing the programmer to think in terms appropriate to the problem rather than in low-level operations dictated by the computer hardware. The next course, CS 61B, will deal with the more advanced engineering aspects of software: constructing and analyzing large programs, and techniques for handling computationally expensive programs. Finally, CS 61C concentrates on machines and how they carry out the programs you write.
In CS 61A, we are interested in teaching you about programming, not about any particular programming language. We consider a series of techniques for controlling program complexity, such as functional programming, data abstraction, object-oriented programming, and query systems. To get past generalities you must have programming practice in some particular language, and in this course we use Scheme, a dialect of Lisp. This language is particularly well-suited to the organizing ideas we want to teach. Our hope, however, is that once you have learned the essence of programming, you will find that picking up a new programming language is but a few days' work.


Data Structures

Video Lectures: CS61B (Berkeley)
Course website

The CS 61 series is an introduction to computer science, with particular emphasis on software and on machines from a programmer’s point of view. CS 61A covered high-level approaches to problem-solving, providing you with a variety of ways to organize solutions to programming problems: as compositions of functions, collections of objects, or sets of rules. In CS 61B, we move to a somewhat more detailed (and to some extent, more basic) level of programming. As in 61A, the correctness of a program is important. In CS 61B, we’re concerned also with engineering. An engineer, it is said, is someone who can do for a dime what any fool can do for a dollar. Much of 61B will be concerned with the tradeoffs in time and memory for a variety of methods for structuring data. We’ll also be concerned with the engineering knowledge and skills needed to build and maintain moderately large programs.


Machine Structures

Video Lectures: CS61C (Berkeley)
Course webpage

The subjects covered in this course include C and assembly language programming, how higher level programs are translated into machine language, the general structure of computers, interrupts, caches, address translation, CPU design, and related topics. The only prerequisite is that you have taken Computer Science 61B, or at least have solid experience with a C-related programming language.


Programming Languages

Video Lectures: CSEP505 (University of Washington)
Course website

Goals: Successful course participants will:
• Master universal programming-language concepts (including datatypes, functions, continuations, threads, macros, types, objects, and classes) such that they can recognize them in strange guises.
• Learn to evaluate the power, elegance, and definition of programming languages and their constructs.
• Attain reasonable proficiency programming in a functional style.
• Find relevant literature somewhat more approachable.


Principles of Software Engineering

Video Lectures: CS584 (University of Washington)

Course website

Study of major developments in software engineering over the past three decades. Topics may include design (information hiding, layering, open implementations), requirements specification (informal and formal approaches), quality assurance (testing, verification and analysis, inspections), reverse and re-engineering (tools, models, approaches).


Object Oriented Program Design

Video lectures at ArsDigita University
Mirror at ArsDigita
High Speed Mirror at Internet Archive
Course website

Covers the concepts of the object-oriented paradigm using Java. The basic principles of software engineering are emphasized. We study how to design and think in an object oriented fashion.


Algorithms

Video lectures at ArsDigita University
Mirror at ArsDigita
High Speed Mirror at Internet Archive
Course website

The design and analysis of algorithms is studied. Methodologies include: divide and conquer, dynamic programming, and greedy strategies. Their applications involve: sorting, ordering and searching, graph algorithms, geometric algorithms, mathematical (number theory, algebra and linear algebra) algorithms, and string matching algorithms.

We study algorithm analysis - worst case, average case, and amortized, with an emphasis on the close connection between the time complexity of an algorithm and the underlying data structures. We study NP-Completeness and methods of coping with intractability. Techniques such as approximation and probabilistic algorithms are studied for handling the NP-Complete problems.


Introduction to Algorithms

Video lectures: 6.064J/18.410J (MIT)
Course homepage

This course teaches techniques for the design and analysis of efficient algorithms, emphasizing methods useful in practice. Topics covered include: sorting; search trees, heaps, and hashing; divide-and-conquer; dynamic programming; amortized analysis; graph algorithms; shortest paths; network flow; computational geometry; number-theoretic algorithms; polynomial and matrix calculations; caching; and parallel computing.


Systems

Video Lectures at ArsDigita University
Mirror at ArsDigita
High Speed Mirror at Internet Archive
Course website

Topics on the engineering of computer software and hardware systems: techniques for controlling complexity, system infrastructure, networks and distributed systems, atomicity and coordination of parallel activities, recovery and reliability, privacy of information, impact of computer systems on society. Case studies of working systems and outside reading in the current literature provide comparisons and contrasts.


Computer System Engineering

Video Lectures: 6.033 (MIT) (first 3 lectures don't have videos)
Course homepage

This course covers topics on the engineering of computer software and hardware systems: techniques for controlling complexity; strong modularity using client-server design, virtual memory, and threads; networks; atomicity and coordination of parallel activities; recovery and reliability; privacy, security, and encryption; and the impact of computer systems on society. We will also look at case studies of working systems and readings from the current literature that provide comparisons and contrasts, and do two design projects.


Graduate Computer Architecture

Video Lectures: CS 252 (Berkeley)
Course website

This course focuses on the techniques of quantitative analysis and evaluation of modern computing systems, such as the selection of appropriate benchmarks to reveal and compare the performance of alternative design choices in system design. The emphasis is on the major component subsystems of high performance computers: pipelining, instruction level parallelism, memory hierarchies, input/output, and network-oriented interconnections.


Computer Architecture

Video Lectures: CSE P 548 (University of Washington)

Course website
The purpose of this course is to give you a broad understanding of the concepts behind several advanced microarchitectural features in today’s microprocessors and to illustrate those concepts with appropriate (usually modern) machine examples. We will cover the rationale for and the designs of strategies for instruction sets, dynamic branch prediction, multiple-instruction issue, dynamic (out-of-order) instruction scheduling, multithreaded processors, shared memory multiprocessors, and, if there is time, dataflow machines. Some of these topics require some understanding from what is normally thought of as undergraduate material; for these, we’ll briefly review that material, and then go on from there.

You will augment your knowledge of the architectural schemes by doing experimental studies that examine and compare the performance of several alternative implementations for a particular feature. Here you will learn how to design architectural experiments, how to choose metrics that best illustrate a feature’s performance, how to analyze performance data and how to write up your experiment and results - all skills computer architects, and, actually, researchers and developers in any applied subfield of computer science, use on a regular basis.


Operating Systems and System Programming

Video Lectures: CS 162 (Berkeley)
Course website
The purpose of this course is to teach the design of operating systems and other systems. Topics we will cover include concepts of operating systems and systems programming; utility programs, subsystems, multiple-program systems; processes, interprocess communication, and synchronization; memory allocation, segmentation, paging; loading and linking, libraries; resource allocation, scheduling, performance evaluation; I/O systems, storage devices, file systems; basic networking, protocols, and distributed file systems, protection, security, and privacy.


How Computers Work

Video lectures at ArsDigita University
Mirror at ArsDigita
High Speed Mirror at Internet Archive
Course website

Includes the basics of digital logical design, computer organization and architecture including assembly language, processor design, memory hierarchies and pipelining. Students examine the detailed construction of a very simple computer. A higher level view of a modern RISC architecture is studied, using the Patterson and Hennessey introductory text, from both the programmer's point of view and the hardware designer's point of view. The distinction between RISC and CISC architectures is emphasized.


Performance Analysis

Video Lectures: CSE 597 (University of Washington)

Course website

This course is intended to provide a broad introduction to computer system performance evaluation techniques and their application. Approaches considered include measurement/benchmarking, stochastic and trace driven simulation, stochastic queueing networks, and timed Petri nets.


Database Management Systems

Video Lectures at ArsDigita University
Mirror at ArsDigita
High Speed Mirror at Internet Archive
Course website

A more formal approach to Relational Database Management Systems, compared to the way they were covered in Web Applications. Database systems are discussed from the physical layer of B-trees and file servers to the abstract layer of relational design. Also includes alternative and generic approaches to database design and database management systems, including relational, object-relational, and object-oriented systems, SQL standards, algebraic query languages, integrity constraints, triggers, functional dependencies, and normal forms. Other topics include tuning database transactions, security from the application perspective, and data warehousing.


Database Management Systems

Video Lectures: CSEP544 (University of Washington)
Course website

Databases are at the heart of modern commercial application development. Their use extends beyond this to many applications and environments where large amounts of data must be stored for efficient update and retrieval. The purpose of this course is to provide an introduction to the design and use of database systems, as well as an appreciation of the key issues in building such systems, and working with multiple database systems.
We begin by covering basic aspects of SQL, and illustrating several data management concepts through SQL features (e.g., views, constraints, and triggers). Next, we study conceptual database design and normalization theory. We then study management of XML data, and cover the XPath and XQuery languages. We consider the issues arising in data integration from multiple databases, and more generally, issues in managing meta-data. Finally, we cover the basic aspects of the internals of database systems.


Transaction Processing for E-Commerce

Video Lectures: CSEP545 (University of Washington)

Course website

Course covers Database Concurrency Control, Database Recovery, Basic Application Servers, Two-Phase Commit, Queuing, Replication, Application Servers.


Practical Aspects of Modern Cryptography

Video Lectures: 950TU (University of Washington)
Course website
Course covers Symmetric Key Ciphers and Hashes, Public Key Ciphers, Analysis of Block Ciphers, AES and Attacks on Cryptographic Hashes, Certificates, Trust & PKI, Public Key Cryptography, Digital Rights Management, The Politics of Cryptography


Theory of Computation

Video Lectures at ArsDigita University
Mirror at ArsDigita
High Speed Mirror at Internet Archive
Course website

A theoretical treatment of what can be computed and how fast it can be done. Applications to compilers, string searching, and control circuit design will be discussed. The hierarchy of finite state machines, pushdown machines, context free grammars and Turing machines will be analyzed, along with their variations. The notions of decidability, complexity theory and a complete discussion of NP-Complete problems round out the course.


Artificial Intelligence (4 lectures)

Video Lectures at ArsDigita University
Mirror at ArsDigita
High Speed Mirror at Internet Archive
Course website

A quick overview of AI from both the technical and the philosophical points of view. Topics discussed include search, A*, knowledge representation, and neural nets.


Applications of Artificial Intelligence

Video Lectures: CSE592 (University of Washington)

Course website
Introduction to the use of Artificial Intelligence tools and techniques in industrial and company settings. Topics include: foundations (search, knowledge representation) and tools such as expert systems, natural language interfaces and machine learning techniques.



Related Posts


Google Tech-Talk Computer Science Video Lectures
(Lectures on theoretical and practical aspects of computer science such as: creative commons licensing issues, grid clusters, debian linux testing, python programming language, computer security, networking, click fraud, reusable software components, ruby programming language, privacy, service monitoring)


More Mathematics and Theoretical Computer Science Video Lectures
(Includes algebra, elementary statistics, applied probability, finite mathematics, trigonometry with calculus, mathematical computation, pre-calculus, analytic geometry, first year calculus, business calculus, mathematical writing (by Knuth), computer science problem seminar (by Knuth), dynamic systems and chaos, computer musings (by Knuth) and other Donald E. Knuth lectures)


Computer Science Lectures
(Courses include higher computing (intro to theory of computation), intro to computer science, data structures, compiler optimization, computers and internet, intro to clojure, the akamai story, cryptography, EECS colloquium videos at Case Western Reserve University)


Computer Science Courses
(Includes introduction to computer science and computing systems, computational complexity and quantum computing, the c programming language, multicore programming, statistics and data mining, combinatorics, software testing, evolutionary computation, deep learning, data structures and algorithms and computational origami.)

Computer Science Video Courses


This time I bring you a mixture of video lectures and full video courses for computer science undergraduates and graduates.

Courses come from engineering, mathematics and comp. sci. departments of various universities.

This post can be considered a follow-up on previous posts on computer science lectures: 1, 2, 3, 4, 5, 6, 7, 8.

See the right menu bar for more lectures specifically in engineering and mathematics!


Introduction to Computer Graphics (35 lectures)


Intro to Computer Graphics Video Lectures


Course topics:
1. Introduction. 2, 3. Raster Graphics. 4. Clipping. 5. Polygon Clipping and Polygon Scan Conversion. 6, 7. Transformations. 8, 9. 3D Viewing. 10-15. Curves. 16-19. Surfaces. 20. Hierarchical Models. 21-23. Rendering. 24-27. Ray Tracing. 28, 29. Hidden Surface Elimination. 30-32. Fractals. 33-35. Computer Animation.


Computer Graphics (43 Video Lectures)


Computer Graphics Video Lectures


Course topics:
1. Introduction to computer graphics. 2-5. CRT Display Devices. 6. Transformations. 7. Transformations in 2D. 9, 10. Three Dimensional Graphics. 11. Project Transformations and Viewing Pipeline. 12. 3D Viewing. 13-17. Scan Converting Lines, Circles and Ellipses. 18, 19. PolyFill Scan Conversion of a Polygon. 20-22. Clipping: Lines And Polygons. 23-25. Solid Modelling. 26-32. Visible Surface Detection. 33-35. Illumination and Shading. 36, 37. Curve Representation. 38. Curves and Surface Representation. 39. Graphics Programming. 40. Graphics Programming Using OpenGL. 41. Advanced Topics. 42, 43. Digital Image Processing.


Discrete Mathematics for Programmers (40 Video Lectures)


Discrete Maths Video Lectures


Course topics:
1, 2. Propositional Logic. 3, 4. Predicates & Quantifiers. 5. Logical Inference. 6. Resolution Principles & Application to PROLOG. 7. Methods of Proof. 8. Normal Forms. 9. Proving programs correct. 10. Sets. 11. Mathematical Induction. 12. Set Operations on Strings Over an Alphabet. 13. Relations. 14, 15. Graphs. 16. Trees. 17. Trees and Graphs. 18. Special Properties of Relations. 19, 20. Closure of Relations. 21. Order Relations. 22. Order and Relations and Equivalence Relations. 23. Equivalence relations and partitions. 24-26. Functions. 27-29. Permutations and Combinations. 30-31. Generating Functions. 32-34. Recurrence Relations. 35-37. Algebras. 38-39. Finite State Automaton (FSA). 40. Lattices.


Mathematical Analysis (32 Video Lectures)


Math Analysis Video Lectures


Course topics:
1. Real Numbers. 2. Sequences I. 3. Sequences II. 4. Sequences III. 5. Continuous Functions. 6. Properties of Continuous Functions. 7. Uniform Continuity. 8. Differentiable Functions. 9. Mean Value Theorems. 10. Maxima - Minima. 11. Taylor's Theorem. 12. Curve Sketching. 13. Infinite Series I. 14. Infinite Series II. 15. Tests of Convergence. 16. Power Series. 17. Riemann Integral. 18. Riemann Integrable Functions. 19. Applications of the Riemann Integral. 20. Length of a Curve. 21. Line Integrals. 22. Functions of Several Variables. 23. Differentiation. 24. Derivatives. 25. Mean Value Theorem. 26. Maxima - Minima. 27. Method of Lagrange Multipliers. 28. Multiple Integrals. 29. Surface Integrals. 30. Green's Theorem. 31. Stokes' Theorem. 32. Gauss Divergence Theorem.


Introduction To Problem Solving and Programming (23 Video Lectures)


Problem Solving Video Lectures


Topics include:
Introduction to Computers, Algorithms and Programming. Data Types, Strings, and Input and Output. Flow Control. Arrays. Data Structures. Dynamic Data Structures. Recursion. Various Mathematics Problems and their solutions.


Computing I (CSCI 230, Indiana and Purdue Universities)


Computing Video Lectures


Course topics:
Introduction to Computers - Hardware and Languages. C Programming. Basic Input/Output. Variable Declarations, Data Types, Expressions. Program Control. Functions. Information Representation. Arrays. Programming Ethics. Pointers. Characters and Strings. Structures. Models of Computation.


Computing II (CSCI 240, Indiana and Purdue Universities)


Computing Video Lectures


Course topics:
Advanced Programming. C++ Programming. Concepts in Object-Oriented Design. Classes. Stream Input/Output Basics. Abstract Data Types. Operator Overloading. Inheritance. Virtual Functions. Exception Handling. Templates. Standard Template Library (STL). History of Graphical User Interfaces. Introduction To Visual Programming. QT Designer. Boolean Algebra. Digital Logic. Analysis of Algorithms. Elementary Data Structures. Recursion. Abstract Data Types. Elementary Sorting Algorithms. Quicksort. Mergesort.


Numerical Methods and Programming (38 Video Lectures)


Numerical Methods Video Lectures


Course topics:
1. Programming Basics. 2. Introduction to Pointers. 3. Pointers and Arrays. 4. External Functions and Argument Passing. 5. Representation of Numbers. 6. Numerical Error. 7. Error Propagation and Stability. 8. Polynomial Interpolation I. 9. Polynomial Interpolation II. 10. Error in Interpolation Polynomial. 11. Polynomial Interpolation. 12. Cubic Spline Interpolation. 13, 14. Data Fitting: Linear Fit. 15. Data Fitting: Nonlinear Fit. 16. Matrix Elimination and Solution. 17. Solution to Linear Equations. 18. Matrix Elimination. 19. Eigenvalues of a Matrix. 20. Eigenvalues and Eigenvectors. 21. Solving Nonlinear Equations. 22. Solving Nonlinear Equations with Newton's Method. 23. Methods for Solving Nonlinear Equations. 24. Systems of Nonlinear Equations. 25. Numerical Derivatives. 26. Higher-Order Derivatives from Difference Formulas. 27. Numerical Integration: Basic Rules. 28. Comparison of Different Basic Rules. 29. Gaussian Rules. 30. Comparison of Gaussian Rules. 31. Solving Ordinary Differential Equations I. 32. Solving Ordinary Differential Equations II. 33. Adaptive Step Size Runge-Kutta Scheme. 34. Partial Differential Equations. 35. Explicit and Implicit Methods. 36. Crank-Nicolson Scheme for Two Spatial Dimensions. 37. Fourier Transforms. 38. Fast Fourier Transforms.
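To give a flavor of topic 22, Newton's method iterates x ← x − f(x)/f'(x) until the step size is tiny. A minimal Python sketch (the function, starting point, and tolerance below are illustrative, not taken from the course):

```python
def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Newton's method: iterate x <- x - f(x)/f'(x) until the step is below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("did not converge")

# Find the square root of 2 as the positive root of x^2 - 2 = 0.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
print(root)  # ~1.4142135623731
```

Convergence is quadratic near a simple root, which is why the lectures contrast it with slower bracketing methods.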


Convex Optimization (EE364a, Stanford University)


Course Website
Convex Optimization Video Lectures


Taught by Professor Stephen Boyd in the Winter Quarter of 2007/2008.

Course description:
Concentrates on recognizing and solving convex optimization problems that arise in engineering. Convex sets, convex functions, and convex optimization problems. Basics of convex analysis. Least-squares, linear and quadratic programs, semidefinite programming, minimax, extremal volume, and other problems. Optimality conditions, duality theory, theorems of alternative, and applications. Interior-point methods. Applications to signal processing, control, digital and analog circuit design, computational geometry, statistics, and mechanical engineering.

Available lecture notes on topics:
Convex sets, Convex functions, Convex optimization problems, Duality, Approximation and fitting, Statistical estimation, Geometric problems, Numerical linear algebra background, Unconstrained minimization, Equality constrained minimization, Interior-point methods, Convex optimization examples, Filter design and equalization, Disciplined convex programming and CVX.


Introduction to Linear Dynamical Systems (EE263, Stanford University)


Course website
Linear Dynamical Systems Video Lectures


Taught by Professor Stephen Boyd in the Autumn Quarter of 2007/2008.

Course description:
Introduction to applied linear algebra and linear dynamical systems, with applications to circuits, signal processing, communications, and control systems. Topics include: Least-squares approximations of over-determined equations and least-norm solutions of underdetermined equations. Symmetric matrices, matrix norm and singular value decomposition. Eigenvalues, left and right eigenvectors, and dynamical interpretation. Matrix exponential, stability, and asymptotic behavior. Multi-input multi-output systems, impulse and step matrices; convolution and transfer matrix descriptions. Control, reachability, state transfer, and least-norm inputs. Observability and least-squares state estimation.
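The least-squares idea in the description can be shown with a tiny pure-Python example: fitting a line y = a + b·x to over-determined data via the closed-form normal equations (the data points are made up for illustration; the course works with general matrices):

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a + b*x via the closed-form normal equations."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a = (sy - b * sx) / n                          # intercept
    return a, b

# Four points that lie exactly on y = 1 + 2x, so the fit recovers a=1, b=2.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # 1.0 2.0
```

The same idea, written as minimizing ||Ax − b||, generalizes to any over-determined linear system, which is the form the lectures use.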


Intelligent Systems Control (32 Video Lectures)


Intelligent Systems Control Video Lectures


Course topics include:
Methods of system analysis. Design of intelligent systems for control. Programming of computations for intelligent systems and their communications. Numerical methods and optimization. Linear programming. Discrete programming. Dynamic (dynamical) programming. System learning. Reinforcement learning. Optimal solution estimation. Bayesian networks. Neural networks. Expert systems. Genetic algorithms.


Programming Systems Seminar Series

See the Intel Research Berkeley Programming Systems Seminar Series for more information!


Towards a Memory Model for C (by Hans Boehm)
Video Lecture - Low Speed (90 MB) or High Speed (600 MB)
Lecture Notes


Software and the Concurrency Revolution (by Herb Sutter)
Video Lecture
Lecture Notes


Static Extended Checking for Cyclone (by Greg Morrisett)
Video Lecture
Lecture Notes


Design and Implementation of Python (by Guido van Rossum)
Video Lecture
Lecture Notes


Parallel Programming and Code Selection in Fortress (by Guy Steele)
Video Lecture
Lecture Notes


How Simply and Understandably Could The "Personal Computing Experience" Be Programmed? (by Alan Kay)
Video Lecture
Lecture Notes


Contracts in Eiffel: old and new uses (by Bertrand Meyer)
Video Lecture
Lecture Notes


An outline of C++0x (by Bjarne Stroustrup)
Video Lecture
Lecture Notes


Multithreaded Programming in Cilk (by Charles Leiserson)
Video Lecture
Lecture Notes


Faith, Evolution, and Programming Languages: From Haskell to Java (by Philip Wadler)
Video Lecture
Lecture Notes




Have fun with these lectures and get smarter! :)
Don't forget to check out the right menu bar for more lectures! Until next time!


Related Posts

Free Computer Science Video Lecture Courses
(Courses include web application development, lisp/scheme programming, data structures, algorithms, machine structures, programming languages, principles of software engineering, object oriented programming in java, systems, computer system engineering, computer architecture, operating systems, database management systems, performance analysis, cryptography, artificial intelligence)


Programming Lectures and Tutorials
(Lectures include topics such as software engineering, javascript programming, overview of firefox's firebug extension, document object model, python programming, design patterns in python, java programming, delphi programming, vim editor and sqlite database design)


Programming, Networking Free Video Lectures and Other Interesting Ones
(Includes lectures on Python programming language, Common Lisp, Debugging, HTML and Web, BGP networking, Building scalable systems, and as a bonus lecture History of Google)


Computer Science Lectures
(Courses include higher computing (intro to theory of computation), intro to computer science, data structures, compiler optimization, computers and internet, intro to clojure, the akamai story, cryptography, EECS colloquium videos at Case Western Reserve University)


More Mathematics and Theoretical Computer Science Video Lectures
(Includes algebra, elementary statistics, applied probability, finite mathematics, trigonometry with calculus, mathematical computation, pre-calculus, analytic geometry, first year calculus, business calculus, mathematical writing (by Knuth), computer science problem seminar (by Knuth), dynamic systems and chaos, computer musings (by Knuth) and other Donald E. Knuth lectures)



Computer Science Courses
(Includes introduction to computer science and computing systems, computational complexity and quantum computing, the c programming language, multicore programming, statistics and data mining, combinatorics, software testing, evolutionary computation, deep learning, data structures and algorithms and computational origami.)

Free Computer Science Courses

Posted by Hollywood Updates

Hi everyone! This month I have a tasty collection of free computer science courses. All the courses have videos included.

The topics include: Introduction to computer science. Computational complexity and quantum computing. Introduction to computing systems. The C programming language. Multicore programming. Statistics and data mining. Combinatorics. Software testing. Evolutionary computation. Deep learning. Data structures and algorithms. Bonus lecture: Computational origami.


Introduction to Computer Science and Programming (MIT 6.00)


Course Website
Programming Video Lectures
Exams and Solutions

Course description:
This subject is aimed at students with little or no programming experience. It aims to provide students with an understanding of the role computation can play in solving problems. It also aims to help students, regardless of their major, to feel justifiably confident of their ability to write small programs that allow them to accomplish useful goals. The class will use the Python programming language.

Course topics:
What is computation; introduction to data types, operators, and variables. Operators and operands; statements; branching, conditionals, and iteration. Common code patterns: iterative programs. Decomposition and abstraction through functions; introduction to recursion. Floating point numbers, successive refinement, finding roots. Bisection methods, Newton/Raphson, introduction to lists. Lists and mutability, dictionaries, pseudocode, introduction to efficiency. Complexity; log, linear, quadratic, exponential algorithms. Binary search, bubble and selection sorts. Divide and conquer methods, merge sort, exceptions. Testing and debugging. More about debugging, knapsack problem, introduction to dynamic programming. Dynamic programming: overlapping subproblems, optimal substructure. Analysis of knapsack problem, introduction to object-oriented programming. Abstract data types, classes and methods. Encapsulation, inheritance, shadowing. Computational models: random walk simulation. Presenting simulation results, Pylab, plotting. Biased random walks, distributions. Monte Carlo simulations, estimating pi. Validating simulation results, curve fitting, linear regression. Normal, uniform, and exponential distributions; misuse of statistics. Stock market simulation. Course overview; what do computer scientists do?
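The bisection method mentioned in the topics above is simple enough to sketch in a few lines of Python (the course's own code may differ; this is just an illustration):

```python
def bisect(f, lo, hi, tol=1e-12):
    """Bisection: repeatedly halve an interval [lo, hi] on which f changes sign."""
    assert f(lo) * f(hi) < 0, "f must change sign on the interval"
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid  # root is in the left half
        else:
            lo = mid  # root is in the right half
    return (lo + hi) / 2

# Approximate sqrt(2) as the root of x^2 - 2 on [1, 2].
r = bisect(lambda x: x * x - 2, 1.0, 2.0)
print(r)  # ~1.41421356...
```

Each iteration halves the interval, so the method is slow but guaranteed, which is why the course pairs it with the faster Newton/Raphson iteration.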


Computational Complexity and Quantum Computation (Tim Gowers)


Comp. Complexity and Quantum Computation Video Lectures


Course description:
Computational complexity is the study of what resources, such as time and memory, are needed to carry out given computational tasks, with a particular focus on lower bounds for the amount of these resources needed. Proving any result of this kind is notoriously difficult, and includes the famous problem of whether P = NP. This course focuses on two major results in the area. The first is a lower bound, due to Razborov, for the number of steps needed to determine whether a graph contains a large clique, if only "monotone" computations are allowed. This is perhaps the strongest result in the direction of showing that P and NP are distinct (though there is unfortunately a very precise sense in which the proof cannot be developed into a proof of the whole conjecture). The second is Peter Shor's remarkable result that a quantum computer can factorize large integers in polynomial time.

Course topics:
Equivalence between Turing machines and the circuit model of computation. Final details needed for the quantum Fourier transform, such as how to "uncompute". Solving the discrete logarithm problem. Definitions of P, NP and NP-complete, with examples. A demonstration that Clique is NP-complete, and some lower-bound complexity proofs that don't work. Razborov's proof that no monotone circuit can solve Clique in polynomial time. No "natural proof" exists for proving a separation between P and NP if one-way functions exist. We then move into a mathematician's description of quantum computation, starting from probabilistic computation. Description of quantum computation continued. We start describing Shor's factoring algorithm.
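Shor's factoring result rests on a classical reduction: once the period r of a^x mod N is known, gcd(a^(r/2) ± 1, N) usually yields a nontrivial factor. The quantum part of the algorithm finds r efficiently; in this hedged Python sketch the period is found by brute force, so only the classical reduction is illustrated:

```python
from math import gcd

def period(a, n):
    """Smallest r > 0 with a**r == 1 (mod n), found naively by brute force."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical post-processing of Shor's algorithm for a chosen base a."""
    r = period(a, n)
    if r % 2 == 1:
        return None  # odd period: try a different base a
    y = pow(a, r // 2, n)
    f = gcd(y - 1, n)
    return f if 1 < f < n else None

print(shor_classical(15, 7))  # 3, since 7 has period 4 mod 15 and gcd(7^2 - 1, 15) = 3
```

The exponential cost hides entirely in `period`; replacing that single step with quantum period-finding is what makes the full algorithm polynomial.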


Introduction to Computing Systems


Course Website
Computing Systems Video Lectures
Exams


Course topics:
Computer systems organized as a systematic set of transformations; representation using bits. Bits and Operations on Bits: unsigned and signed integers; arithmetic and logical operations; ASCII; floating point; hexadecimal notation. Digital Logic Structures: gates; combinational logic; storage elements. Digital Logic Structures: memory; sequential logic; clock. The von Neumann Model: basic concepts; instruction processing; sequencing. The LC-3: Instruction Set Architecture. The LC-3: Example program in LC-3 machine language; LC-3 datapath. Programming: problem solving using systematic decomposition, more examples, debugging. LC-3 Assembly Language; examples; assembly process. I/O abstractions: input from the keyboard, output to the monitor. Repeated Code: TRAPs and subroutines; Examples. Stacks; Executing subroutines with stacks. Introduction to C; Variables and Operators: basic data types, simple operators, examples. Operators: simple operators, memory allocation of variables, examples. Control Structures: conditional constructs. Control Structures: iterative constructs, comprehensive examples, problem solving. Functions: introduction, syntax, run-time stack. Functions: activation records, examples. Pointers and Arrays: introduction, problem solving, examples. Arrays: 2D arrays, examples. Testing and Debugging: introduction, error taxonomy, using a debugger. Recursion: introduction, basic example, example showing run-time stack. Input and Output in C: standard library, basic I/O calls, file I/O, example. Basic Data Structures: introduction. Basic Data Structures: structures, defining new types, enumerations, dynamic memory allocation. Basic Data Structures: linked lists. Basic Data Structures: linked lists, linked structure traversal. Comprehensive Case Study: sorting. Simple Guide to C++: Design, abstractions, and implementation. Course Wrap-up and Advice for Sophomore System Builders.


Multicore Programming (MIT 6.189)


Course Website
Multicore Programming Video Lectures


Course description:
The course serves as an introductory course in parallel programming. It offers a series of lectures on parallel programming concepts as well as a group project providing hands-on experience with parallel programming. The students will have the unique opportunity to use the cutting-edge PLAYSTATION 3 development platform as they learn how to design and implement exciting applications for multicore architectures.

Course topics:
Introduction to Cell processor. Introduction to parallel architectures. Introduction to concurrent programming. Parallel programming concepts. Design patterns for parallel programming I. Design patterns for parallel programming II. StreamIt language. Debugging parallel programs. Performance monitoring and optimizations. Parallelizing compilers. StreamIt parallelizing compiler. Star-P. Synthesizing parallel programs. Cilk. Introduction to game development. The Raw experience. The future.


Statistical Aspects of Data Mining (Data Mining at Google)


Data Mining Video Lecture 1
Data Mining Video Lecture 2
Data Mining Video Lecture 3
Data Mining Video Lecture 4
Data Mining Video Lecture 5
Data Mining Video Lecture 6
Data Mining Video Lecture 7
Data Mining Video Lecture 8
Data Mining Video Lecture 9
Data Mining Video Lecture 10
Data Mining Video Lecture 11
Data Mining Video Lecture 12
Data Mining Video Lecture 13
Course Website


This is a talk series given at Google by David Mease, based on a Master's-level statistics course he is teaching this summer at Stanford.

Course topics:
1. Discussion of locations of potentially useful data (grocery checkout, apartment door card, elevator card, laptop login, traffic sensors, cell phone, google badge, etc). Note mild obsession with consent. Overview of predicting future vs describing patterns, and other broad areas of data mining. Intro to R. 2. Data. Reading datasets into excel and R. Observational (data mining) vs Experimental data. Qualitative analysis vs quantitative analysis. Nominal vs ordinal. 3. Sampling. 4. Empirical distribution function. Histograms. Plots. 5. Overlaying multiple plots. Statistical significance. Labels in plots. 6. Box plots. Color in plots. Installing R packages. ACCENT principles and Tufte. 7. Association Rules. Measures of location. Measures of spread. Measures of association. Frequent itemsets. Similar to conditional probabilities. 8. More association rule mining. Support and confidence calculations. Personalization using rules. Beyond support and confidence. 9. Review. 10. Data Classification. A negative view of decision trees. Decision trees in R. Algorithms for generating decision trees. 11. More decision trees. Gini index. Entropy. Pruning. Precision, recall, f-measure, and ROC curve. 12. Nearest Neighbor. KNN. Support Vector Machines. Adding 'slack' variables, using basis functions to make the space linearly separable. Some comments on Stats vs ML. Intro to ensemble (uncorrelated) classifiers. Random Forests. AdaBoost - Adaptive Boosting. Some discussion of limits of classifiers (nondeterministic observational datasets). Clustering. K-Means.
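The support and confidence calculations from talks 7 and 8 are easy to state in code. A small Python sketch over a toy transaction list (the baskets and the rule are invented for illustration; the course does this in R):

```python
def support(transactions, itemset):
    """Fraction of transactions that contain every item in itemset."""
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, lhs, rhs):
    """Confidence of the rule lhs -> rhs: support(lhs and rhs) / support(lhs)."""
    return support(transactions, set(lhs) | set(rhs)) / support(transactions, lhs)

baskets = [{"milk", "bread"}, {"milk", "eggs"},
           {"bread", "eggs"}, {"milk", "bread", "eggs"}]
print(support(baskets, {"milk", "bread"}))       # 0.5: 2 of 4 baskets have both
print(confidence(baskets, {"milk"}, {"bread"}))  # 2/3 of milk baskets also have bread
```

Confidence is exactly the conditional probability estimate P(rhs | lhs) on the data, which is the "similar to conditional probabilities" point in talk 7.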


Data Structures and Algorithms (COMP1927, UNSW)


Data Structures and Algorithms Video Lectures


Lectures by Richard Buckland from The University of New South Wales.

Course topics:
Abstract data types. Stacks. Queues. Recursion. Time and Space Complexity. Big Oh Notation. Complexity Analysis. BFS (Breadth First Search). DFS (Depth First Search). Trees. Tree Algorithms. Self Balancing Trees. Graphs and Graph Algorithms. C99 Extensions. Unit Testing. Debugging. Pair Programming.
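As a small illustration of the BFS topic above, here is a sketch on an adjacency-list graph. The course itself works in C; this Python version and its toy graph are only illustrative:

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first search: visit vertices in order of distance from start."""
    seen, order, queue = {start}, [], deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return order

g = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs(g, "A"))  # ['A', 'B', 'C', 'D']
```

Swapping the queue for a stack (and popping from the same end) turns this into DFS, which is why the two searches are usually taught together.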


Peter Gibbons Memorial Lecture Series


Peter Gibbons Memorial Video Lectures


The Combinatorics at the Heart of the Problem


Making Software Testing Easier


Developing Darwin's Computer


Technologies for Deep Learning



Computational Origami


Erik Demaine on Computational Origami


Lecture description:
As a glassblower, Tetris master, magician, and mathematician, the MIT professor has spent his life exploring the mysterious and fascinating relationships between art and geometry. Here, he discusses the potential of lasers, leopard spots, and computer science to breathe new life into everything from architecture to origami. Demaine has a "hard time distinguishing art from mathematics." His approach to art has a strong emphasis on collaboration, which, as he says, is a rare thing in art. Demaine is a professor of computer science and mathematics who realized that "mathematics (itself) is an art form." During the talk, he also mentions Escher's study of mathematics.


Have fun with these lectures!


Related Posts

Free Computer Science Video Lecture Courses
(Courses include web application development, lisp/scheme programming, data structures, algorithms, machine structures, programming languages, principles of software engineering, object oriented programming in java, systems, computer system engineering, computer architecture, operating systems, database management systems, performance analysis, cryptography, artificial intelligence)


Programming Lectures and Tutorials
(Lectures include topics such as software engineering, javascript programming, overview of firefox's firebug extension, document object model, python programming, design patterns in python, java programming, delphi programming, vim editor and sqlite database design)


Programming, Networking Free Video Lectures and Other Interesting Ones
(Includes lectures on Python programming language, Common Lisp, Debugging, HTML and Web, BGP networking, Building scalable systems, and as a bonus lecture History of Google)


More Mathematics and Theoretical Computer Science Video Lectures
(Includes algebra, elementary statistics, applied probability, finite mathematics, trigonometry with calculus, mathematical computation, pre-calculus, analytic geometry, first year calculus, business calculus, mathematical writing (by Knuth), computer science problem seminar (by Knuth), dynamic systems and chaos, computer musings (by Knuth) and other Donald E. Knuth lectures)




Pure Computer Science
(Includes basics of computation theory, intro to computer science, data structures, compiler optimization, intro to computers and internet, intro to clojure, and some videos from EECS colloquium at Case Western Reserve University.)