Sunday, 14 June 2009

REAL TIME GRAPHICS AND RENDERING


The presentation focused on giving the general concepts of how graphics can be produced by computer programs. Several terms were used in the presentation. A raster image is a two-dimensional image made up of pixels; a vector image is an image described by mathematical algorithms, including three-dimensional images; modeling is the process of creating digital representations of actual things; texturing is the process of building up a surface; animation is the rapid display of images in sequence; and rendering is the final process of creating an actual 2D image or animation from the prepared scene.
The history of computer graphics goes back as early as the 1960s, the time when computer technology was rapidly gaining the power to stand on its own feet, with the Sketchpad program, the Spacewar game and other animated graphics. In the 1990s many films were created with graphical animation, e.g. Terminator 2, Toy Story and Beauty and the Beast. Many scenes contained 3D animated objects which were flat shaded with bright colours so as to blend with the characters.
Computer graphics refers to any picture or series of pictures generated with the aid of a computer. In this discussion, computer graphics refers to 3D graphics and not images created with 2D painting programs such as Photoshop, GIMP or Painter. The difference between 2D and 3D is that most 2D software is bitmap (raster) based, while 3D software is vector based. Real-time graphics refers to computer graphics generated fast enough to be displayed as they are computed.
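To tie the terms above together, here is a small sketch of my own (not from the presentation): a circle is described mathematically, which is the vector idea, and is then rendered into a grid of pixels, a raster image, written out as a simple PPM file. The file name and the image size are arbitrary choices for the illustration.

# A minimal sketch (illustrative only): render a mathematically described circle
# (vector description) into a grid of pixels (raster image) saved as out.ppm.
WIDTH, HEIGHT, RADIUS = 64, 64, 20

rows = []
for y in range(HEIGHT):
    row = []
    for x in range(WIDTH):
        # Inside the mathematical circle -> bright pixel, outside -> dark pixel.
        inside = (x - WIDTH // 2) ** 2 + (y - HEIGHT // 2) ** 2 <= RADIUS ** 2
        row.append("255 255 255" if inside else "30 30 30")
    rows.append(" ".join(row))

with open("out.ppm", "w") as f:                   # PPM: a very simple raster format
    f.write(f"P3\n{WIDTH} {HEIGHT}\n255\n" + "\n".join(rows) + "\n")
print("wrote out.ppm")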
The field is applicable in the movie industry, where computer graphics characters are designed and incorporated into films. In some cases fully computer-generated movies are produced, e.g. Madagascar and Bee Movie; in other cases movies that combine computer graphics characters and human actors are produced, e.g. The Lord of the Rings and Star Wars. The field is also applicable in car design, for modelling the engine and the whole vehicle. Games are designed by creating actors with computer graphics and animating the models. Medical imaging and scientific visualization use computer graphics for teaching tools and diagnosis.
The advantages of this field are:
>It minimizes the cost of actual production/construction in the field of engineering.
>It gives more flexibility and allows complex effects and movies to be produced in film production.
>In games it allows realistic objects to be designed and programmed so that gaming becomes more interesting.
>In medicine, computer graphics simulation of body organs makes the process of studying the body easier.
The disadvantages of computer graphics are as follows:
>It takes a lot of time to design the models, depending on the size of the scene.
>Computer graphics applications have complicated interfaces which are difficult for novice people to adopt.
>It requires fast machines for fast rendering.
>It takes the place of human actors in some movies/films, so these people lose their jobs.
Computer graphics is a very complicated field, though it is important to many fields like medicine, engineering, film production and other technologies.

Reference:
T. Strothotte, Computational Visualization: Graphics, Abstraction and Interactivity, Springer Verlag, 1998.

J. Foley, A. van Dam, S. Feiner, and J. Hughes, Computer Graphics: Principles and Practice (2nd Ed), Addison-Wesley, 1990.

IRIS RECOGNITION

The topic aimed to give general ideas on how to identify a human using the iris. The topic was well organized and presented to the class. The iris is the muscle within the eye that regulates the size of the pupil, controlling the amount of light that enters the eye. Iris recognition is the process of analyzing the random pattern of the iris.
Before recognition, the iris is located using landmark features. Localization of the iris is an important step in iris recognition because, if it is done improperly, it can introduce noise. It also requires the use of a high-quality digital camera, but nowadays commercial iris cameras that use infrared light to illuminate the iris without causing harm are used.
Vectors describing the iris are extracted, converted into a code and stored in 256 bytes. Each vector gives the position of the area it was formed from, together with information on the orientation and spatial frequency of that area. Iris patterns are described by an iris code built from the phase information collected. The polar coordinate description of the iris also includes control bytes that are used to exclude eyelashes, reflections and other unwanted data.
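To make the comparison step concrete, here is a minimal sketch of my own (not the actual algorithm from the presentation): two hypothetical 2048-bit iris codes are compared with a normalized Hamming distance, with mask bits standing in for the control bytes that exclude eyelashes and reflections. The code length, the amount of simulated noise and the acceptance threshold are made-up numbers.

# A minimal sketch (illustrative only) of comparing two binary iris codes.
import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fraction of usable bits that disagree between two binary iris codes."""
    usable = mask_a & mask_b                      # bits valid in both codes
    disagreements = (code_a ^ code_b) & usable
    return disagreements.sum() / max(usable.sum(), 1)

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, 2048, dtype=np.uint8)     # stored template
probe = enrolled.copy()
probe[rng.choice(2048, 100, replace=False)] ^= 1        # ~5% of bits flipped (noise)
mask = np.ones(2048, dtype=np.uint8)                    # all bits usable in this toy case

d = hamming_distance(enrolled, probe, mask, mask)
print("match" if d < 0.32 else "no match", round(d, 3))  # 0.32 is an illustrative threshold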
The advantages of iris recognition are:
1. Ease of use.
2. Certain conclusions and a minimal error rate.
3. Since the iris is unique, it ensures maximum security.
4. Identification of persons who have limited use of their hands or arms.
5. Stability, or template longevity.

Challenges of iris recognition:
• Scanning at a distance.
• Cost.
• Border control.
• Environmental challenges.
• The need for high-quality images.

There are many other ways to recognize someone, like speaker recognition, fingerprints and others, but using iris recognition can be helpful in ensuring security by identifying the real person.

References
1. Digital Image Processing (2nd Edition), R. C. Gonzalez, R. E. Woods, 2002.
2. Biometrics, J. D. Woodward, N. M. Orlans, P. T. Higgins, McGraw-Hill Osborne Media, 2002.
ARTIFICIAL INTELLIGENCE
This was a very interesting topic, as it gave very important concepts about the simulation of human behaviour. Edward Coelestin and Dastan Kamuzola presented the topic. Artificial intelligence is the phenomenon of imitating human behaviour. The presentation focused on how some mechanical devices can actually be demonstrated to behave with some degree of intelligence, like a human being.
The history of artificial intelligence goes back as far as ancient Greek mythology. In 1950 A. M. Turing introduced the Turing test as a way of operationalizing a test of intelligent behaviour. In modern history, for example, the Nomad robot explored remote regions of Antarctica looking for meteorite samples and brought back good results. Artificial intelligence is now the capability of a device to perform functions that are normally associated with human intelligence, e.g. reasoning, thinking and interpreting, while robotics is the science of robots and their design, manufacture and applications.
Artificial intelligence is characterized by the following:
1. Reasoning: it depends on logic criteria inserted by the programmer; otherwise the robot does not reason as people do (e.g. voice recognition).
2. Multi-agent planning: it can perform different tasks at once.
3. Communication: an artificial intelligence can communicate according to the information it is given.
4. Identification (e.g. fingerprints, the Nomad exploration robot).
5. Manipulation of objects.
6. Interpretation.
Examples of artificial intelligence:
1. Machine translators
2. Location detection, e.g. tags
3. Automatic essay assessment
4. Electronic sensors, e.g. for weapons, drugs etc.
5. The black box in an airplane
6. Robots

AI can be applied in areas such as:
Supermarkets
Scientific experimentation
Sport and games
Domestic activities
Location detection
Security affairs


Some of the advantages of AI are:
It simplifies work
Efficiency and accuracy
It reduces ambiguity, e.g. fingerprints
It enhances good communication (natural language understanding and translation)

When expecting the good, the bad also comes.
Some disadvantages of AI are:
The initial cost is high
It requires highly skilled experts
It leads to loss of jobs
It is difficult to implement, especially in some third-world countries

After modern computers became available, following the Second World War, it became possible to create programs that perform difficult intellectual tasks, and tools have been constructed which have applications in a wide variety of everyday problems.


Reference:
Feigenbaum, E.A. & Feldman, J. (eds.) Computers and Thought. NY: McGraw-Hill, 1963.
Gardner, Martin. Logic Machines & Diagrams. NY: McGraw-Hill, 1958.
McCorduck, Pamela. Machines Who Think. San Francisco: W.H. Freeman, 1979

Saturday, 13 June 2009

PROGRAMMING AND COMPILER

The topic aimed to give the class different concepts associated with programming. Programming is a broad topic involving the different languages used to write programs. Several terms were used in the presentation. A compiler is a program that converts source code into machine language. Software is a collection of programs that accomplish a certain goal. A programming language is an artificial language used to write a sequence of instructions that can be run on a computer. Machine language is the representation of a program that is actually read and interpreted by the computer. A decompiler is a program that translates from a low-level language to a higher-level one.
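To make the compiler idea concrete, here is a minimal sketch of my own, not a real compiler: a tiny arithmetic "source language" is translated into instructions for a made-up stack machine and then executed, showing the path from source code to machine-like code. The instruction names and the example expression are invented for the illustration.

# A minimal sketch (illustrative only): compile arithmetic into a toy stack machine.
import ast

def compile_expr(source):
    """Compile an arithmetic expression into stack-machine instructions."""
    ops = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL", ast.Div: "DIV"}

    def emit(node):
        if isinstance(node, ast.Constant):
            return [("PUSH", node.value)]
        if isinstance(node, ast.BinOp):
            return emit(node.left) + emit(node.right) + [(ops[type(node.op)], None)]
        raise ValueError("unsupported syntax")

    return emit(ast.parse(source, mode="eval").body)

def run(program):
    """Interpret the stack-machine instructions produced by compile_expr."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"ADD": a + b, "SUB": a - b, "MUL": a * b, "DIV": a / b}[op])
    return stack.pop()

code = compile_expr("2 + 3 * 4")
print(code)        # [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), ('MUL', None), ('ADD', None)]
print(run(code))   # 14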
Computer programming languages are divided into five types, which are:
Machine language is directly understood by the machine and is the lowest level of programming language (0s and 1s). The second is assembly language, which needs a translator or converter to be understood by the machine. Third are the high-level languages, which somewhat resemble human language, can be used on more than one kind of machine, and are interactive: the user and the computer communicate with each other directly while writing and running the program; such languages include PASCAL, C/C++, COBOL, FORTRAN, Python, Java etc. Fourth are the very-high-level languages, which allow users to specify the desired results without having to specify the detailed procedures needed for achieving them, such as MySQL, Oracle and Visual Basic. Fifth are natural-language programming languages (NLP), whose development steps are being worked on to make computer programming more interactive for humans.
Selection of a language greatly depends upon the nature of the activity and the availability of the simplest language to solve the preferred task or problem. There is no single best language; the choice may depend on many things, such as the type of program, the reason the program is built, the size of the program, the programmer's familiarity, ease of program verification etc. For instance, Visual Basic (1990s) is used for developing Microsoft Windows applications (writing interfaces), COBOL (1950s) for commercial applications and data manipulation, FORTRAN (FORmula TRANslation, 1950s) for doing complex mathematics in scientific and engineering applications, C/C++ for writing and developing operating systems, and Java (1995) for World Wide Web applications and development.
The importance of studying computer programming languages:
It is important for students in all disciplines of computer science
To improve your ability to develop effective algorithms
To increase your vocabulary of useful programming constructs
To make it easier to learn a new language
To make it easier to design a new language
Software development, interface writing etc.
The challenges of programming languages are as follows:
Current programming languages are challenging to learn.
New languages keep being invented.
There is a need for skilled programmers.
Debugging is difficult.
Programming and compilers is an excellent research field to be aware of, because it helps the user to understand the numerous things running on a device.



References
o Principles of Programming Languages, Bruce J. MacLennan
o Programming Languages Concepts and Constructs, Ravi Sethi
o Programming Languages Concepts, Carlo Ghezzi, Mehdi Jazayeri

EDUCATIONAL TECHNOLOGY

This topic aimed at helping the students to understand how technology can ease the methods of teaching and learning. The lesson was presented by Bahati Sanga in collaboration with Austin Jofrey. Educational technology is complex insofar as it involves people, procedures, ideas, devices and organizations. Educational technology is the knowledge about teaching, learning and the conditions of learning used to improve the efficiency of teaching and training.
Educational technology is a way of identifying, solving and facilitating the learning process using the available resources/tools. It facilitates learning by controlling the environment, media and methods, by analyzing the characteristics of the learner and organizing the content logically. The main goal of educational technology is to teach using techniques which will bring about behaviour change toward the knowledge gained. There is another term similar to educational technology, namely technological education, which is the situation whereby education is used to construct things for education, like teaching tools; sometimes these two terms are used interchangeably.
The evaluation of the learners' performance follows different paradigms, which focus on the input, the process, the output, the nature of the learners and the nature of the content (subject matter) to judge whether they have achieved the educational objectives.
The benefits of educational technology are many, but some of them are:
It gives a structure which makes it easy to measure improvements in outcomes.
Easy access to course materials, e.g. the use of different materials from the internet has helped the students.
It widens participation through the internet.
It improves students' writing and language structures, e.g. automatic correction of words.
Subjects are easier to learn through a variety of educational software, e.g. computer simulators and graphics.
It motivates the students.
There is also some criticism put forward by those who challenge this system of learning, asking whether the technology can replace the work of the teacher; it turns out the work of a teacher cannot be replaced by any technology, because technology lacks the power to motivate. More challenges are as follows:
· It involves special training, e.g. knowledge of using devices like computers, for both trainers and students.
· Without proper training, teachers and students cannot benefit from the devices that would improve the quality of education.
· Costs are high: investing in and running these devices needs funds.
· It needs proper administration of these devices.
Changing a system always faces a lot of problems and challenges, but in order to improve educational performance we must keep pace with new technology. The evaluation of the learning environment, the learners' behaviour and the nature of the subject matter will not be helpful if the advancements of technology are not connected to the learners.

Reference.
Wicklein, R. C., & Schell, J. W. (1995). Case studies of multidisciplinary approaches to integrating mathematics

Zeisset, C. (1989). Many ways to cut a pie. Bulletin of Psychological Type

Zuga, K. F. (1989). Relating technology education goals

PARALLEL COMPUTING

The topic of parallel computing was presented in order to give the general concept of how computers (CPUs) organize their activities. Abdala Sasya presented the topic in collaboration with Mary Mwakisisile. The presentation was good; it gave different concepts to the class. Traditionally, the software run on a computer was written in a serial way, which made it run on a single computer with a single CPU.
The CPU is the integrated electronic device that interprets instructions for the computer, performs the logical operations, and causes the input/output operations to occur. Under serial computation, a problem is broken into a discrete series of instructions which are executed one after another, one at a time.
The mathematician von Neumann raised the idea behind parallel computing when he asked himself: if a single CPU can solve a problem in ten seconds, could ten CPUs solve the same problem in one second? Getting the answer to that question took a long time, but it explains parallel computing as the form of computation in which many tasks are carried out simultaneously; software is then programmed for parallel computation in order to be run using multiple CPUs. Under parallel computing, a problem is broken into discrete parts that are solved at the same time.
These are the forms of parallel computing: bit-level parallelism concerns the amount of information the processor can process at once; instruction-level parallelism concerns how many operations in one computer program can be performed concurrently; data parallelism is the distribution of data across different computing nodes; and task parallelism is when different calculations are performed on the same or on different sets of data. These forms of parallel computing can be used in electrical engineering, e.g. circuit design; computer science, e.g. mathematical manipulation; and applied physics, e.g. nuclear simulations.
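As a small illustration of the data parallelism just described, and of the "ten CPUs" question above, the sketch below is my own example with made-up numbers: one CPU-heavy counting problem is broken into four parts that worker processes solve at the same time.

# A minimal sketch of data parallelism (illustrative only).
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [start, stop) -- a deliberately CPU-heavy task."""
    start, stop = bounds
    count = 0
    for n in range(max(start, 2), stop):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Break one big problem into 4 discrete parts solved at the same time.
    chunks = [(i, i + 50_000) for i in range(0, 200_000, 50_000)]
    with Pool(processes=4) as pool:
        partial_counts = pool.map(count_primes, chunks)
    print(sum(partial_counts))   # same answer as the serial version, obtained sooner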
The importance of parallel computing includes:
o Parallel computing saves time, in the sense that concurrent execution of the problem limits the time taken.
o It solves even large problems which are complex.
o Data sharing is also possible, due to the fact that one machine may be doing normal activities while transferring other files.
Challenges of parallel computing:
The systems are very expensive, due to the fact that mounting many chips needs good technology.
It increases traffic congestion on the shared memory (CPU path).
There must be a programmer to ensure correct access to global memory.
Parallel computing is a great idea for computer experts, as it saves time in executing any problem, as well as allowing information sharing and the solving of large problems.



References:
http://en.wikipedia.org/wiki/Computing
Leadbetter, Chris (2004) Computer Studies and Information Technology. Cambridge University Press.

SOFTWARE ENGINEERING

This topic aimed at exposing the students to the general concepts of software development and the rules regulating the use of software. Nelson Shoo, in collaboration with Christin Obed, presented the lesson. The presenters brought out the real concepts of how software engineering started and developed into what it is now. Software engineering can be explained as the systematic approach to the design, construction, development and maintenance of computer programs.
The historical background of software engineering goes as far back as the 1950s, when programming languages such as Fortran, Algol and COBOL started to appear in the world of computing. In the late years of 1963-1968 the software crisis emerged, and in 1968-1970 this made the birth of software engineering as a profession necessary; the term appeared at the NATO software conference held in 1968 to address the software crisis.
In the 1960s, software was sold together with the computer, so software engineers were not paid at all for the hard work that they did, and they were discouraged by this. Consequently, most of them stopped developing software, which contributed to the software crisis. The establishment of software engineering as a profession was meant to overcome the crisis that was invading the world of computing. It also aimed at creating software of high quality that is cheaper and maintainable, and delivering it on time.
Software can be classified into four different categories, which are:
Retail software is sold off the shelves of retail stores.
OEM (Original Equipment Manufacturer) software refers to software sold in bulk to resellers, designed to be bundled with hardware, e.g. by Microsoft.
Shareware is software which is downloaded from the internet, but after a certain period of time the customer has to purchase it.
Freeware is software which is downloaded from the internet free for personal use, while commercial use requires a paid licence.
Modern development of software is done as a joint venture of different groups of professionals in active collaboration with the customers, who know their own needs (what they need the software to do for them).
The methodologies of software engineering are:
Object-Oriented Programming (OOP) is a programming style that uses data structures (objects) to design applications and computer programs.
Rapid Application Development (RAD) refers to a type of software development life cycle which uses minimal planning in favour of rapid prototyping.
Scrum (an all-at-once approach to software engineering) is a methodology in which different people with experience work together to manage complex work, such as new product development.
Team Software Process is a defined operational process framework that is designed to help teams of managers and engineers organize and produce large-scale software projects.
The challenges which face software engineering are:
¨ The heterogeneity challenge: every day the needs become more diverse; people demand different things for which the engineers have not yet built software.
¨ The delivery challenge: the time agreed between the customers and the engineers; the engineers often do not meet the deadline, meaning delivery takes too long.
¨ The trust challenge: sometimes the customers do not understand what is possible and what is impossible with software at that particular time.
¨ The legacy system challenge: old but still valuable software systems must be kept running and updated without excessive cost to the customer.
The criticisms of software engineering are:
o Failure to manage expectations.
o Poor requirements from customers.
o Rising complexity of requirements and user expectations.
o Ongoing change of technology.
o Ongoing failure of technology.
o Failure to pinpoint the causes of problems.
o No theorems about people and projects.

As far as software engineering is concerned, it has a green future (job opportunities); we have to work hard so that we meet people's demands, and behave ethically in a responsible way.



References
Ian Sommerville (2000) Software Engineering 6th Edition, chapter 1

www.freetechbooks.com/software-engineering-methodology-the-watersluice

PROGRAM / SOFTWARE VISUALIZATION

The presentation aimed at equipping the students with prior knowledge of how programs appear on different platforms. Venance Luhemeja, in collaboration with Rachel Myinga, did the presentation. Visualization is the phenomenon of making things visible or observable to the mind or imagination, while a program is a set of executable instructions that solves a difficult problem. Software visualization, then, consists of producing animated views of program executions.
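As a small illustration of "views of program executions", here is a sketch of my own: it produces a simple textual trace of a program's function calls using Python's sys.settrace hook. Real visualization tools, such as the Jeliot console mentioned later in this post, render such traces graphically; the factorial function traced here is just an example.

# A minimal sketch (illustrative only): a textual "view" of an execution trace.
import sys

depth = 0

def tracer(frame, event, arg):
    """Print an indented line for every function call and return."""
    global depth
    if event == "call":
        print("  " * depth + "-> " + frame.f_code.co_name)
        depth += 1
    elif event == "return":
        depth -= 1
        print("  " * depth + "<- " + frame.f_code.co_name)
    return tracer

def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

sys.settrace(tracer)
factorial(3)          # the execution being visualized
sys.settrace(None)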
Visualization can be classified into two categories. Visualization of a single component examines its source code and quality defects during software development and maintenance activities. The second is visualization of the whole system, which investigates the architecture or applies visual analytics techniques for defect discovery. The aims of categorizing visualization are:
¨ Achieving the goals of systematic creation of visual representations
¨ Binding data to representations that can be perceived, e.g. visual, auditory or tactile
¨ Specification of user explanations

The following are the benefits of software visualization as found by software programmers:
¨ It improves the performance of different programs.
¨ What is made visible is easier to comprehend.
¨ It helps in the conversion of data into graphs or graphical representations.
¨ It helps the programmers to understand program behaviour and code better.
There are some disadvantages associated with program visualization; some of them are enumerated below:
¨ Difficulty in getting the necessary data for visualization.
¨ The limited screen space for visualization of the whole code.
¨ Unfavourable conditions or circumstances, e.g. the runtime environment.
¨ The aspect of program behaviour to be visualized must first be identified.
Visualization is highly challenged by the situation under which it works, for example when a console like Jeliot has to interpret the program. More challenges are as follows:
¨ Handling real-world problems: using only a few lines of code does not help in solving real-world problems.
¨ Security of the program/software.
¨ Network latency.
¨ Designing and specifying.
Visualization is important for our future professional careers; let us work hard on different visualization programs through which all these problems can be handled well. Visualization accompanied by program writing offers great job opportunities in the long run.

DATA MINING

The presentation was conducted jointly by Joshua Shendu and Remy Kaaro. The presentation was well planned and presented by both presenters, because it was well understood and interesting. The lesson aimed at giving awareness to all students of how they can get and give varieties of information to other people. Data mining is the process of extracting hidden patterns from large amounts of data. As the words themselves suggest, by analogy with mining minerals, people have been manually extracting information from data for centuries, but the data keep growing, hence the need for modern methods of extracting this information, known as automatic approaches.
Data mining can be categorized into several forms. Relational database and social network data mining involves mining data that has clear relationships in a database. Text mining involves finding written information in different document collections. Audio mining involves finding information that is in sound form. Video data mining finds data in animated or moving pictures. Image data mining is the finding of information extracted from images. Web data mining is the search for information found in web pages.
A very important part was the analysis of the three stages of data mining. The first is exploration of the information; this stage starts with data preparation, which also involves cleaning of data and data transformation. The second stage is model building and validation; this is the very important part where the data user gets confirmation of the validity of the information and also chooses the best model based on its predictive performance. Third is deployment; this final stage involves using the model selected as best in the previous stage and applying it to new data in order to generate predictions or estimates of the expected outcomes.
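The three stages can be illustrated with a small sketch of my own, assuming the scikit-learn library is available; the tiny table of customer records and the choice of a decision tree are made up for the example.

# A minimal sketch of the three stages: preparation, model building/validation, deployment.
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Stage 1: exploration and data preparation (clean/transform raw records).
raw = [("34", "12000", "yes"), ("51", "43000", "no"), ("22", "9000", "yes"),
       ("45", "38000", "no"), ("29", "15000", "yes"), ("60", "52000", "no"),
       ("41", "30000", "no"), ("25", "11000", "yes")]
X = [[int(age), int(income)] for age, income, _ in raw]     # transform text to numbers
y = [1 if label == "yes" else 0 for _, _, label in raw]

# Stage 2: model building and validation on held-out data.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = DecisionTreeClassifier().fit(X_train, y_train)
print("validation accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Stage 3: deployment -- apply the chosen model to new, unseen data.
print("prediction for a new customer:", model.predict([[30, 14000]]))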
Moreover, the presentation covered the advantages of data mining, which include tracking the different achievements reached by integrated projects. Banking systems are among the biggest stakeholders of data mining, using it to study the nature and types of customers they have in different societies. Law enforcement organs also use data mining for security purposes, e.g. the police, the military and the like. Researchers also use the data found in such systems and add more information to the inventory.
Furthermore, they presented the disadvantages of data mining, analysing security issues where some information can be leaked to sabotage groups such as criminal groups, and the misuse of information: some of the information is private, and exposing it does not leave some people in a comfortable situation.

FINGERPRINT RECOGNITION

Fingerprint recognition aims at equipping the listeners with the knowledge of identifying a person by the impression left somewhere, rather than by physical appearance, voice or other sensory data. Biometrics is the unique recognition of humans based upon one or more intrinsic physical or behavioural traits, including the face, iris and retina scanning, voice identification and others. The fingerprint is one of the most convenient and foolproof of these.
Fingerprint recognition refers to the automated method of verifying a match between two or more human fingerprints; a fingerprint is the impression left upon any surface with which the finger comes into contact under pressure. The use of fingerprints goes back as far as 1880, when Dr. Henry Faulds published the first scientific account of the use of fingerprints as a means of identification, and later, in the 1960s, work began on developing automatic fingerprint technology. In 1969 the FBI developed a system to automate its fingerprint identification process, and the automation of fingerprints has continued to improve until the present time.
Fingerprints are used because they are unique to every individual: no two people have identical fingerprints. Not only that, but the fingerprint pattern of anyone remains unchanged for life. Moreover, the limited variety of patterns allows systematic classification of an individual's fingerprints.
Fingerprint patterns are classified as follows: first, whorls, where the pattern starts in the middle and keeps getting bigger; secondly, arches, whose shape starts on one side and ends on the other side like a hill; thirdly, loops, which start on one side, go around and end on the same side.
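As a toy illustration of the automated verification mentioned above (not a real fingerprint matcher), the sketch below compares two made-up sets of minutiae points, the ridge endings and bifurcations often used in matching, and declares a match if enough points line up within a small tolerance. The coordinates and thresholds are invented.

# A toy sketch (illustrative only) of comparing two fingerprints by minutiae points.
def matched_minutiae(print_a, print_b, tolerance=3.0):
    """Count points in print_a that have a nearby counterpart in print_b."""
    count = 0
    for (xa, ya) in print_a:
        if any((xa - xb) ** 2 + (ya - yb) ** 2 <= tolerance ** 2
               for (xb, yb) in print_b):
            count += 1
    return count

enrolled = [(10, 12), (25, 40), (33, 18), (47, 52), (60, 30)]
candidate = [(11, 12), (24, 41), (34, 17), (80, 80), (59, 31)]

score = matched_minutiae(enrolled, candidate)
print("match" if score >= 4 else "no match", score)   # threshold chosen arbitrarily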
The advantages of fingerprints are as follows:
They minimize the chance of fraud, as every individual person has unique fingerprints.
They can be applied to modern computers, cars and automatic doors.
Fingerprints cannot be lost in any situation.
Fingerprints naturally cannot be changed.
Their reliability and stability are higher compared to the iris, voice and face recognition methods.
The equipment for identifying fingerprints is relatively low-priced compared to the others.
Disadvantages of fingerprints are as follows:
Some criminals burn their fingers with acids.
The time taken to take fingerprints and identify them is too long.
Sometimes the new technology makes errors; do not blindly trust new technology.
Challenges limit the full use of the system; the challenges faced by fingerprint technology are:
Fingerprints stored in a database can be easily obtained by hackers.
The process of storing fingerprints weakens security.
Due to technical problems, some sensors do not read fingerprint images properly.
In some cases, cruel criminals destroy their fingers.

Fingerprints are useful for identifying someone uniquely, even more effectively than DNA analysis, because DNA analysis fails to distinguish between identical twins, whereas with fingerprints two identical twins can be told apart.



Reference:
Davide Maltoni, Anil K. Jain, Dario Maio, Salil Prabhakar (2002), Handbook of Fingerprint Recognition, Springer.

Henry, Edward R., Sir (1900) Classification and Uses of Finger Prints. London: George Routledge & Sons, Ltd.

Wednesday, 22 April 2009


IMAGE COMPRESSION
The phenomenon of image compression mostly concerns digital images, which contain many bits; image compression is the solution to that problem. Data compression is the method of reducing data files to a smaller size. Data files are those files which contain movies, images, text etc., while image files are those which contain images only. The presentation was very good, in that everything presented was professionally valid.
Image compression is very important in the new digital technology, since image transfer over the network becomes faster, space is saved in storage devices, and time is saved in uploading and downloading. Images come in different formats, such as TIFF, JPEG, GIF, PNG and BMP, all of which can be compressed to some extent. Mainly there are two types of image compression: lossless and lossy. Lossless is the type of image compression in which the quality of the image is not lost, e.g. GIF (Graphics Interchange Format), while lossy is the type in which the image loses quality when compressed, e.g. JPEG (Joint Photographic Experts Group).
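The two types can be illustrated with a small sketch of my own on a tiny made-up 8-bit grayscale "image": lossless compression (here with zlib) gets the exact pixels back, while a crude lossy step (coarser quantization, standing in for what formats like JPEG do more cleverly) shrinks the data further but cannot restore the original values exactly.

# A minimal sketch (illustrative only) of lossless vs. lossy compression.
import zlib

pixels = bytes([30, 30, 31, 32, 200, 201, 201, 199] * 64)   # 512 fake pixel values

# Lossless: compressed, then decompressed back to identical data.
packed = zlib.compress(pixels)
assert zlib.decompress(packed) == pixels
print("lossless:", len(pixels), "->", len(packed), "bytes")

# Lossy: throw away low-order detail (quantize to steps of 16), then compress.
quantized = bytes((p // 16) * 16 for p in pixels)
packed_lossy = zlib.compress(quantized)
print("lossy:   ", len(pixels), "->", len(packed_lossy), "bytes")
print("quality lost:", quantized != pixels)   # True: the original cannot be recovered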
The advantages of image compression depend on how it is used; some of them are:
Ø It reduces data storage requirements.
Ø It reduces the time for images to download and upload.
Ø E-mail attachments are handled more easily if they are well compressed.
Ø Images, video and sound clips attached to web pages should be small in size to serve their purpose.
Ø There are special websites for photo sharing.

Disadvantages of image compression:
- It reduces the reliability of image records.
- Reduction of information or bits.
- It is time consuming during compression.
- Disruption of data properties.

Compression of an image goes from the original quality toward poorer quality, so once the image is compressed to a certain poor quality and saved, it cannot be restored to the original quality. For the sake of not losing the original quality, do not overwrite the original file if you think you still need to keep compressing the image.

Saturday, 11 April 2009


AUTOMATIC ESSAY ASSESSMENT.
This is the learning report on a research field in computer science which focuses on automatic essay assessment. The central idea of the topic was to introduce the idea of automatic essay assessment to all people who deal with computer literacy. Automatic essay assessment is the use of an electronic machine to evaluate essays. It started as long ago as 1966, when researchers conducted a project called Project Essay Grader (PEG). The researchers implemented software which measured the total number of words and the length of the essay.
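A toy sketch of my own, in the spirit of that early PEG approach: score an essay from crude surface features such as the word count and the average word and sentence length. The weights and the grading scale here are made up for illustration and are not PEG's real parameters.

# A toy sketch (illustrative only) of surface-feature essay scoring.
import re

def surface_features(essay):
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return {
        "num_words": len(words),
        "avg_word_len": sum(map(len, words)) / max(len(words), 1),
        "avg_sentence_len": len(words) / max(len(sentences), 1),
    }

def score(essay):
    f = surface_features(essay)
    raw = 0.05 * f["num_words"] + 2.0 * f["avg_word_len"] + 0.5 * f["avg_sentence_len"]
    return min(round(raw), 100)          # clamp to a 0-100 grading scale

essay = ("Computer graphics is a complicated field. It is important to medicine, "
         "engineering and film production because models can be built and rendered "
         "far more cheaply than physical prototypes.")
print(surface_features(essay))
print("score:", score(essay))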
The aim of automatic essay assessment is, first, to prevent plagiarism by students: in some cases, when students are given an assignment they copy from the internet or from books, and it is difficult to detect this manually, but by using this type of software it is easier for the teacher to find the problem. Secondly, it makes the marking of essays more reliable; sometimes a teacher's own ideas bias the marking of students through different criteria, e.g. personal relationships, but by using programmed software the teacher has no chance to exercise favouritism. Thirdly, automatic essay assessment provides individual and detailed feedback to the students which is relevant to what they have done in their assignment, so the learning outcomes can be evaluated and the grades well determined.
The advantages of automatic essay assessment can be categorized into two parts: for teachers and for students.
To teachers
Automatic essay assessment saves time in the marking process; when teachers mark manually they encounter many problems, to the point that their work is not completed on time. The time allocated for general assessment in the institution can be met by the use of automatic essay assessment. Not only that, but it also helps in producing a sound final grade matched to the objectives set. Moreover, it keeps the teacher consistent and avoids bias toward his/her students.
To the students
Automatic essay assessment challenges the students to think critically, since their essays will be assessed by software, so they have to write essays which meet the criteria set by their instructors. The feedback they receive is real and relevant to what they did in their essays. More than that, it looks at the qualities of each student's work without comparing it with others, because each essay is assessed as it is, and so it reflects the learning process of that particular student.
Problems
Wherever the good comes, bad things also come. For instance, the cost of building software that can identify those criteria is so high that schools cannot afford to purchase it. Whereas teachers who read many essays gain many ideas, using software to assess the essays may lead to a reduction in the thinking capacity of teachers. Not only that, but highly qualified information technology people are also needed to set up and run these machines.
Challenges
In most cases Tanzanian students at lower levels do not know how to use a computer; they simply write on hard copy, which cannot be read by the computer, so we appeal to those who can make software which can assess even an essay submitted in hard copy. Also, the machine should assess each and every thing which appears in the essay and give back the whole essay with the corrections.

Friday, 10 April 2009

SPEAKER RECOGNITION


The presentation aimed at giving a general view of how human beings are able to recognize someone's voice by hearing, and correspondingly how a machine can distinguish two different voices. Speaker recognition is the process of automatically recognizing who is speaking based on a recorded voice. The lesson was very pleasant, as the clarification was quite easy to understand. Speaker recognition is of two kinds: speaker verification and speaker identification.
The system works on the principle of pitch variation: the pitch of the sound, which is subject to variation, is recorded, bearing in mind that the sound arrives as an analogue signal. The interesting issue is how the analogue speech is converted into a digital signal, and then how the current speech is compared with the previously stored speech.
Speaker recognition is used for security purposes: the user makes several trial recordings of their voice, and the machine takes the average of the sound pitch and stores it. Once the user comes back and wants to use the machine, they repeat the same words, which are recorded, and the machine compares the stored and the new sounds. Speaker recognition is also used to control access to restricted services; more than that, it may be used in military operations to fire on or bomb some areas by commanding the weapons using merely the word "shoot".
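The enrol-then-verify idea described above can be sketched very crudely (this is my own toy example, not a real speaker recognizer): store the average pitch from several trial recordings, then accept a new recording if its average pitch is close enough. The pitch values in hertz and the tolerance are made-up numbers.

# A toy sketch (illustrative only) of enrolment and verification by average pitch.
def enrol(trial_pitches):
    """Average the pitch measured over several enrolment recordings."""
    return sum(trial_pitches) / len(trial_pitches)

def verify(stored_average, new_pitch, tolerance=10.0):
    """Accept the speaker if the new recording's pitch is within tolerance (Hz)."""
    return abs(new_pitch - stored_average) <= tolerance

template = enrol([118.0, 122.5, 120.3])    # enrolment: three trial recordings
print(verify(template, 119.0))             # True: same speaker, slight variation
print(verify(template, 165.0))             # False: e.g. a different speaker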
There are some challenges of speaker recognition. One issue encountered in speech recognition is the exact unit to be recognized by the machine; in finding the boundaries of these units in the signal, it is difficult to determine what the real voice of the user is. Another problem is the variation of the voice due to some unavoidable circumstances, e.g. with influenza the voice varies and the machine denies service to the real user.
To minimize speaker recognition problems, the following may be done:
Ø Use the exact words required
Ø Maintain the voice pitch
Ø Be aware of the environmental conditions
Reference:
M. R. Schroeder (1985), Speech and Speaker Recognition.
http://www.speech.cs.cmu.edu/comp.speech/Section6/Q6.6.html

Thursday, 2 April 2009

CRYPTOGRAPHY
The lesson aimed at helping the students to understand the methods used to hide texts from being understood by other people who are not meant to see them. The encryption of text is an old practice: it started as long ago as the great Roman Empire, though the technique was first used by the Egyptians. The subject was interesting, since the same techniques are used today in computers and computer networks to encrypt and decrypt text which must not be leaked to unwanted people, only now this is done digitally. The key thing to examine when we want to hide our text is the type of keys we want to use in cryptography. The types of keys to be used are either private or public, both of which are used universally. Private (secret) key cryptography uses a single key for both encryption and decryption, while public key cryptography uses one key for encryption and a different, paired key for decryption.
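As a small illustration of the secret (private) key idea, here is a sketch of my own using the classical Caesar shift associated with the Roman era mentioned above: the same key is used to encrypt and to decrypt, which is exactly the single-key principle (this old cipher is of course far too weak for modern use).

# A minimal sketch (illustrative only) of single-key encryption: the Caesar shift.
def caesar(text, key):
    """Shift every letter forward by `key` positions (use -key to decrypt)."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            result.append(ch)
    return "".join(result)

key = 3                                     # the shared secret key
ciphertext = caesar("attack at dawn", key)  # encryption
print(ciphertext)                           # 'dwwdfn dw gdzq'
print(caesar(ciphertext, -key))             # decryption with the same key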
Cryptography is in most cases used by governments for securing their information (data); banking systems use cryptography to secure customers' financial information such as account numbers, ATM cards, Visa cards etc., as do intelligence agencies, e.g. the FBI, the Red Scorpions and others. Moreover, cryptography is used by the army to conceal security information. The most remarkable use of cryptography is in credit card companies and personal e-mails, where users are assigned a special secret number which they use in encrypting and decrypting their data.
Cryptography has been subjected to many challenges, which can lead to great loss of information. One of the worst problems with cryptography is that the data may be exposed to crackers who can steal it and use it in sabotage. Furthermore, digital cryptography needs skilled personnel who have to study hard in school, hence the cost of educating someone to become a professional software engineer who can support the whole system. The technology itself is also of high cost, because the devices deployed are expensive.
References
http://www.wikipedia.com
Alexander, M. D. (1993), Protecting Data With Secret Code, Infosecurity News.

Tuesday, 10 March 2009

"Rising fields" is the name given to the blog created by me (Innocent Kihaka). I have published the named blog for sack of posting all my ideas concerning all issues which we would learn in a period of one week. Now why rising fields? It is rising field because it is my first time to host a learning blog in the field of computer science.