Saturday, February 12, 2011

Artificial Intelligence

Artificial Intelligence (AI) is the branch of computer science that aims to create intelligent machines. John McCarthy defined it as "the science and engineering of making intelligent machines."
Today, AI has become an essential part of the technology industry, providing the heavy lifting for many of the most difficult problems in computer science. AI research is highly technical and specialized, and deeply divided into sub-fields that often fail to communicate with each other. Sub-fields have grown up around particular institutions, the work of individual researchers, and the solution of specific problems. The central problems of AI include such traits as reasoning, knowledge, planning, learning, communication, perception, and the ability to move and manipulate objects. General intelligence is still a long-term goal of (some) research.
Branches:
Some of the branches of AI are listed as follows:
Logical AI:
What a program knows about the world in general, the details of the specific situation in which it must act, and its goals are all represented by sentences of some mathematical logical language. The program decides what to do by inferring that certain actions are appropriate for achieving its goals.
Search:
AI programs often examine large numbers of possibilities, e.g. moves in a chess game or inferences by a theorem-proving program. Discoveries are continually made about how to do this more efficiently in various domains.
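As an illustrative sketch (not part of the original notes), breadth-first search over a toy state space shows how a program can examine many possibilities systematically; the graph and state names here are invented for the example:

```python
from collections import deque

def bfs(start, goal, neighbors):
    """Breadth-first search: explore states level by level and
    return the first (shortest) path from start to goal."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in neighbors(path[-1]):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None  # goal unreachable

# A toy state space: each state lists the states reachable from it.
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"], "E": []}
print(bfs("A", "E", graph.__getitem__))  # → ['A', 'B', 'D', 'E']
```

Real AI search problems differ mainly in scale: the state space is usually far too large to enumerate, which is why efficiency discoveries matter.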
Representation:
Facts about the world have to be represented in some way. Usually languages of mathematical logic are used.
Inference:
From some facts, others can be inferred. Mathematical logical deduction is adequate for some purposes, but new methods of non-monotonic inference have been added to logic since the 1970s. The simplest kind of non-monotonic reasoning is default reasoning, in which a conclusion is inferred by default but can be withdrawn if there is evidence to the contrary. For example, when we hear of a bird, we may infer that it can fly, but this conclusion can be reversed when we hear that it is a penguin. It is the possibility that a conclusion may have to be withdrawn that constitutes the non-monotonic character of the reasoning. Ordinary logical reasoning is monotonic in that the set of conclusions that can be drawn from a set of premises is a monotonically increasing function of the premises.
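A minimal sketch of default reasoning, using the bird/penguin example above; the `can_fly` function and its fact sets are invented for illustration:

```python
def can_fly(facts):
    """Default reasoning sketch: conclude 'flies' by default for a bird,
    but withdraw the conclusion if contrary evidence (penguin) appears."""
    if "bird" not in facts:
        return None          # no basis for any conclusion
    if "penguin" in facts:
        return False         # contrary evidence defeats the default
    return True              # default conclusion: birds fly

print(can_fly({"bird"}))             # True  (inferred by default)
print(can_fly({"bird", "penguin"}))  # False (conclusion withdrawn)
```

Note the non-monotonicity: adding the premise "penguin" removes a conclusion that the smaller premise set supported, which ordinary monotonic logic never does.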
Pattern recognition:
When a program makes observations of some kind, it is often programmed to compare what it sees with a pattern. For example, a vision program may try to match a pattern of eyes and a nose in a scene in order to find a face. More complex patterns, e.g. in a natural language text, in a chess position, or in the history of some event are also studied. These more complex patterns require quite different methods than do the simple patterns that have been studied the most.
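The eye-and-nose idea can be sketched as simple template matching over a binary image; the image, template, and helper names below are invented for the example:

```python
def match_at(image, template, r, c):
    """True if the template matches the image region with top-left (r, c)."""
    return all(image[r + i][c + j] == template[i][j]
               for i in range(len(template))
               for j in range(len(template[0])))

def find_pattern(image, template):
    """Slide the template over the image and return every matching position."""
    rows, cols = len(template), len(template[0])
    return [(r, c)
            for r in range(len(image) - rows + 1)
            for c in range(len(image[0]) - cols + 1)
            if match_at(image, template, r, c)]

# 1s mark dark pixels; the template is a crude "two eyes above a nose".
face = [[1, 0, 1],
        [0, 1, 0]]
image = [[0, 0, 0, 0, 0],
         [0, 1, 0, 1, 0],
         [0, 0, 1, 0, 0],
         [0, 0, 0, 0, 0]]
print(find_pattern(image, face))  # → [(1, 1)]
```

Exact matching like this is the simplest case; real vision systems must tolerate noise, scale, and rotation, which is part of why complex patterns require quite different methods.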

Learning from experiences:
The approaches to AI based on connectionism and neural networks specialize in that. There is also learning of laws expressed in logic. Programs can only learn what facts or behaviors their formalisms can represent, and unfortunately learning systems are almost all based on very limited abilities to represent information.
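A tiny perceptron, trained on the logical AND function, sketches the connectionist style of learning from examples; the training routine and its parameters are illustrative, not from the notes:

```python
def train_perceptron(samples, epochs=10, rate=0.1):
    """Learn weights for a linear threshold unit from labelled examples,
    in the spirit of the connectionist approaches mentioned above."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out                  # 0 when prediction is right
            w[0] += rate * err * x1             # nudge weights toward the target
            w[1] += rate * err * x2
            b += rate * err
    return w, b

# Learn logical AND from its four examples.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(samples)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in samples])  # → [0, 0, 0, 1]
```

The representational limits mentioned above show up even here: a single perceptron can only learn linearly separable functions (AND yes, XOR no).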
Heuristic:
A heuristic is a way of trying to discover something or an idea embedded in a program. The term is used variously in AI. Heuristic functions are used in some approaches to search to measure how far a node in a search tree seems to be from a goal.
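A heuristic function can be sketched as follows: Manhattan distance estimates how far a grid node is from the goal, and the search expands the frontier node that looks closest (the frontier values here are invented for the example):

```python
def manhattan(node, goal):
    """Heuristic: estimated distance from a node to the goal.
    On a grid, the Manhattan distance never overestimates,
    so it is a safe (admissible) guide for search."""
    return abs(node[0] - goal[0]) + abs(node[1] - goal[1])

# The search can then expand the frontier node that looks closest:
frontier = [(0, 0), (2, 1), (3, 3)]
goal = (4, 4)
best = min(frontier, key=lambda n: manhattan(n, goal))
print(best)  # → (3, 3): estimated 2 steps away, versus 8 and 5
```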
Epistemology:
This is a study of the kinds of knowledge that are required for solving problems in the world.
Ontology:
Ontology is the study of the kinds of things that exist. In AI, the programs and sentences deal with various kinds of objects, and we study what these kinds are and what their basic properties are.
Applications:
Some applications of AI are depicted in the following figure:


Playing Games:
You can buy machines that can play master level chess for a few hundred dollars. There is some AI in them, but they play well against people mainly through brute force computation looking at hundreds of thousands of positions. To beat a world champion by brute force and known reliable heuristics requires being able to look at 200 million positions per second.
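The brute-force look-ahead that chess programs perform can be sketched with minimax on a toy game (real programs add alpha-beta pruning and many refinements); the game rules below are invented for illustration:

```python
def minimax(state, depth, maximizing, moves, evaluate):
    """Brute-force game search: try every move to a fixed depth and
    back up the scores, as chess programs do (with many refinements)."""
    options = moves(state)
    if depth == 0 or not options:
        return evaluate(state)
    scores = [minimax(s, depth - 1, not maximizing, moves, evaluate)
              for s in options]
    return max(scores) if maximizing else min(scores)

# Toy game: a state is a number; a move adds 1 or 2; higher final values
# favour the maximizing player, and the game stops once the state reaches 4.
moves = lambda s: [s + 1, s + 2] if s < 4 else []
evaluate = lambda s: s
print(minimax(0, 3, True, moves, evaluate))  # → 4
```

The cost of this style of search grows exponentially with depth, which is why beating a world champion required examining around 200 million positions per second.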
Speech recognition:
Computer speech recognition has reached a practical level for limited purposes. Thus it is possible to instruct some computers using speech, which is more convenient than using a keyboard and mouse.
Understanding natural language:
Just getting a sequence of words into a computer is not enough. Parsing sentences is not enough either. The computer has to be provided with an understanding of the area the text is about, and this is presently possible only for very limited areas.
Computer vision:
The world is composed of three-dimensional objects, but the inputs to the human eye and computers' TV cameras are two dimensional. Some useful programs can work solely in two dimensions, but full computer vision requires partial three-dimensional information that is not just a set of two-dimensional views. At present there are only limited ways of representing three-dimensional information directly, and they are not as good as what humans evidently use.
Expert systems:
A "knowledge engineer" interviews experts in a certain domain and tries to represent their knowledge in a computer program for carrying out some task. How well this works depends on whether the intellectual mechanisms required for the task are within the present state of AI. When this turned out not to be so, there were many disappointing results. One of the first expert systems was MYCIN in 1974, which diagnosed bacterial infections of the blood and suggested treatments. It did better than medical students or practicing doctors, provided its limitations were observed. Namely, its ontology included bacteria, symptoms, and treatments, and did not include patients, doctors, hospitals, death, recovery, and events occurring in time. Since the experts consulted by the knowledge engineers knew about patients, doctors, death, recovery, etc., it is clear that the knowledge engineers forced what the experts told them into a predetermined framework. In the present state of AI, this has to be true. Thus the usefulness of current expert systems depends on their users having common sense.
(


This article is the topic of the 7th unit from the RTMNU MBA 3rd sem IT syllabus notes. Further topics will be covered in upcoming blogs. For more notes you can also refer to the other links given below:


)

Monday, February 7, 2011

Object Technology

Object Technology
Technology that is based on the concept of objects is referred to as object-oriented technology, or object technology. The main concepts of object technology are as follows:
Concept of Object Technology:
Objects: Objects are instances of a class; each object can access the data members and member functions of its class.
Classes: A collection of similar objects is called a class. For example, the class Student is the collection of students sharing some common characteristics.
Data abstraction: Data abstraction refers to the act of representing essential features without including the background details or explanations. Classes use the concept of abstraction and are defined as a list of abstract attributes.
Data encapsulation: The wrapping up of data and functions into a single unit is known as encapsulation. The object-oriented paradigm is based on encapsulation of data and code whose contents are not visible to the outside world. Conceptually, interaction thus occurs through message passing.
Inheritance: Inheritance is the process by which one class acquires the properties of another class. The concept of inheritance provides the idea of reusability.
Polymorphism: Polymorphism, in short, means "same name, many forms". More precisely, polymorphism in object-oriented programming is the ability of objects belonging to different types to respond to calls of a method of the same name, each according to an appropriate type-specific behavior. Polymorphism is achieved using operator overloading and function overloading. For example, the "+" operator can add two numbers and can also be used for concatenation of two strings.
Message passing: The process by which an object sends data to another object or asks the other object to invoke a method or function is referred to as message passing.
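The concepts above can be sketched together in a short program; the Shape/Rectangle/Circle classes are invented for illustration:

```python
class Shape:
    """Base class: common state and behaviour for all shapes."""
    def __init__(self, name):
        self._name = name          # encapsulated state

    def area(self):
        raise NotImplementedError

    def describe(self):
        # Message passing: we "ask" the object for its area; each
        # subclass responds with its own type-specific behaviour.
        return f"{self._name}: area {self.area()}"

class Rectangle(Shape):            # inheritance
    def __init__(self, w, h):
        super().__init__("rectangle")
        self._w, self._h = w, h

    def area(self):                # polymorphism: same name, many forms
        return self._w * self._h

class Circle(Shape):
    def __init__(self, r):
        super().__init__("circle")
        self._r = r

    def area(self):
        return round(3.14159 * self._r ** 2, 2)

for shape in (Rectangle(3, 4), Circle(1)):
    print(shape.describe())
# rectangle: area 12
# circle: area 3.14
```

The same `describe` message produces different behaviour depending on the receiving object's class, which is exactly the "same name, many forms" idea.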
Object oriented languages:
Some object oriented languages are as given below:
Simula 67
Smalltalk
C++
Java
Applications of Object Technology:
Applications of Object Technology are as given below:
  • Real-time systems
  • Simulation and modeling
  • Object-oriented databases
  • Hypertext, hypermedia
  • AI and expert systems
  • Neural networks and parallel programming
  • CAD/CAM systems
Object oriented database:
An object-oriented database is designed to deal with complex data types such as an address. An address consists of different subparts such as street address, city, state, and postal code. Object-oriented databases are based on three specific concepts of object-oriented technology: encapsulation, abstraction, and inheritance.
Approaches to create object oriented database:
There are two approaches for creating an object-oriented database:
  1. Adding the concepts of object orientation to existing database languages.
  2. Extending existing object oriented languages to deal with databases.
Relational database management system (RDBMS):
Relational database management system (RDBMS) is a database management system (DBMS) that is based on the relational model /relational algebra.
RDBMS is the basis for SQL, and for all modern database systems like MS SQL Server, IBM DB2, Oracle, MySQL, and Microsoft Access.
The data in an RDBMS is stored in database objects called tables. A table is the most common and simplest form of data storage in a relational database. A table is a collection of related data entries and consists of columns and rows.
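A minimal sketch using Python's built-in sqlite3 module shows a table of rows and columns queried with SQL; the student table and its data are invented for the example:

```python
import sqlite3

# Create an in-process relational database and one table of rows/columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE student (roll_no INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO student VALUES (?, ?)",
                 [(1, "Asha"), (2, "Ravi")])

# SQL queries operate on the relation (table) declaratively.
rows = conn.execute("SELECT name FROM student ORDER BY roll_no").fetchall()
print(rows)  # → [('Asha',), ('Ravi',)]
conn.close()
```

Each row is one data entry and each column one attribute; the same SQL would run largely unchanged on the larger systems named above.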
Object oriented database management system(OODBMS):
When database capabilities are combined with object-oriented (OO) programming language capabilities, the result is an object database management system (OODBMS).
Information today includes not only data but video, audio, graphs, and photos which are considered complex data types. Relational DBMS aren’t natively capable of supporting these complex data types.
As the usage of web-based technology increases with the implementation of Intranets and extranets, companies have a vested interest in OODBMS to display their complex data.
Thus, OODBMS use exactly the same model as object-oriented programming languages.


(
For more information, refer to the links given below:
http://rtmnupervasivecomp.blogspot.com
http://rtmnuittrends.blogspot.com
http://www.rtmnunetworkingtechnology.blogspot.com
)

Sunday, February 6, 2011

Nagpur University syllabus (Revised) MBA 2nd sem IT

Paper-II: Internet Technologies & Trends

SECTION-A
Unit I: Internet - Working of Search Engines: Yahoo, Google, Dogpile and MetaCrawler, Mailing: Authorization, Working of Rediffmail, Hotmail and Yahoomail, Chatting: IRC Concept, Video Conferencing.

Unit II: IT Trends - Worldwide Computer and Internet Use, Wireless Communication, IT Transforming our Values, Lives and Work, Maturity of IT Related Industries, Nanotechnology, Increasing demand of skilled workers, India’s Future in response to these changes.

Unit III: Data Mining - Concept, Terminology, Functions, Applications, Types (Text, Concept, Graph, Sequence, Tree), Techniques, Software.

Unit IV: Data Warehousing - Concept, History, Storage Methods, Success Parameters, Software Evaluation, Architecture, Developing Strategy, Use in Strategic Decision Making, Maintenance Issues, Web Data Analysis.

Unit V: Knowledge Management - Concept, Need, History, Approaches, Challenges, Supporting Technologies, Related Business Strategies, Chief Knowledge Officer, Emerging Perspectives, Relation to SNA (Social Network Analysis).


SECTION-B

Unit VI: E-Learning - Categories (Library / Bookshop, Showcase, Product & Services, Events), Virtual Classrooms, E-Learning in Education, Government and Telecom, Trends in e-Learning.

Unit VII: e-Governance - Need, Scope and Challenges for e-Governance applications, Success stories from India (ap-it.com), huge value addition by citizen centric e-Governance applications.

Unit VIII: e-Business - Architecture, Digital Marketing Strategy, Digital Productivity, IT Products and Services, Interdependence of Security and the Extended Enterprise, e-Business for SME, Organic Growth.

Unit IX: Evolution of e-Commerce - Historical Development, Success Factors, Working, Market Size, Trends, Strategies: Yahoo, Google, MySpace, eBay, Comparison of e-Commerce Solutions: B2B and B2C, M-Commerce.

Unit X: Role of IT in different verticals - Banking, Financial Service and Insurance (BFSI): TCS, Infosys and Wipro, E-Tailing /Retail: TCS, Telecom: TechMahindra and Telecom Operators: Airtel, Reliance Infocomm, Hutch, BSNL, Idea, Spice. Case studies of important portals: Jobs: Timesjobs, Monster, Naukari, Matrimony: Shadi.com, Auction: eBay, Books: - Amazon, Financial Information: MoneyControl, EasyMF, Media: Indiatimes, Yahoo and Google.


Suggested Readings:
1. Data Mining Techniques: For Marketing, Sales, and Customer Relationship Management by Michael J. A. Berry
2. Michael Allen's E-Learning Library: Creating Successful E-Learning : A Rapid System For Getting It Right First Time, Every Time (Michael Allen's E-Library) by Michael W. Allen
3. Harvard Business Review on Corporate Governance (Harvard Business Review Paperback Series) by Walter J. Salmon, Jay William Lorsch, Gordon Donaldson, and John Pound
4. E-Commerce: Business, Technology, Society (3rd Edition) by Kenneth Laudon and Carol Traver
5. Knowledge Management by Carl Frappaolo


Dear friends,

I have uploaded the 3rd unit from the syllabus. You will find detailed notes in the form of one PDF file and one PowerPoint presentation on unit 3. The link is given below.

Data Mining

You can go through the blogs and tell me if you want more information. For more notes you can refer to the following links:

  1. http://rtmnupervasivecomp.blogspot.com/
  2. http://rtmnuittrends.blogspot.com/
  3. http://www.rtmnunetworkingtechnology.blogspot.com/



Preeti

Decision support System

A Decision Support System (DSS) is a class of information systems that supports business and organizational decision-making activities. A properly designed DSS is an interactive software-based system intended to help decision makers compile useful information from a combination of raw data, documents, personal knowledge, or business models to identify and solve problems and make decisions. Typical information that a decision support application might gather and present includes:
  • an inventory of all of your current information assets including legacy and relational data sources, cubes, data warehouses, and data marts.
  • comparative sales figures between one week and the next.
  • projected revenue figures based on new product sales assumptions.
There are four fundamental components of a DSS:
  1. Inputs: Factors, numbers, and characteristics to analyze
  2. User Knowledge and Expertise: Inputs requiring manual analysis by the user
  3. Outputs: Transformed data from which DSS "decisions" are generated
  4. Decisions: Results generated by the DSS based on user criteria
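The four components can be sketched as a toy pipeline (an invented example, not a real DSS): raw inputs are transformed into outputs, and the user's expertise enters as a decision criterion:

```python
def dss_decide(inputs, user_threshold):
    """Toy DSS: inputs -> outputs -> decisions.
    'inputs' are raw weekly sales figures; the user supplies expertise
    as a growth threshold; the outputs are week-over-week growth rates;
    the decisions flag weeks that need attention."""
    # Outputs: transform raw data into comparative figures.
    growth = [(b - a) / a for a, b in zip(inputs, inputs[1:])]
    # Decisions: apply the user's criterion to the transformed data.
    return ["investigate" if g < user_threshold else "ok" for g in growth]

weekly_sales = [100, 110, 99, 120]
print(dss_decide(weekly_sales, user_threshold=0.0))
# → ['ok', 'investigate', 'ok']
```

The point of the sketch is the division of labour: the system transforms data, but the criterion (here, the threshold) comes from the user's knowledge and expertise.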

Development Frameworks:

DSS technology levels of hardware and software may include:
  1. DSS generators: hardware/software environments that allow people to easily develop specific DSS applications. This level makes use of CASE tools or systems such as Crystal, AIMMS, and iThink.
  2. DSS tools: lower-level hardware and software used to build specific DSS applications and generators, including special languages, function libraries, and linking modules.
An iterative developmental approach allows for the DSS to be changed and redesigned at various intervals. Once the system is designed, it will need to be tested and revised for the desired outcome.

Classification of DSS frameworks:

DSS applications are commonly classified into six frameworks: communication-driven, data-driven, document-driven, knowledge-driven, model-driven, and compound DSS.

A compound DSS is the most popular classification for a DSS. It is a hybrid system that includes two or more of the five basic frameworks.
The support given by DSS can be separated into three distinct, interrelated categories: personal support, group support, and organizational support.

Intelligent Decision Support Systems:
DSSs which perform selected cognitive decision-making functions and are based on artificial intelligence or intelligent-agent technologies are called Intelligent Decision Support Systems (IDSS).
The field of decision engineering treats the decision itself as an engineered object, and applies engineering principles such as design and quality assurance to an explicit representation of the elements that make up a decision.

Benefits of DSS:
Some benefits of DSS are as given below:
  1. Improves personal efficiency
  2. Speeds up the process of problem solving in an organization
  3. Facilitates interpersonal communication
  4. Promotes learning and training activities
  5. Increases organizational control
  6. Generates new evidence in support of a decision
  7. Creates a competitive advantage over the competition
  8. Encourages exploration and discovery on the part of the decision maker
  9. Reveals new approaches to thinking about the problem space
  10. Helps automate managerial processes




Embedded System

Embedded System

An embedded system is a system designed to perform one or a few dedicated functions, often with real-time computing constraints. It is embedded as part of a complete device, often including hardware and mechanical parts. Embedded systems control many of the common devices in use today. Embedded systems are controlled by a main processing core that is typically either a microcontroller or a digital signal processor (DSP).

Since the embedded system is devoted to specific tasks, design engineers can optimize it, reducing the size and cost of the product, or increasing the reliability and performance. Some embedded systems are mass-produced, benefiting from economies of scale.

Physically, embedded systems range from portable devices such as digital watches and MP3 players to large stationary installations like traffic lights, factory controllers, or the systems controlling nuclear power plants. Complexity varies from low, with a single microcontroller chip, to very high, with multiple units, peripherals, and networks mounted inside a large framework or enclosure.

Characteristics:

1. Embedded systems are designed to do some specific task, rather than be a general-purpose computer for multiple tasks, often with real-time computing constraints.

2. Embedded systems are not always standalone devices. Many embedded systems consist of small, computerized parts within a larger device that serves a more general purpose.

3. The program instructions written for embedded systems are referred to as firmware, and are stored in read-only memory or Flash memory chips. They run with limited computer hardware resources such as little memory, small or non-existent keyboard and/or screen.

Tools:

Embedded system designers use compilers, assemblers, and debuggers to develop embedded system software. However, they may also use more specific tools such as emulators. For systems using digital signal processing, developers may use a math workbench such as Scilab/Scicos, MATLAB/Simulink, EICASLAB, MathCad, or Mathematica to simulate the mathematics. They might also use libraries for both the host and target, which eliminates developing DSP routines, as done in DSPnano RTOS and the Unison Operating System. Custom compilers and linkers may be used to improve optimisation for the particular hardware. An embedded system may have its own special language or design tool, or add enhancements to an existing language such as Forth or BASIC. Designers can also add a real-time operating system or embedded operating system, which may have DSP capabilities like DSPnano RTOS.

Software tools can come from several sources:

  • Software companies that specialize in the embedded market
  • Tools ported from the GNU software development toolchain
  • Sometimes, development tools for a personal computer, if the embedded processor is a close relative of a common PC processor

As the complexity of embedded systems grows, higher level tools and operating systems are migrating into machinery where it makes sense. For example, cell phones, personal digital assistants and other consumer computers often need significant software that is purchased or provided by a person other than the manufacturer of the electronics. In these systems, an open programming environment such as Linux, NetBSD, OSGi or Embedded Java is required so that the third-party software provider can sell to a large market.

CPU platforms:

Embedded processors can be broken down into two broad categories: ordinary microprocessors (μP) and microcontrollers (μC), which have many more peripherals on chip, reducing cost and size. In contrast to the personal computer and server markets, a fairly large number of basic CPU architectures are used: there are Von Neumann as well as various degrees of Harvard architectures, RISC as well as non-RISC, and VLIW; word lengths vary from 4-bit to 64-bit and beyond (mainly in DSP processors), although the most typical remain 8/16-bit. Most architectures come in a large number of different variants and shapes, many of which are also manufactured by several different companies.



Databases

Databases
A database is an integrated collection of logically related records or files. Databases can be classified according to types of content: bibliographic, full-text, numeric, and image. The data in a database is organized according to a database model. There are three different database models: the relational model, the hierarchical model, and the network model.
Database models:
1. Relational model: The relational model represents data and the relationships among data as a collection of tables.
2. Hierarchical model: The hierarchical model represents data and the relationships among data by records and links respectively, arranged as a tree.
3. Network model: The network model represents data and the relationships among data in terms of graphs.

Types of Databases:

1. Operational database

Operational databases store detailed data needed to support the operations of the entire organization. They are also called subject-area databases (SADB), transaction databases, or production databases. Examples include:
  • Customer databases
  • Personnel databases
  • Inventory databases

2. Analytical database

Analytical databases store data and information extracted from selected operational and external databases. They consist of summarized data and information most needed by an organization's managers and other end users. They may also be called management databases or information databases.

3. Data warehouse

A data warehouse stores data from current and previous years that has been extracted from the various operational databases of an organization. It is the central source of data that has been screened, edited, standardized and integrated so that it can be used by managers and other end user professionals throughout an organization.

4. Distributed database

Distributed databases are databases of local work groups and departments at regional offices, branch offices, manufacturing plants and other work sites. These databases can include segments of both common operational and common user databases, as well as data generated and used only at a user’s own site.

5. End-user database

End-user databases consist of a variety of data files developed by end users at their workstations. Examples are collections of documents in spreadsheets, word processing files, and even downloaded files.

6. External database

External databases are privately owned online databases or data banks, access to which is available for a fee to end users and organizations from commercial services. A wealth of information from external databases is available for a fee from commercial online services, and with or without charge from many sources on the internet.

7. Hypermedia database

Hypermedia databases are sets of interconnected multimedia pages at a website. They consist of a home page and other hyperlinked pages of multimedia or mixed media such as text, graphics, photographic images, video clips, and audio.

8. Navigational database

Navigational databases are characterized by the fact that objects in it are found primarily by following references from other objects. Traditionally navigational interfaces are procedural, though one could characterize some modern systems like XPath as being simultaneously navigational and declarative.
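A navigational lookup can be sketched with plain objects holding direct references to one another; the record names are invented for the example:

```python
class Record:
    """Navigational sketch: each record holds direct references to
    related records, and queries follow those references."""
    def __init__(self, name):
        self.name = name
        self.links = []

# Build a small web of records.
a, b, c = Record("order"), Record("customer"), Record("address")
a.links.append(b)      # order -> customer
b.links.append(c)      # customer -> address

# Navigate: start from the order and follow references to the address.
found = a.links[0].links[0].name
print(found)  # → address
```

Contrast this with the declarative style of SQL, where the same lookup would be expressed as joins rather than as explicit pointer-chasing.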

9. In-memory database

In-memory databases are database management systems that primarily rely on main memory for computer data storage. Main memory databases are faster than disk-optimized databases since the internal optimization algorithms are simpler and execute fewer CPU instructions. Accessing data in memory provides faster and more predictable performance than disk. In applications where response time is critical, such as telecommunications network equipment that operates 9-1-1 emergency systems, main memory databases are often used.

10. Document-oriented database

Document-oriented databases are computer programs designed for document-oriented applications. These systems may be implemented as a layer above a relational database or an object database. As opposed to relational databases, document-based databases do not store data in tables with uniform sized fields for each record. Instead, each record is stored as a document that has certain characteristics. Any number of fields of any length can be added to a document. Fields can also contain multiple pieces of data.
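A document store can be sketched with free-form dictionaries; note that the two documents carry different fields, which a single relational table could not do directly (the store and helper functions are invented for illustration):

```python
# Document-store sketch: each record is a free-form document (a dict),
# and different documents may carry different fields.
store = []

def insert(doc):
    store.append(doc)

def find(field, value):
    """Return every document whose given field holds the given value."""
    return [d for d in store if d.get(field) == value]

insert({"name": "Asha", "skills": ["SQL", "Java"]})
insert({"name": "Ravi", "city": "Nagpur"})          # different fields
print(find("city", "Nagpur"))  # → [{'name': 'Ravi', 'city': 'Nagpur'}]
```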

11. Real-time database

A real-time database is a processing system designed to handle workloads whose state is constantly changing. This differs from traditional databases containing persistent data, mostly unaffected by time. For example, a stock market changes very rapidly and is dynamic. Real-time processing means that a transaction is processed fast enough for the result to come back and be acted on right away. Real-time databases are useful for accounting, banking, law, medical records, multi-media, process control, reservation systems, and scientific data analysis. As computers increase in power and can store more data, they are integrating themselves into our society and are employed in many applications.
Database Management System:
A database management system (DBMS) is software that organizes the storage of data. It controls the creation, maintenance, and use of the database storage structures of an organization and its end users. It allows organizations to place control of organization-wide database development in the hands of Database Administrators (DBAs) and other specialists. In large systems, a DBMS allows users and other software to store and retrieve data in a structured way.

Primary tasks of DBMS:

  • Database Development: It is used to define and organize the content, relationships, and structure of the data needed to build a database.
  • Database Interrogation: It can access the data in a database for information retrieval and report generation. End users can selectively retrieve and display information and produce printed reports and documents.
  • Database Maintenance: It is used to add, delete, update, correct, and protect the data in a database.
  • Application Development: It is used to develop prototypes of data entry screens, queries, forms, reports, tables, and labels for a prototyped application, or to use a fourth-generation language (4GL) or application generator to develop program code.




Saturday, February 5, 2011

Virtual Reality


Virtual Reality
Virtual Reality (VR) allows a user to interact with a computer-simulated environment, whether that environment is a simulation of the real world or an imaginary world. Most current virtual reality environments are primarily visual experiences, displayed on a computer screen, but some simulations include additional sensory information, such as sound through speakers or headphones. Users can interact with a virtual environment or a virtual artifact (VA) either through the use of standard input devices such as a keyboard and mouse, or through multimodal devices such as a wired glove. The simulated environment can be similar to the real world, or it can differ significantly from reality. In practice, it is currently very difficult to create a high-fidelity virtual reality experience, due largely to technical limitations on processing power, image resolution, and communication bandwidth. However, those limitations are expected to eventually be overcome as processor, imaging, and data communication technologies become more powerful and cost-effective over time. The term Virtual Reality is often used to describe a wide variety of applications, commonly associated with immersive, highly visual, 3D environments.
Michael Heim identifies seven different concepts of Virtual Reality as given below:
1. Simulation
2. Interaction
3. Artificiality
4. Immersion
5. Telepresence
6. Full-body immersion
7. Network communication.
Technology:
To run the most basic VR system one must have the following:
• one or more powerful computers
• sensors or input devices
• display arrangements
• virtual environment rendering software
Types of VR system:
There are different types of VR systems, but most can be classified into one of the following three categories: Desktop VR, Video Mapping VR, and Immersive VR.


Impact:

Virtual reality has great impact on human life and activity.
  • Virtual reality integrates daily life and activity. It influences human behavior, interpersonal communication, and cognition (i.e., virtual genetics).
  • As we spend more and more time in virtual space, this results in important changes in economics, worldview, and culture.
  • The design of virtual environments may be used to extend basic human rights into virtual space, to promote human freedom and well-being, and to promote social stability as we move from one stage in socio-political development to the next.


Implementation:
To develop a real-time virtual environment, a computer graphics library can be used as an embedded resource, coupled with a common programming language such as C++, Perl, Java, or Python. Some of the most popular computer graphics libraries/APIs/languages are OpenGL, Direct3D, Java3D, and VRML; the choice is directly influenced by the system demands in terms of performance, program purpose, and hardware platform. The use of multithreading can also accelerate 3D performance and enable cluster computing with multi-user interactivity.

Manufacturing:

Virtual reality can provide new product design, helping as an ancillary tool for engineering in manufacturing processes, new product prototype and simulation. Among other examples, we may also quote Electronic Design Automation, CAD, Finite Element Analysis, and Computer Aided Manufacturing. The use of Stereo lithography and 3D printing shows how computer graphics modeling can be applied to create physical parts of real objects used in marine, aerospace and automotive industry. Beyond modeling assembly parts, 3D computer graphics techniques are currently used in the research and development of medical devices for innovative therapies, treatments, patient monitoring, and diagnosis of complex diseases.
Pros and Cons:
The following table gives some of the major arguments for and against virtual reality:



Future:

Computing power is continuously increasing. If this trend continues, then with the help of advancements in nanotechnology and quantum computing we may have computers powerful enough to run immersive VR programs in our own homes by the year 2037. Even so, virtual reality faces some challenges. It has been heavily criticized as an inefficient method for navigating non-geographical information. At present, the idea of ubiquitous computing is very popular, as its aim is to bring the computer into the user's world rather than forcing the user to interact with computers; this may pose a threat to virtual reality.

(

This article covers the topic of the 7th unit from the RTMNU MBA 3rd sem IT syllabus notes. Further topics will be covered in upcoming blogs. For more notes you can also refer to the other links given below:

)







Thursday, February 3, 2011

Data Mining



Data mining is an umbrella term that can be applied to a number of varying activities. In the corporate world, data mining is used to determine the direction of trends and to predict the future, and it uses large amounts of computing power for its operation. Data mining is popular in the fields of mathematics and science but is now increasingly used by marketers as well. Data mining is also known as Knowledge Discovery in Databases (KDD).

The parameters of data mining are: Association, Sequencing, Classification, Clustering and Forecasting. Data mining applications are available on systems of all sizes, for mainframe, client/server and PC platforms; relational database storage and management technology is adequate for many data mining applications of less than 50 gigabytes. The data mining process consists of six steps: data cleaning, data mart construction, derived attributes, modeling, post-processing and deployment. Data miners use techniques such as near-neighbor models, k-means clustering, decision trees and k-fold cross-validation. There are four basic models of data mining: the Predictive, Summary, Network and Association models. General-purpose mining software includes Mil Shield, ADAPA Predictive Analytics, Younicycle, STATISTICA Data Miner, VisuMap and many more.
For detailed information regarding the parameters, process, techniques and models of data mining mentioned above, please refer to the following files. They include information about the application areas of data mining as well:


(
For more information, refer to the links given below:



http://rtmnupervasivecomp.blogspot.com
http://rtmnuittrends.blogspot.com
http://www.rtmnunetworkingtechnology.blogspot.com

)

Sunday, January 30, 2011

Communication Technologies-II (Unit 4)

Nagpur University syllabus (Revised) MBA 3rd sem IT


Paper -III: Innovations in IT

SECTION A

Unit I: IT Enabled Services (ITeS): Outsourcing - India as Ideal Destination, India Outsourcing History, Outsourcing Writing to India, Call Centers in India, Multilingual Call Centers, Voice/Non-Voice ITeS (BPO Services), HIPAA Compliance in India, Outsourcing Engineering Services, Radiology and Intellectual Property to India. BPO: BPO Concept, Offshoring, Nearshoring, Homeshoring, Medical / Legal Transcription, Back-Office Accounting, Insurance Claims, Credit Card Processing, BPO in India, BPO Security, BPO in India - Legal Issues.
Unit II: Networking Technology & Systems (NeTS) - Next Generation Multi-service Networks, Future Internet Design (FIND), IP Telephony (IPT): IPT Components, Soft Phones, Wireless IP Phones, Voice Gateways, Inter-cluster Call, Telco Signaling Protocols, VoIP, VoIP Protocols, Large-Scale IPT and Voice-Mail Network: Voice Network Architecture, Overview: Network Planning and Designing.

Unit III: Communication Technologies-I - Next Generation Mobile Networks, Heterogeneous Networks, Ad-Hoc & Sensor Networks, Wireless Networks: WiFi, WiMax, Cellular, 3G/4G.

Unit IV: Communication Technologies-II - Mobility Management and Mobile Computing, Technology Convergence: GSM/CDMA/TDMA, Quality of Service Issues, Network Security and Privacy, Grid Computing and Clustering, Mobile TV, MMIT.

Unit V: Web Applications and Services-I - Internet Services and Applications, Web Services, Internet Computing, E-Learning , Middleware , Web Information Systems.


SECTION B
Unit VI: Web Applications and Services-II - Web Based Software, Semantic Web, Agent-Oriented Computing, E-Business, E-Commerce & E-Government, Ontology Engineering, Portal Technologies.

Unit VII: Computing and Information Systems - Advanced Computer Architectures, Virtual Reality, Databases & Data Mining, Agile Information Systems, AI & DSS, High Performance & Cluster Computing, Real-Time and Embedded Systems, Information Systems Integration , Geographical Information Systems, Business Process Modeling.

Unit VIII: Pervasive and Ubiquitous Computing-I - Smart Appliances & Wearable Computers, Inter-Vehicular Communication, Personal Computing, Pervasive Wireless Networking, Opportunistic Systems, Ubiquitous Health Care.

Unit IX: Pervasive and Ubiquitous Computing-II - Ubiquitous Computing, Location-Based Services, Educational Gaming & Instructional Technologies, Context-Aware Environments and Devices, Personal Broadcasting, Autonomic Systems.

Unit X: IT Trends - Biometrics, Fuzzy Logic & Neural Networks, Organic Growth, Audio/Visuals: mp3, mpeg and IPOD, General Outline of IT Act 2000, Case Studies: Mobile Industry Market Players: Nokia, Motorola, Sony-Ericsson, Samsung and LG. GIS: Google Earth, E-Learning: Zee TV, E-Governance: Andhra Pradesh, Gadgets: Apple Store, Networking: Cisco.


Suggested Readings:
1. Offshore Ready: Strategies to Plan & Profit from Offshore IT-enabled Services by Stuart Morstead
2. Networking Infrastructure for Pervasive Computing: Enabling Technologies and Systems by Debashis Saha, Amitava Mukherjee, and Somprakash Bandyopadhyay
3. Introduction to Mobile Communications: Technology, Services, Markets (Informa Telecoms & Media) by Tony Wakefield, Dave McNally, David Bowler, and Alan Mayne
4. iPod & iTunes: The Missing Manual, Fourth Edition by Jude Biersdorfer
5. Developing Web Services for Web Applications: A Guided Tour for Rational Application Developer and WebSphere Application Server (IBM Illustrated Guide Series) by Colette Burrus and Stephanie Parkin

Dear friends,

I have uploaded the topics of the 4th unit from the syllabus to my blogs.

Unit 4 -
You can go through the blogs and tell me if you want more information.

Some other unit topics have also been uploaded by my friends; the links are given below:


  1. http://rtmnupervasivecomp.blogspot.com/
  2. http://rtmnuittrends.blogspot.com/
  3. http://www.rtmnunetworkingtechnology.blogspot.com/