Monday, April 20, 2015

CHAPTER 7

WHAT ARE TELECOMMUNICATION, NETWORKING AND THE BUSINESS WORLD

  • Telecommunication refers to the exchange of information by electronic and electrical means over a significant distance.
  • Networking is defined as the act of making contact and exchanging information with other people, groups and institutions to develop mutually beneficial relationships, or to access and share information between computers. 
  • Business World is the regular production or purchase and sale of goods undertaken with an objective of earning profits and acquiring wealth through the satisfaction of human wants. 

3 TYPES OF NETWORKS 

A) LAN (Local Area Network) 
B) MAN (Metropolitan Area Network) 
C) WAN (Wide Area Network) 

WHAT IS THE INTERNET 

- The Internet connects a computer to any other computer anywhere in the world via dedicated routers and servers. When two computers are connected over the Internet, they can send and receive all kinds of information such as text, graphics, voice, video and computer programs. 

CLIENT AND SERVER 

- A client sends a request to a server, according to some protocol, asking for information or action, and the server responds. This is analogous to a customer (client) who sends an order (request) on an order form to a supplier (server) who dispatches the goods and an invoice. The order form and invoice are part of the "protocol" used to communicate.
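The request/response exchange can be sketched with Python's standard socket library. This is a minimal single-process illustration only: the local port choice, the "ORDER"/"INVOICE" message format, and the one-shot server are all invented for the example.

```python
import socket
import threading

def run_server(sock):
    """Server: accept one connection, read a request, send a response."""
    conn, _ = sock.accept()
    request = conn.recv(1024).decode()                 # e.g. "ORDER 3 widgets"
    conn.sendall(f"INVOICE for: {request}".encode())   # the "goods and invoice"
    conn.close()

# Set up a listening server on an ephemeral local port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=run_server, args=(server,)).start()

# Client: send a request and wait for the server's response.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"ORDER 3 widgets")
reply = client.recv(1024).decode()
client.close()
print(reply)  # INVOICE for: ORDER 3 widgets
```

The agreed message format plays the role of the order form and invoice: both sides must understand it for the exchange to work.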



WHAT IS A SEARCH ENGINE

- A software program that searches a database and gathers and reports information that contains or is related to specified terms. 

- A website whose primary function is providing a search engine for gathering and reporting information available on the internet or a portion of the internet. 

  

HYPERTEXT MARKUP LANGUAGE (HTML)

Web pages are based on a standard Hypertext Markup Language (HTML).

HYPERTEXT TRANSFER PROTOCOL (HTTP)

HTTP is the communications standard used to transfer pages on the Web.

UNIFORM RESOURCE LOCATOR (URL)

The directory path and document name are two more pieces of information within the web address that help the browser track down the requested pages. This address is called a uniform resource locator (URL).
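Python's standard urllib.parse module can split a web address into exactly these pieces; the URL below is a made-up example.

```python
from urllib.parse import urlparse

# A hypothetical address used only for illustration.
url = "http://www.example.com/docs/chapter7.html"
parts = urlparse(url)

print(parts.scheme)  # http                 -> the protocol (HTTP)
print(parts.netloc)  # www.example.com      -> the domain name of the server
print(parts.path)    # /docs/chapter7.html  -> directory path and document name
```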

WIRELESS COMPUTER NETWORKS AND INTERNET ACCESS 

a) Bluetooth

- Bluetooth is a popular wireless standard useful for creating small personal area networks (PANs). It links up to eight devices within a 10-meter area using low-power, radio-based communication and can transmit up to 722 Kbps in the 2.4 GHz band. 


b) Wi-Fi and Wireless Internet Access 

- Wi-Fi is a wireless networking technology that allows computers and other devices to communicate over a wireless signal; it is the standard way computers connect to wireless networks. Originally, however, the term Wi-Fi referred only to the 2.4 GHz 802.11b standard.



c) WiMax 

- WiMax is one of the hottest broadband wireless technologies around today. WiMax systems are expected to deliver broadband access services to residential and enterprise customers in an economical way. WiMax is a standardized wireless version of Ethernet intended primarily as an alternative to wired technologies (such as cable modems) to provide broadband access to customer premises. 






Sunday, April 19, 2015

CHAPTER 6

FOUNDATION OF BUSINESS INTELLIGENCE: DATABASES AND INFORMATION MANAGEMENT


ORGANIZING DATA IN A TRADITIONAL FILE ENVIRONMENT

File organization concepts

• Record: Group of related fields
• File: Group of related records 
• Field: Group of characters as word(s) or number
        - Describes an entity (person, place, thing on which we store information)
        - Attribute: Each characteristic, or quality, describing an entity
        - Example: Attributes Date or Grade belong to entity Course

  The Data Hierarchy

A computer system organizes data in a hierarchy that starts with the bit, which represents either a 0 or a 1. Bits can be grouped to form a byte to represent one character, number, or symbol. Bytes can be grouped to form a field, and related fields can be grouped to form a record. Related records can be collected to form a file, and related files can be organized into a database.
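The hierarchy can be illustrated with ordinary Python values. This is only a sketch of the concepts (real systems store everything as bits on disk), and the student names, course code, and field names are invented.

```python
# One character occupies one byte (8 bits) in ASCII encoding.
char = "A"
assert len(char.encode("ascii")) == 1

# Field: a group of characters forming a word or a number.
name_field = "SMITH"

# Record: a group of related fields describing one entity.
student_record = {"name": "SMITH", "course": "IS101", "grade": "A"}

# File: a group of related records.
course_file = [
    {"name": "SMITH", "course": "IS101", "grade": "A"},
    {"name": "JONES", "course": "IS101", "grade": "B"},
]

# Database: a collection of related files.
database = {"COURSE_FILE": course_file}
print(len(course_file))  # 2 records in the file
```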

Problems with traditional file processing (files maintained separately by different departments)

- Data redundancy: Presence of duplicate data in multiple files
- Data inconsistency: Same attribute has different values
- Program-data dependence: When changes in program require changes to data accessed by the program
- Lack of flexibility
- Poor security
- Lack of data sharing and availability

     THE DATABASE APPROACH TO DATA MANAGEMENT

 •Database:
Collection of data organized to serve many applications by centralizing data and controlling    redundant data

Database management system:
Interfaces between application programs and physical data files
- Separates logical and physical views of data
- Solves problems of traditional file environment
           i) Controls redundancy
          ii) Eliminates inconsistency
         iii) Uncouples programs and data
         iv) Enables central management and security


     •Relational DBMS
     - Represent data as two-dimensional tables called relations or files
     - Each table contains data on entity and attributes

     •Table: Grid of columns and rows
     - Rows (tuples): Records for different entities
     - Fields (columns): Represent attributes of an entity
     - Key field: Field used to uniquely identify each record
     - Primary key: Key field that uniquely identifies each record in a table
     - Foreign key: Primary key of one table used in a second table as a look-up field to identify records from the original table

     •Operations of a Relational DBMS: Three basic operations used to develop useful sets of data
       i) SELECT: Creates subset of data of all records that meet stated criteria
      ii) JOIN: Combines relational tables to provide user with more information than available in individual tables
      iii) PROJECT: Creates subset of columns in table, creating tables with only the information specified
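The three operations can be tried with Python's built-in sqlite3 module; the part/supplier tables and their contents below are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Two relations (tables); names and data are made up for the example.
cur.execute("CREATE TABLE part (part_no INTEGER PRIMARY KEY, name TEXT, supplier_no INTEGER)")
cur.execute("CREATE TABLE supplier (supplier_no INTEGER PRIMARY KEY, supplier_name TEXT)")
cur.executemany("INSERT INTO part VALUES (?, ?, ?)",
                [(137, "Door latch", 8259), (152, "Compressor", 1444)])
cur.executemany("INSERT INTO supplier VALUES (?, ?)",
                [(8259, "CBM Inc."), (1444, "Ace Inc.")])

# SELECT: subset of rows meeting stated criteria.
rows = cur.execute("SELECT * FROM part WHERE part_no = 137").fetchall()

# JOIN + PROJECT: combine the two tables on the foreign key supplier_no,
# then keep only the columns of interest.
joined = cur.execute("""
    SELECT part.name, supplier.supplier_name
    FROM part JOIN supplier ON part.supplier_no = supplier.supplier_no
    WHERE part.part_no = 137
""").fetchall()
print(joined)  # [('Door latch', 'CBM Inc.')]
```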

     •Capabilities of Database Management Systems
     - Data definition capability: Specifies structure of database content, used to create tables and define characteristics of fields.
     - Data dictionary: Automated or manual file storing definitions of data elements and their       
       characteristics.
     - Data manipulation language: Used to add, change, delete, retrieve data from database
          •Structured Query Language (SQL)
          •Microsoft Access user tools for generating SQL
    - Many DBMS have report generation capabilities for creating polished reports
      (Crystal Reports)

     •Designing Databases
      - Conceptual (logical) design: abstract model from business perspective
      - Physical design: How database is arranged on direct-access storage devices

      •Design process identifies
       - Relationships among data elements, redundant database elements
       - Most efficient way to group data elements to meet business requirements,           
         needs of application programs

      •Normalization
      - Streamlining complex groupings of data to minimize redundant data elements 
        and awkward many-to-many relationships
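A minimal sketch of the idea in Python, using invented part/supplier data: the supplier attributes repeated on every part row are factored out into their own structure, so each supplier is stored once.

```python
# Unnormalized: supplier details repeated on every part row (redundancy).
parts_flat = [
    {"part_no": 137, "name": "Door latch",  "supplier_no": 8259, "supplier_name": "CBM Inc."},
    {"part_no": 145, "name": "Side mirror", "supplier_no": 8259, "supplier_name": "CBM Inc."},
]

# Normalized: the repeating supplier attributes move to their own relation;
# parts keep only the foreign key supplier_no.
suppliers = {row["supplier_no"]: row["supplier_name"] for row in parts_flat}
parts = [{k: row[k] for k in ("part_no", "name", "supplier_no")} for row in parts_flat]

print(suppliers)  # {8259: 'CBM Inc.'} - stored once, not once per part
```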

      •Business intelligence infrastructure
        - Today includes an array of tools for obtaining useful information from separate systems and from big data

     • Contemporary tools-

       i) Data warehouses:
          •Stores current and historical data from many core operational transaction systems
          •Consolidates and standardizes information for use across enterprise, but data 
           cannot be altered
          •Data warehouse system will provide query, analysis, and reporting tools

       ii) Data marts:
           •Subset of data warehouse with summarized or highly focused portion of firm’s data for   
            use by specific population of users
          •Typically focuses on single subject or line of business

      iii) Hadoop:
          • Enables distributed parallel processing of big data across inexpensive computers.
          • Key services
              - Hadoop distributed file system (HDFS): data storage
               - MapReduce: breaks processing into small tasks and distributes them across the cluster
               - HBase: NoSQL database
          • Used by Facebook, Yahoo, NexBio
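The map and reduce phases can be sketched in plain Python. This is a single-process illustration only (real Hadoop spreads the chunks across many machines), and the sample text is invented.

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit (word, 1) pairs for each word in one chunk of data."""
    return [(word, 1) for word in document.split()]

def reduce_phase(pairs):
    """Reduce: sum the counts emitted for each key (word)."""
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

# In Hadoop these chunks would live on different machines in HDFS;
# here both are mapped in one process for illustration.
chunks = ["big data big ideas", "big data tools"]
pairs = [pair for chunk in chunks for pair in map_phase(chunk)]
print(reduce_phase(pairs))  # {'big': 3, 'data': 2, 'ideas': 1, 'tools': 1}
```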

       iv) In-memory computing:
          • Uses the computer's main memory (RAM) for data storage to avoid delays in retrieving data from disk storage.
          • Used in big data analysis.
          • Can reduce hours/days of processing to seconds 
          • Requires optimized hardware.

       v) Analytical platforms:
- High-speed platforms using both relational and non-relational tools optimized for large datasets. 


•Analytical tools: Relationships, patterns, trends
- Tools for consolidating, analyzing, and providing access to vast amounts of data to help 
   users make better business decisions.

i) Multidimensional data analysis (OLAP):
 - Online analytical processing (OLAP)
   •Supports multidimensional data analysis
   •Enables viewing data using multiple dimensions
    •Each aspect of information (product, pricing, cost, region, time period) is 
     different dimension
   •Example: How many washers were sold in the East in June compared with other regions? 
 - OLAP enables rapid, online answers to ad hoc queries
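The washer question can be sketched as a query over a tiny fact table in Python; the products, regions, months, and unit figures are invented.

```python
from collections import defaultdict

# Hypothetical sales facts with three dimensions: product, region, month.
sales = [
    {"product": "washer", "region": "East", "month": "June", "units": 120},
    {"product": "washer", "region": "West", "month": "June", "units": 95},
    {"product": "dryer",  "region": "East", "month": "June", "units": 60},
    {"product": "washer", "region": "East", "month": "July", "units": 110},
]

# Slice the cube: washers sold in June, grouped by region.
by_region = defaultdict(int)
for row in sales:
    if row["product"] == "washer" and row["month"] == "June":
        by_region[row["region"]] += row["units"]

print(dict(by_region))  # {'East': 120, 'West': 95}
```

An OLAP system precomputes and indexes such aggregations so that ad hoc questions along any dimension get rapid online answers.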

ii) Data Mining: 
  -Finds hidden patterns, relationships in large databases
  -Infers rules to predict future behavior
  -The patterns and rules are used to guide decision making and forecast the
   effect of those decisions
  -Popularly used to provide detailed analyses of patterns in customer data for 
   one-to-one marketing campaigns or to identify profitable customers.
  -Less well known: used to trace calls from specific neighborhoods that use 
   stolen cell phones and phone accounts.
   - Types of information obtainable from data mining:
      •Associations: Occurrences linked to single event
      •Sequences: Events linked over time
      •Classification: Recognizes patterns that describe group to which item    
        belongs
      •Clustering: Similar to classification when no groups have been defined;    
        finds groupings within data
      •Forecasting: Uses series of existing values to forecast what other 
       values will be
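Clustering, for instance, can be illustrated with a tiny one-dimensional k-means sketch in plain Python. The annual spend figures and starting centers are invented; real data mining tools apply far more robust algorithms to much larger datasets.

```python
def kmeans_1d(values, centers, rounds=10):
    """Tiny 1-D k-means: repeatedly assign each value to its nearest
    center, then move each center to the mean of its assigned values."""
    for _ in range(rounds):
        groups = {c: [] for c in centers}
        for v in values:
            nearest = min(centers, key=lambda c: abs(c - v))
            groups[nearest].append(v)
        centers = [sum(g) / len(g) for g in groups.values() if g]
    return sorted(centers)

# Hypothetical annual customer spend; two groupings emerge without
# anyone having defined the groups in advance.
spend = [120, 130, 115, 980, 1010, 995]
centers_out = kmeans_1d(spend, centers=[100, 1000])
print(centers_out)  # one low-spend center, one high-spend center
```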

iii) Text Mining:
   - Extracts key elements from large unstructured data sets.
     • Stored e-mails
     • Call center transcripts
     • Legal cases
     • Patent descriptions
     • Service reports, and so on
   - Sentiment analysis software
     • Mines e-mails, blogs, social media to detect opinions.

iv) Web Mining:
   - Discovery and analysis of useful patterns and information from web.
     • Understand customer behavior.
     • Evaluate effectiveness of web site, and so on.
   - Web content mining.
     • Mines content of Web page
   - Web structure mining
     • Analyzes links to and from web page
   - Web usage mining
     • Mines user interaction data recorded by web server.

v) Databases and the Web
   - Many companies use Web to make some internal databases available to customers or partners
   - Typical configuration includes:
          Web server
          Application server/middleware/CGI scripts
          Database server (hosting DBMS)
   - Advantages of using Web for database access:
          Ease of use of browser software
          Web interface requires few or no changes to database
          Inexpensive to add Web interface to system



Friday, April 17, 2015

CHAPTER 5


IT INFRASTRUCTURE AND EMERGING TECHNOLOGIES

Defining IT Infrastructure
Refers to the composite hardware, software, network resources and services required for the existence, operation and management of an enterprise IT environment. It allows an organization to deliver IT solutions and services to its employees, partners and/or customers and is usually internal to an organization and deployed within owned facilities.


Fig. Connection Between The Firm, IT Infrastructure, and Business Capabilities


Evolution of IT Infrastructure

The IT infrastructure in organizations today is an outgrowth of over 50 years of evolution in computing platforms. There have been five stages in this evolution, each representing a different configuration of computing power and infrastructure elements. The five eras are general-purpose mainframe and minicomputer computing, personal computers, client/server networks, enterprise computing, and cloud and mobile computing.

General-Purpose Mainframe and Minicomputer Era: (1959 to Present)
The introduction of the IBM 1401 and 7090 transistorized machines in 1959 marked the beginning of widespread commercial use of mainframe computers. The mainframe era was a period of highly centralized computing under the control of professional programmers and systems operators (usually in a corporate data center), with most elements of infrastructure provided by a single vendor, the manufacturer of the hardware and the software. This pattern began to change with the introduction of minicomputers produced by Digital Equipment Corporation (DEC) in 1965. In recent years, the minicomputer has evolved into a midrange computer or midrange server and is part of a network.
Personal Computer Era: (1981 to Present)
The appearance of the IBM PC in 1981 is usually considered the beginning of the PC era because this machine was the first to be widely adopted by American businesses. The Wintel PC computer (Windows operating system software on a computer with an Intel microprocessor) became the standard desktop personal computer. Today, 95 percent of the world’s estimated 1.5 billion computers use the Wintel standard.
Client/Server Era: (1982 to Present)
In client/server computing, desktop or laptop computers called clients are networked to powerful server computers that provide the client computers with a variety of services and capabilities. Simple client/server networks can be found in small businesses, while most corporations have more complex, multitiered (often called N-tier) client/server architectures in which the work of the entire network is balanced over several different levels of servers, depending on the kind of service being requested.
At the first level, a Web server will serve a Web page to a client in response to a request for service. Application server software handles all application operations between a user and an organization’s back-end business systems. Novell NetWare was the leading technology for client/server networking at the beginning of the client/server era. Today, Microsoft is the market leader with its Windows operating systems.
Enterprise Computing Era: (1992 to Present)
In the early 1990s, firms turned to networking standards and software tools that could integrate disparate networks and applications throughout the firm into an enterprise-wide infrastructure. The enterprise infrastructure also requires software to link disparate applications and enable data to flow freely among different parts of the business, such as enterprise applications.
Cloud and Mobile Computing Era: (2000 to Present)
The growing bandwidth power of the Internet has pushed the client/server model one step further, towards what is called the “cloud computing model,” which refers to a model of computing that provides access to a shared pool of computing resources over a network, often the Internet.


Fig. Stages in IT Infrastructure Evolution
Technology Drivers of Infrastructure Evolution
Moore’s Law and Microprocessing Power

In 1965, Gordon Moore observed that since the first microprocessor chip was introduced in 1959, the number of components on a chip with the smallest manufacturing costs per component (generally transistors) had doubled each year. This assertion became the foundation of Moore’s Law. The law would later be interpreted in multiple ways. There are at least three variations of Moore’s Law, none of which Moore ever stated: (1) the power of microprocessors doubles every 18 months; (2) computing power doubles every 18 months; and (3) the price of computing falls by half every 18 months.
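Under interpretation (2), computing power compounds as a power of two. A quick check of the arithmetic:

```python
# If capacity doubles every 18 months (1.5 years), growth over y years
# is 2 raised to the number of doubling periods, y / 1.5.
def doublings(years, period_years=1.5):
    return 2 ** (years / period_years)

# Over 15 years that is ten doubling periods: a 1,024-fold increase.
print(doublings(15))  # 1024.0
```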


Fig. Moore's Law and Microprocessor Performance


Nanotechnology 

Uses individual atoms and molecules to create computer chips and other devices that are thousands of times smaller than current technologies permit. Nanotubes are tiny tubes about 10,000 times thinner than a human hair. They consist of rolled-up sheets of carbon hexagons, have potential uses as minuscule wires or in ultrasmall electronic devices, and are very powerful conductors of electrical current.


Fig. Examples of Nanotubes


The Law of Mass Digital Storage

A second technology driver of IT infrastructure change is the Law of Mass Digital Storage. The amount of digital information is roughly doubling every year (Gantz and Reinsel, 2011; Lyman and Varian, 2003). Fortunately, the cost of storing digital information is falling at an exponential rate of 100 percent a year.



Fig. Falling Cost of Chips


Operating System Platforms
At the client level, 90 percent of PCs use some form of Microsoft Windows operating system to manage the resources and activities of the computer. Google’s Chrome OS provides a lightweight operating system for cloud computing using netbooks. Android is a mobile operating system developed by Android, Inc. and later the Open Handset Alliance as a flexible, upgradeable mobile device platform. A multitouch interface allows users to use their fingers to manipulate objects on the screen.
Enterprise Software Applications

The largest providers of enterprise application software are SAP and Oracle (which acquired PeopleSoft).  Microsoft is attempting to move into the lower ends of this market by focusing on small and medium-sized businesses that have not yet implemented enterprise applications.
Data Management and Storage

Enterprise database management software is responsible for organizing and managing the firm’s data so that they can be efficiently accessed and used. The leading database software providers are IBM (DB2), Oracle, Microsoft (SQL Server), and Sybase (Adaptive Server Enterprise), which supply more than 90 percent of the U.S. database software marketplace. Storage area networks (SANs) connect multiple storage devices on a separate high-speed network dedicated to storage.


Fig. A storage area network (SAN) connects multiple storage devices.

Cloud Computing

Cloud computing is a model in which firms and individuals obtain computer processing, storage, software, and other services as a pool of virtualized resources over a network, primarily the Internet. These resources are made available to users based on their needs, irrespective of their physical location or the location of the resources themselves. The U.S. National Institute of Standards and Technology (NIST) defines cloud computing as having the following essential characteristics:
  • On-demand self-service – individuals can obtain computing capabilities such as server time or network storage on their own.
  • Ubiquitous network access – individuals can use standard network and Internet devices, including mobile platforms, to access cloud resources.
  • Location independent resource pooling – Computing resources are pooled to serve multiple users, with different virtual resources dynamically assigned according to user demand. The user generally does not know where the computing resources are located.
  • Rapid elasticity – computing resources can be rapidly provisioned, increased, or decreased to meet changing user demand.
  • Measured service – charges for cloud resources are based on amount of resources actually used.

Fig. How cloud computing works.


Competitive Forces Model for IT Infrastructure Investment
Market demand for your firm’s services – make an inventory of the services you currently provide to customers, suppliers, and employees.
Your firm’s business strategy – analyze your firm’s five-year business strategy and try to assess what new services and capabilities will be required to achieve strategic goals.

Your firm’s IT strategy, infrastructure, and cost – Examine your firm’s information technology plans for the next five years and assess its alignment with the firm’s business plans.
Information technology assessment – is your firm behind the technology curve or at the bleeding edge of information technology? Both situations are to be avoided.

Competitor firm services – try to assess what technology services competitors offer to customers, suppliers, and employees.

Competitor firm IT infrastructure investments – benchmark your expenditures for IT infrastructure against your competitors. Many companies are quite public about their innovative expenditures on IT.


Fig. " How much should our firm spend on IT Infrastructure ?"

Tuesday, April 14, 2015

CHAPTER 4




DEFINITION OF HARDWARE
- Hardware, in the computer world, refers to the physical components that make up a computer system.

DEFINITION OF SOFTWARE
-  Software consists of carefully-organized instructions and code written by programmers in any of various special computer languages.

SIX CATEGORIES OF HARDWARE

INPUT DEVICE
  • Enter information and commands.

OUTPUT DEVICE
  • Hear, see, or otherwise recognize the results of information processing.

STORAGE DEVICE
  • Store information for use at a later time.

PROCESSING
  • CPU: Hardware that interprets and executes software and coordinates all hardware.
  • RAM: Temporary holding area for information and software.

TELECOMMUNICATION DEVICE
  • Send information to and receive it from another person or computer in a network.

CONNECTING DEVICE 
  • Connect peripherals to the computer, such as cables, ports, and expansion boards.



TWO CATEGORIES OF SOFTWARE

APPLICATION SOFTWARE
  • To solve specific problem or perform specific task.

SYSTEM SOFTWARE
  • Handles tasks specific to technology management and coordinates the interaction of all technology devices. 


CATEGORIES OF COMPUTER (BY SIZE)

PERSONAL DIGITAL ASSISTANT
  • Small handheld computer.

TABLET PC
  • Pen-based computer with the functionality of a notebook or desktop computer.

NOTEBOOK COMPUTER
  • Small, portable, and fully functional.

DESKTOP COMPUTER (MICROCOMPUTER)
  • Most popular type of personal computer.

MINICOMPUTER
  • Meets needs of several people simultaneously in a small or medium size business.

MAINFRAME COMPUTER
  • Meets needs of hundreds of people in a large business.

SUPER COMPUTER
  • Fastest, most powerful, and most expensive type of computer.