ANNEXURE IIIA
Submitted by
Santu Bhuyan
B. Tech-MBA(CSE)
Department of CSE/IT
Lovely School of Engineering
Lovely Professional University, Phagwara
DECLARATION
I hereby declare that the project work entitled ("Title of the project") is an
authentic record of my own work carried out at (Place of work) as a requirement
of the Industry Internship project for the award of the degree of
____________ (Relevant Degree), Lovely Professional University, Phagwara,
under the guidance of (Name of Industry coordinator) and (Name of Faculty
coordinator), during July to December 2011.
(Signature of student)
Santu Bhuyan
Registration no:10801097
Date: ___________________
Certified that the above statement made by the student is correct to the best of our
knowledge
and belief.
ACKNOWLEDGEMENT
It is a matter of great pleasure and pride for us to introduce the new web
application "E-Transport Management System" to the entire transport system of
any organization, as far as the existing transport system is concerned.
………………………………………………………………………………………
……………..
………………………………………………………………………………………
…..
We express our deep regards to Mr. Paresh Ch. Lahkar, Additional General
Manager, Assam Electronics Development Corporation Ltd. (AMTRON), who
wholeheartedly accommodated us in this project and patiently helped in
organizing the team work throughout the training period.
Finally, I express my affection for my friends and colleagues Mr. Rahul Lahkar
and Mr. Rupam Bhagawati, who kept me inspired and helped me in carrying out
the team work perfectly.
TABLE OF CONTENTS
1) Introduction
2) Objective
3) Drawbacks in the Existing System
4) Proposed System
5) Preliminary Investigation
6) Process Logic
7) Feasibility Study
8) Software Engineering Paradigm
9) Software Requirements Specification
10) Functional Description
11) Error Messages
12) Data Flow Diagrams
13) E-R Diagram
14) Project Category
15) Designing of System
16) Development of Software
17) Input Screens
18) Output Screens
19) Reports
20) Coding
21) Development of Back-End
22) Developing Laws of Software
23) Code Efficiency
24) Code Optimization
25) Validation Checks
26) Testing Techniques and Strategies
Projects - Overview
AMTRON has been designated as the State Level Agency of the Government of Assam for the
implementation of various state-level and national-level projects. These include ARBAS,
ASWAN, CSC, SDC, etc.
Amtron is also the largest promoter of Open Source in the region, and has developed key
strengths in implementing solutions based on Open Source technologies. The Electronic
Document Approval System developed in Amtron is highly appreciated as a major step
towards e-governance, for which Amtron received the 'World is Open Award 2008' jointly
sponsored by SKOCH and REDHAT.
Implementation of turnkey projects for various clients is one of the revenue-earning activities of
AMTRON. The projects are supported by strong hardware and software support teams.
Amtron has key expertise in GIS. Through its state-of-the-art GIS centre, services are being
offered to the government on various projects.
Previously, the whole transportation system was not well maintained; there was a huge
workload and an imbalance in maintaining safety and assurance. It was a system lacking in
security and time management. As a whole, it was unreliable.
To overcome all these problems we came up with the idea of developing a new, efficient and
reliable transport management system - the e-Transport Management System.
The purpose of this system is to manage and control the company transport provided by an
employer to employees, especially in an IT/BPO company, manufacturing industry, school, etc.
The whole system is known as the e-Transport Management System. This system will use the
latest technologies in order to manage the transportation system of an organization.
Various developing needs have resulted in the evolution of this completely computerised
management system, which will surely manage the system in an effective way while reducing
the involvement of manual and hard-copy work. The new generation's needs, met with
effective technologies, will be seen in this new system.
Thus we will be building a prototype of the new system, which will be further developed and
modified according to the needs of the organization.
INTRODUCTION
E-Transport Management System is a network-based application software project which is
capable of efficiently managing the passenger transport system of a business organization or
educational institution. At present, most organizations have their own transport system to
move their employees to their workplaces, which in turn improves the performance of the
organization. It can be seen that in India most organizations' transport systems are managed
manually. Such a system has many activities to manage properly, which needs an efficient
workforce.
In the system we have different activities, such as managing the transport schedule,
maintaining tenders to vendors, arranging vehicles with drivers, selecting routes, maintaining
customer records, collecting monthly fees, issuing passes, etc.
At present most of these things are managed manually in registers, which is a very inefficient,
time-consuming and costly affair and needs more manpower to maintain; when an emergency
plan has to be arranged, the system shows its inefficiency.
E-Transport Management System is an online, net-based, user-friendly system designed to
overcome the above drawbacks of the existing system.
The system has four modules:
Consumer module
Employer module
Supervisor module
Vendor module
Consumer module: a user-friendly interface to access the transport schedule, timetables
on different routes and vehicle availability. Along with the above-mentioned
information, this interface will have an information centre for vehicle booking,
change of route and cancellation.
Employer module: to track the whole management of the service, this module will give
the owner of the system a facility to monitor the use of the service.
Supervisor module: managing and maintaining the operation of the transport is the job
performed by the supervisor.
Vendor module: this module will track and address complaints from customers or the employer.
Employers:-
To track the whole management of the service, this module will give the owner of
the system a facility to monitor the use of the service. Along with maintaining
service quality, this module will also control the transport cost to regulate each
scheme against the existing scheme.
Interface facilities include:-
Online payment to vendors
Requests for addition or termination of vehicles for new schemes
Maintaining driver standards and emergency planning for different schemes
Report generation:-
This module is very important in the project for generating different reports, but it
is not a functional module.
The different reports generated by this module are:-
- Details of the consumer list.
- Vehicle usage list.
- Driver manifest.
- Maps and directions.
- Details of new registrations, renewals, cancellations, etc.
- Route addition and termination report.
- Maintenance activity report.
- New and retiring vehicle report.
- Different invoice and expenditure reports.
- Actual vs. projected users and routes report.
Existing Software
The existing system was totally manual; thus a great amount of manual work had to
be done, and with the increase in the business of the company, automation became
necessary.
A major problem was the lack of the security checks that needed to be applied.
Finding out details regarding any piece of information was very difficult.
Since the system was not online, the absence of any concerned person created
problems in case of an inquiry, since another person would have to go through the
heavy registers, which is next to impossible.
In case of an error, no one could help the user except the person handling that
portion of the job.
The existing system had no systematic way of working, since all the deals of a
particular date were posted into the concerned file as and when each deal occurred.
To convey any information regarding changes made, by phone or post, the user had
to go through a thick file in order to get the address or phone number of the party.
As stated above, a DFD (Data Flow Diagram) cannot be produced in such a situation.
Complaint management.
Contingency Planning.
24 X 7 availability
PRELIMINARY INVESTIGATION:
When a request is made, the first system activity, the preliminary
investigation, begins. It was not a single person who requested the writing and
implementation of the system; rather, it was cumulative performance feedback
from the end users of the old system as well as from the information division
employees.
FEASIBILITY STUDY
i) Technical feasibility
ii) Operational feasibility
iii) Economic feasibility
iv) Social feasibility
v) Management feasibility
vi) Legal feasibility
vii) Time feasibility
Technical feasibility:
A technical feasibility study gives the complete picture of the system requirements:
what speeds of input and output should be achieved, and at what quality of printing.
With the help of the above support we remove the defects of the existing software. In future
we can easily switch over to any platform. It ensures that the system does not halt in case of
undesired situations or events, that a problem affecting one module does not affect any other
module of the system, and that a change of hardware does not produce problems.
Operational Feasibility:
At the present stage all the work is done manually, so throughput is low and
response time is too high. A major problem is the lack of the security checks that
need to be applied. Finding out details regarding any piece of information is very
difficult. In case of any problem, no one can solve it unless the master of this
field is present.
I will not change the structure of the organization. I will deliver a system that
will reflect the current structure of the organization, but the system delivered
by me removes all the overheads. All the computational work will be done
automatically in our system, and the response time is very quick.
Economic feasibility:
The system has modest hardware and software requirements, hence the engineer
will not find any difficulty at installation time, and after installation the user will
also never face difficulties such as hangs, slow speed or slow response time. One
project is compulsory for each student, and this project is either a dummy or a
live one. Developing a live project gives a lot of confidence. It is better for me
and for the company because I am developing the system without any payment.
So everything is in favour, and I can say the cost of this software is, I think,
negligible.
PROCESS MODELING:
After defining the data in the data modeling phase, I have transformed the
data to achieve the information necessary for implementing a business function.
Processing descriptions are created for adding, modifying, deleting or
retrieving a data object.
Application generation:
Since the project process emphasizes reuse, many of the program components must
be tested and all interfaces must be fully exercised.
Project Plan:
The estimated time for the development of this project is 150 days.
It is described by a Gantt chart and a PERT chart. The software development life cycle will
be followed while developing the software.
GANTT CHART::
Task                            Days
System Analysis                   5
System Study / Investigation      7
Data Collection                   5
Design - Consumer                 7
Design - Vendor                   7
Design - Employer                 7
Design - Supervisor               7
Coding                           60
Testing                          30
Documentation                    15
Overall Finalisation          (total: 150 days)
PERT CHART::
[Figure: PERT network of the project activities - System Analysis, Data Collection,
System Investigation, Design (Consumer, Supervisor, Employer and Vendor units, each
followed by unit testing), Coding, Testing, Documentation and Implementation - with
cumulative milestones at 135, 145 and 150 days.]
OBJECTIVE::
The e-Transport Management System is a web-based application which provides a very useful
and faster process for managing the transport system of an institute or any other business
organization. This web-based application is developed using current technology to meet the
needs and requirements of the people in the organization.
This whole system is known as the e-Transport Management System. This system will use the
latest technologies in order to manage the transportation system of an organization.
It will have different modules, and these modules will be synchronized with each other to make
a quick, efficient and user-friendly system.
It will hold all the information, including vehicle booking, change of route and cancellation of
passes. It will also have a facility to register new employees and provisions to create monthly,
quarterly and yearly passes along with the vehicle timetable. It will also have a facility to send
messages about changes of route and transport timings to the drivers and passengers.
The owner of the system will be the administrator, who has complete access to the system.
The vendors who supply vehicles on different routes can also have access to the system.
=>Environmental characteristics :-
Sempron or higher.
#Supervisor
#Employer
#Vendor
Functional Requirements:-
=>Pass generation
=>Placing complaints
=>Emergency planning
Non-Functional Requirements::
=>Error correction - ensure users can correct errors with minimal problems. If the source and
destination are somehow slightly mis-stated, then correction should be easy.
Goals of implementation:-
DESIGN::
The Design of the System along with DFD, ER-Diagram and Data Dictionary are described as
follows:-
Context Diagram:
A context diagram is a data flow diagram in which the whole system is modeled
as one process. It shows all the external entities that interact with the system
and the data flows between these external entities and the system.
The system shown by the context diagram is not described in detail. For more
detail it is necessary to identify the major system processes and draw a data
flow diagram made up of these processes and the data flows between them. Such
a diagram is called a top-level DFD. We can go on expanding each process of the
top-level DFD into a more detailed DFD.
[Figure: the E-TMS process at the centre, exchanging data flows with the external
entities CONSUMER, SUPERVISOR, ACCOUNT, EMPLOYER and VENDOR.]
CONTEXT DIAGRAM
The data flow diagram is a graphical representation tool whose purpose is to
clarify system requirements and identify the major transformations that will
become programs in the system design. A DFD consists of a series of bubbles
joined by lines. The bubbles represent data transformations and the lines
represent data flows in the system. It depicts the information flow and the
transformations that occur as data moves from input to output. The DFD
provides a mechanism for functional modeling as well as information flow
modeling.
Consumer:
1st-level DFD, Consumer Module:
[Figure: the CONSUMER entity interacts with processes for New Registration, Enquiry,
Complaint, Transport-Schedule and Route Selection, and Bill Processing; these access and
update the Consumer, Complaint, Route and Accounts data stores, and the processed bill
is printed for the customer.]
1st-level DFD, Employer Module:
[Figure: the EMPLOYER accesses and manages consumer and vendor data, receives
complaints and provides solutions, simulates and measures cost, updates route tariffs and
the transport schedule, and pays vendor invoices through the Accounts and Veh_Invoice
stores.]
1st-level DFD, Supervisor Module:
[Figure: the SUPERVISOR processes the transport schedule, allots vehicles, selects and
optimises routes, broadcasts messages to drivers and consumers, handles enquiries, and
maintains the T Schedule, Route and Veh_Maintenance data stores.]
1st-level DFD, Vendor Module:
[Figure: the VENDOR generates, maintains and modifies invoices in the Veh_Invoice
store, accesses previous data, and handles vehicle maintenance records.]
We now review the full E-R diagram notation. Entity types are shown in
rectangular boxes. Relationship types are shown in diamond-shaped boxes
attached to the participating entity types with straight lines. Attributes are
shown in ovals, and a straight line attaches each attribute to its entity type or
relationship type. Component attributes of a composite attribute are shown
attached to the composite's oval, and multivalued attributes are shown in double
ovals. Key attributes have their names underlined. Derived attributes are shown
in dotted ovals. Weak entity types are distinguished by being placed in double
rectangles, and their identifying relationships in double diamonds. The partial
key of a weak entity type is underlined with a dotted line.
ER-Model:
[Figure: E-R diagram of the E-TMS. Entity types include E-TMS, ROUTE, ACCOUNTS,
SUPERVISOR, EMPLOYER, VEHICLE, VEH INVOICE and VENDOR, connected by
relationships such as Manage, Supervised by, Regulates, Runs on, Maintains, Pay,
Invoice to and Provided by, with 1:1, 1:M and M:1 cardinalities. Recoverable attributes
include Address, Description, C_bal, O_bal, Cont_No, Name and Date (ACCOUNTS /
SUPERVISOR); Route_no, Stopage, AT and DT (ROUTE); Veh_no, Veh_id, V_id and
Driver (VEHICLE); Invoice_No, Date and Total (VEH INVOICE); and V_id, Name,
Proprietor and Cont_No (VENDOR).]
DATA DICTIONARY::
The terms Data Dictionary and Data Repository are used to indicate a more general software
utility than a catalogue. A catalogue is closely coupled with the DBMS software; it provides the
information stored in it to users and the DBA, but it is mainly accessed by the various software
modules of the DBMS itself, such as the DDL and DML compilers, the query optimiser, the
transaction processor, report generators, and the constraint enforcer. On the other hand, a Data
Dictionary is a data structure that stores meta-data, i.e., data about data. The software package
for a stand-alone Data Dictionary or Data Repository may interact with the software modules of
the DBMS, but it is mainly used by the designers, users and administrators of a computer
system for information resource management. These systems are used to maintain information
on system hardware and software configuration, documentation, applications and users, as well
as other information relevant to system administration.
The Data Dictionary consists of record types (tables) created in the database by system-
generated command files, tailored for each supported back-end DBMS. Command files contain
SQL statements for CREATE TABLE, CREATE UNIQUE INDEX, ALTER TABLE (for
referential integrity), etc., using the specific statement required by that type of database.
CONSUMER MODULE::
Consumer::
  Field        Type     Size
  Name         Varchar  50
  Deptt        Varchar  20
  Designation  Varchar  20
  Contact_No   Number   15
  Email        Number   30
Bill::
  Field        Type     Size
  Bill_No      Number   10
  Amount       Varchar  50
Pass::
  Field        Type     Size
  Category     Varchar  10
  Route_No     Varchar  50
  Valid_Upto   Varchar  50
Accounts::
Complain::
  Field        Type     Size
  Date         Number   10
Consumer Password::
SUPERVISOR MODULE::
Schedule::
  Field         Type     Size
  Stops         Varchar  10
  Veh_No        Varchar  10
  Arrival_Time  Varchar  10
  Dept_Time     Varchar  10
Route::
  Field        Type     Size
  Distance     Number   10
Station::
  Field         Type     Size
  Start_Name    Varchar  10
  Arrival_Time  Varchar  10
  Dept_Time     Varchar  10
Supervisor::
  Field        Type     Size
  Address      Varchar  10
  Contact_No   Number   15
Supervisor Password::
EMPLOYER MODULE::
Invoice::
  Field        Type     Size
  Emp_Name     Varchar  50
  Date         Number   10
  Total        Number   50
  Emp_A/C      Number   20
  Vendor_A/C   Number   20
Employer Password::
VENDOR MODULE::
Vehicle::
  Field        Type     Size
  Veh_No       Number   50
  Route_No     Number   10
  Driver_ID    Number   10
  Category     Varchar  20
  Capacity     Number   10
Driver::
  Field        Type     Size
  Name         Varchar  10
  License_No   Number   10
  Category     Varchar  20
Vendor::
  Field        Type     Size
  Name         Varchar  20
  Contact_No   Number   10
Vehicle_Service::
  Field        Type     Size
  Condition    Varchar  10
Vendor Password::
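The command files described above turn these definitions into DDL. As an illustration only (the project's actual back-end DBMS and exact statements are not shown in this section, so SQLite stands in as an assumption), the Consumer table could be created and exercised like this:

```python
import sqlite3

# In-memory SQLite database; the project's real back-end DBMS is not
# specified here, so SQLite is an assumption for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL derived from the Consumer table in the data dictionary above.
# (The dictionary lists Email as Number 30; VARCHAR is assumed here.)
cur.execute("""
    CREATE TABLE Consumer (
        Name        VARCHAR(50),
        Deptt       VARCHAR(20),
        Designation VARCHAR(20),
        Contact_No  NUMERIC(15),
        Email       VARCHAR(30)
    )
""")
# A unique index, as the command files' CREATE UNIQUE INDEX statements would add.
cur.execute("CREATE UNIQUE INDEX idx_consumer_contact ON Consumer (Contact_No)")

cur.execute("INSERT INTO Consumer VALUES (?, ?, ?, ?, ?)",
            ("A. Sample", "CSE", "Engineer", 9876543210, "sample@example.com"))
conn.commit()

rows = cur.execute("SELECT Name, Deptt FROM Consumer").fetchall()
print(rows)  # [('A. Sample', 'CSE')]
```

The unique index mirrors the referential-integrity role of the ALTER TABLE statements mentioned above: a second row with the same Contact_No would be rejected by the database rather than by application code.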
performance requirements
exception handling
acceptance criteria
design hints and guidelines
PERFORMANCE REQUIREMENTS
The following performance characteristics were taken care of in developing the
systems:
User friendliness:
The system is easy to learn and understand. A novice user can also
use the system effectively, without any difficulty.
User satisfaction: The system is such that it stands up to the user's expectations.
Response time:
The response time of all the operations is very low. This has
been made possible by careful programming.
Error handling:
Safety:
Robustness:
Security:
Validations:
Portability:
Exception handling:
CODE EFFICIENCY:
The code is written to perform quick, accurate retrieval of data, validation and
display of outputs. In this software, most of the components used were designed
and tested by well-known vendors such as Microsoft and Crystal Corporation. We
have used ActiveX technology, which helps the user and vendor design software
that provides better, more accurate design and code reusability, such as ADO
data-access technology.
CODE OPTIMIZATION:
Most of the code is reused to reduce repeated coding, and result sets are reused
where needed; due to the modular concept, it is possible to reduce coding. The
following points represent the code optimization:
use of modules
fixed-type variables
short and meaningful names
disconnected recordsets
connection established once
built-in functions
different scopes of variables for different purposes
maximum use of independent procedures
use of functions
Use of Modules
I have used short and meaningful names, for example "name" for a name and
"add" for an address.
Disconnected Recordsets
Built-in Functions
Procedures
I have used procedures, e.g. to fetch a record or to clear a text or caption
field.
VALIDATION CHECKS:
i) data type
ii) length
iii) constraints
iv) blank field
v) format
Data type:
I have used the character type for characters, number for numeric values, and
date for date values. No numeric value can be inserted in a date field, and
characters can never be input in a numeric field; for example, a phone number
field never accepts characters, and if a person inputs one wrongly a message is
given. Only when the problem is removed can the user perform further operations.
Length:
When we define a maximum length, the field never accepts more data. For
example, if I define a numeric length of 5, then it stores data of equal or lesser
length. If the user gives more characters than allowed, a message is displayed
and processing stops.
Constraints:
Here I define the range of the data; if the data falls outside it, an error message
is displayed. For example, the code of a product at purchase is four characters,
the mentor code at sale is 21 characters, and a date field must be 8 characters.
Blank field:
When a user adds data and some field is blank, a message is displayed
without halting, but processing stops.
Format:
A predefined format is used and does not change from day to day; for example,
the date format DDMMYYYY (01012002) is used in all date-type fields. If the
user enters another format, a message is displayed.
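The five checks above can be sketched as plain functions. A minimal illustration (the function names, length limits and return conventions are assumptions, not taken from the project's code):

```python
import re

def check_phone(value):
    """Data type check: a phone-number field never accepts characters."""
    return value.isdigit()

def check_length(value, max_len=5):
    """Length check: reject input longer than the defined maximum."""
    return len(value) <= max_len

def check_blank(value):
    """Blank-field check: flag empty fields without halting processing."""
    return value.strip() != ""

def check_date_format(value):
    """Format check: all date fields use the fixed DDMMYYYY format, e.g. 01012002."""
    if not re.fullmatch(r"\d{8}", value):
        return False
    day, month = int(value[:2]), int(value[2:4])
    return 1 <= day <= 31 and 1 <= month <= 12

print(check_phone("98765"))            # True
print(check_phone("98a65"))            # False
print(check_date_format("01012002"))   # True
print(check_date_format("2002-01-01")) # False
```

In the real application these checks would fire a message box and block further operations, as described above; here they simply return True or False.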
The following rules can serve well as testing objectives:
Cyclomatic Complexity:
V(G) = P + 1, where P is the number of predicate (decision) nodes in the flow graph.
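For instance, a routine with two if-statements and one loop condition has P = 3, so V(G) = 4 independent paths need test cases. A one-line sketch of the counting shortcut (illustrative only, not the project's code):

```python
def cyclomatic_complexity(predicate_nodes):
    """V(G) = P + 1, where P is the number of predicate (decision) nodes."""
    return predicate_nodes + 1

# A routine with two if-statements and one loop condition has P = 3.
print(cyclomatic_complexity(3))  # 4
```

A straight-line routine with no decisions has P = 0 and therefore V(G) = 1, a single path.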
Condition Testing:
Branch Testing:
Loop Testing:
In our project I have used only simple loops, and I have exercised m passes
through each loop, where m < n (n being the loop's maximum number of passes).
Equivalence Partitioning:
Equivalence partitioning is applied to checks on phone numbers, code generation, account type, bill type, password, etc.
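Equivalence partitioning means choosing one representative value from each class of inputs rather than testing every value. For the phone-number check mentioned above, a sketch of the partitions (the classes, length limits and representatives are assumptions for illustration):

```python
def valid_phone(value):
    """A phone number is valid if it is all digits and 10 to 15 digits long."""
    return value.isdigit() and 10 <= len(value) <= 15

# One representative per equivalence class is enough to exercise the check.
partitions = {
    "valid 10-digit number": ("9876543210", True),
    "contains letters":      ("98765abcde", False),
    "too short":             ("12345", False),
    "too long":              ("1" * 16, False),
}
results = {name: valid_phone(v) == expected
           for name, (v, expected) in partitions.items()}
print(results)  # every class behaves as expected -> all values True
```

If any class misbehaved, only that one representative would need investigating, which is the point of the technique.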
TESTING STRATEGIES
System testing
In many organizations, testing is performed by persons other than those who
wrote the original programs, to ensure more complete, unbiased and reliable
testing.
The norm followed during this phase was that after the developer of the software
was satisfied regarding every aspect of the software under consideration, he was
required to release the program source code. A setup named "release" is used to
copy the named file from the developer's user area to a project area, in a
directory named with the developer's user name. Here all the final testing used
to be done by persons other than the developer himself. If some changes were
desired in the program, the developer was required to use another setup,
"retrieve", which copied the latest version of the program back to the
developer's area.
Since the users are not familiar with the new system, the data screens were
designed in such a way that they were:
consistent
easy to use
fast in response time
The following conventions were used while designing the various screens:
Unit testing:
Integration testing:
System testing:
Test review:
Test review is the process that ensures that testing is carried out as
planned. It decides whether or not the program is ready to be shipped
out for implementation.
Security Testing:
System implementation
Implementation is the process of having system personnel
check out and put new equipment into use, train users, install the new application,
and construct any files of data needed to use it.
Depending on the size of the organization that will be involved in using the
application, and the risk associated with its use, developers may choose
to pilot the operation in only one area of the firm, say in one department or with
only one or two persons. Sometimes they will run the old and new systems
together to compare the results. In still other situations, developers will stop
using the old system one day and begin to use the new one the next day. Each
implementation strategy has its merits.
Once installed, applications are often used for many years. However, both the
organization and the users will change, and the environment will be different over
weeks and months. Therefore the application will undoubtedly have to be maintained.
Modifications and changes will be made to software, files or procedures to meet
emerging user requirements. Since organizational systems and the business
environment undergo continual change, the information systems should keep
pace. In this sense, implementation is an ongoing process.
USER TRAINING
The users were given training which gave them the exact steps to be performed
for getting their job done, starting from switching the terminals on. Most of the
users were quick to get their job done the right way after the very first training
class. They were told the explicit advantages of the new system and also the
areas in which it had shortcomings.
After this came stabilizing the system, as the users started to give
in new suggestions and requirements. For us, the maintenance phase had begun.
SYSTEM EVALUATION
Operational evaluation
All the above aspects were taken into consideration from the very
beginning. Users do not have to keep checking the status of the job they are
doing, since every job is online and all information and messages flash on
screen.
The reliability of billing and voucher entry is very high, and up to the writing
of this document the system had never failed. All the recovery methods are
well written; even if something exceptional occurs, the user has a way to come
out of the undesirable situation and carry on with the work. Committing takes
place only when everything goes normally.
Organizational impact:
Since the response time and throughput have increased manifold, processing
will increase, resulting in more profit for the company. Especially with the
increased ease of processing, the M&SS can schedule their manpower
accordingly rather than tying up personnel for specific applications. This will
rejuvenate and boost the morale of the users.
Everybody has heaped accolades on the system for being reliable, fast and
easy to use. Overall, the system has changed the users' working style and their
throughput.
Development performance
This includes evaluation of the development process in
accordance with such yardsticks as overall development time and effort,
conformance to budgets and standards, and project management criteria. It
also includes assessment of development methods and tools.
IMPLEMENTATION:
INPUT SCREENS:
OUTPUT SCREENS:
CODING:
CONCLUSION:
MAINTENANCE:
1. BREAKDOWN MAINTENANCE:-
This maintenance is applied when an error occurs, the system
halts and further processing cannot be done. At this time the user can view the
documentation or consult us for rectification, and we will analyze and change the
code if needed. Example: if the user gets the error "report width is larger than
paper size" while printing a report and reports cannot be generated, then viewing
the help documentation and changing the paper size of the default printer to 'A4'
will rectify the problem.
2. PREVENTIVE MAINTENANCE:-
(a) Error correction: errors which were not caught during testing surface after
the system has been implemented. Rectification of such errors is called
corrective maintenance.
(b) New or changed requirements: when business requirements change
due to changing opportunities.
(c) Improved performance or maintenance requirements: changes that are
made to improve system performance or to make the system easier to maintain.
SECURITY MEASURES:-
(1) A login password is provided in the software. The user must log in to activate
the application.
(2) The user cannot change the password; to change the password he must contact
the administrator.
(4) Primary key and foreign key concepts are implemented for avoiding incorrect
data entry and intentional or accidental deletion or modification of data.
(5) When a user tries to delete data, the system first checks for references to it
by other data; if any are found, the deletion is aborted.
For a given set of requirements it is desirable to know how much it will cost to
develop the software to satisfy them, and how much time development will take.
These estimates are needed before development is initiated. The primary reason
for cost and schedule estimation is to enable the client or developer to perform a
cost-benefit analysis, and to support project monitoring and control. A more
practical use of these estimates is in bidding for software projects, where the
developer must give cost estimates to a potential client for the development
contract. For a software development project, detailed and accurate cost and
schedule estimates are essential prerequisites for managing the project.
Otherwise, even simple questions like "Is the project late?", "Are there cost
overruns?" and "When is the project likely to complete?" cannot be answered.
Cost and schedule estimates are also required to determine the staffing level for
a project at different phases. It can be safely said that cost and schedule
estimates are fundamental to any form of management and are generally always
required for a project.
Model 1:- The basic COCOMO model computes software development effort
and cost as a function of program size, expressed in estimated lines of code.
Model 3:- The advanced COCOMO model incorporates all characteristics of the
intermediate version with an assessment of the cost drivers' impact on each step
(analysis, design, etc.) of the software engineering process.
1) Organic mode:- Relatively small, simple projects in which small teams with
application experience work to a set of less-than-rigid requirements.
E = ab (KLOC)^bb
D = cb E^db
where, for the organic mode:
ab = 2.4
bb = 1.05
cb = 2.5
db = 0.38
LOC = 1555, i.e. KLOC = 1.555
E = 2.4 (1.555)^1.05
  = 3.84
  = 4 person-months (approximately)
D = 2.5 (4)^0.38
  = 4.24
  = 4 months (approximately)
To compute the staffing level we use the effort and duration estimated above:
N = E / D
  = 4 / 4
  = 1 person
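The arithmetic above can be checked in a few lines (the organic-mode constants are the standard basic-COCOMO values; note that D is computed here from the unrounded effort with db = 0.38 as listed, whereas the hand computation rounds E to 4 first, so the intermediate values differ slightly):

```python
# Basic COCOMO, organic mode: E = ab * KLOC**bb, D = cb * E**db
ab, bb, cb, db = 2.4, 1.05, 2.5, 0.38

kloc = 1555 / 1000            # LOC = 1555
effort = ab * kloc ** bb      # effort in person-months
duration = cb * effort ** db  # duration in months
staff = round(effort) / round(duration)  # N = E / D

print(round(effort, 2), round(duration, 2), staff)
```

Both effort and duration round to 4, confirming the estimate of one person working for roughly four months.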
The design of the system will involve the following platforms and technologies::
APPENDIX A:
ASP.NET
ASP.NET Web pages, known officially as Web Forms, are the main building blocks for application
development. Web forms are contained in files with an ".aspx" extension; these files typically
contain static (X)HTML markup, as well as markup defining server-side Web Controls and User
Controls where the developers place all the required static and dynamic content for the Web
page. Additionally, dynamic code which runs on the server can be placed in a page within a
block <% -- dynamic code -- %>, which is similar to other Web development technologies such
as PHP, JSP, and ASP. With ASP.NET Framework 2.0, Microsoft introduced a new code-behind
model which allows static text to remain on the .aspx page, while dynamic code remains in an
.aspx.vb or .aspx.cs or .aspx.fs file (depending on the programming language used).
A directive is a special instruction on how ASP.NET should process the page. The most common
directive is <%@ Page %>, which can specify many things, such as which programming
language is used for the server-side code.
Examples
Inline code
<script runat="server">
    protected void Page_Load(object sender, EventArgs e)
    {
        // Assign the current time to the label control
        lbl1.Text = DateTime.Now.ToLongTimeString();
    }
</script>
<html>
<body>
<form runat="server">
    <div>
        The current time is: <asp:Label runat="server" id="lbl1" />
    </div>
</form>
</body>
</html>
Upon rendering, the above page displays the text "The current time is: " followed by the current
time, supplied by the <asp:Label> control.
Code-behind solutions
<%@ Page Language="C#" CodeFile="SampleCodeBehind.aspx.cs"
    Inherits="Website.SampleCodeBehind"
    AutoEventWireup="true" %>
The above tag is placed at the beginning of the ASPX file. The CodeFile property of the @
Page directive specifies the file (.cs or .vb or .fs) acting as the code-behind while
the Inherits property specifies the class from which the page is derived. In this example, the @
Page directive is included in SampleCodeBehind.aspx, so SampleCodeBehind.aspx.cs acts as
the code-behind for this page:
using System;

namespace Website
{
    public partial class SampleCodeBehind : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            Response.Write("Hello, world");
        }
    }
}

The equivalent code-behind in VB.NET:

Imports System

Namespace Website
    Public Partial Class SampleCodeBehind
        Inherits System.Web.UI.Page

        Protected Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs)
            Response.Write("Hello, world")
        End Sub
    End Class
End Namespace
In this case, the Page_Load() method is called every time the ASPX page is requested. The
programmer can implement event handlers at several stages of the page execution process to
perform processing.
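As a sketch of this idea, a code-behind class can attach handlers to several of these stages; the class name is illustrative, and the method names follow the AutoEventWireup naming convention:

```csharp
using System;
using System.Web.UI;

// Illustrative code-behind: handlers for three stages of the page lifecycle.
public partial class LifecycleDemo : Page
{
    protected void Page_Init(object sender, EventArgs e)
    {
        // Runs first: controls exist, but view state has not been loaded yet.
    }

    protected void Page_Load(object sender, EventArgs e)
    {
        // Runs on every request, after view state and postback data are applied.
        if (!IsPostBack)
        {
            // One-time setup for the first (non-postback) request only.
        }
    }

    protected void Page_PreRender(object sender, EventArgs e)
    {
        // Last chance to change the control tree before it is rendered to HTML.
    }
}
```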
ASP.NET uses a visited composites rendering technique. During compilation, the template
(.aspx) file is compiled into initialization code which builds a control tree (the composite)
representing the original template. Literal text goes into instances of the Literal control class, and
server controls are represented by instances of a specific control class. The initialization code is
combined with user-written code (usually by the assembly of multiple partial classes) and results
in a class specific for the page. The page doubles as the root of the control tree.
Actual requests for the page are processed through a number of steps. First, during the
initialization steps, an instance of the page class is created and the initialization code is executed.
This produces the initial control tree which is now typically manipulated by the methods of the
page in the following steps. As each node in the tree is a control represented as an instance of a
class, the code may change the tree structure as well as manipulate the properties/methods of the
individual nodes. Finally, during the rendering step a visitor is used to visit every node in the
tree, asking each node to render itself using the methods of the visitor. The resulting HTML
output is sent to the client.
After the request has been processed, the instance of the page class is discarded and with it the
entire control tree. This is a source of confusion among novice ASP.NET programmers who rely
on class instance members that are lost with every page request/response cycle.
ASP.NET applications are hosted by a Web server and are accessed using
the stateless HTTP protocol. As such, if an application uses stateful interaction, it has to
implement state management on its own. ASP.NET provides various functions for state
management. Conceptually, Microsoft treats "state" as GUI state. Problems may arise if an
application needs to keep track of "data state"; for example, a finite-state machine which may be
in a transient state between requests (lazy evaluation) or which takes a long time to initialize.
State management in ASP.NET pages with authentication can make Web scraping difficult or
impossible.
Application state is held by a collection of shared user-defined variables. These are set and
initialized when the Application_OnStart event fires on the loading of the first instance of the
application and are available until the last instance exits. Application state variables are accessed
using the Application collection, which provides a wrapper for the application state. Application
state variables are identified by name.
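A minimal sketch of application-state usage, assuming a hypothetical shared counter named "HitCount" (not part of the project's actual code):

```csharp
// In Global.asax: initialize the shared variable when the application starts.
void Application_Start(object sender, EventArgs e)
{
    Application["HitCount"] = 0;
}

// In any page: application state is shared by all users and all sessions,
// so writes should be serialized with Lock/UnLock.
protected void Page_Load(object sender, EventArgs e)
{
    Application.Lock();
    Application["HitCount"] = (int)Application["HitCount"] + 1;
    Application.UnLock();
}
```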
Server-side session state is held by a collection of user-defined session variables that are
persistent during a user session. These variables, accessed using the Session collection, are
unique to each session instance. The variables can be set to be automatically destroyed after a
defined time of inactivity even if the session does not end. Client-side user session is maintained
by either a cookie or by encoding the session ID in the URL itself.
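A brief sketch of session-state usage; the control names (txtUser, lblGreeting) and event handler are hypothetical:

```csharp
// Store a value that is private to the current user's session.
protected void btnLogin_Click(object sender, EventArgs e)
{
    Session["UserName"] = txtUser.Text;
    Session.Timeout = 20;            // destroy after 20 minutes of inactivity
}

// Read it back on a later request from the same user.
protected void Page_Load(object sender, EventArgs e)
{
    string user = Session["UserName"] as string;
    if (user != null)
        lblGreeting.Text = "Welcome back, " + user;
}
```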
In-Process Mode
The session variables are maintained within the ASP.NET process. This is the fastest way;
however, in this mode the variables are destroyed when the ASP.NET process is recycled or shut
down.
ASPState Mode
ASP.NET runs a separate Windows service that maintains the state variables. Because state
management happens outside the ASP.NET process, and because the ASP.NET engine accesses
data using .NET Remoting, ASPState is slower than In-Process. This mode allows an ASP.NET
application to be load-balanced and scaled across multiple servers. Because the state
management service runs independently of ASP.NET, the session variables can persist across
ASP.NET process shutdowns. However, since the session-state server runs as a single instance, it is still
a single point of failure for session state. The session-state service cannot be load-balanced, and there
are restrictions on types that can be stored in a session variable.
SqlServer Mode
State variables are stored in a database, allowing session variables to be persisted across
ASP.NET process shutdowns. The main advantage of this mode is that it allows the application
to balance load on a server cluster, sharing sessions between servers. This is the slowest method
of session state management in ASP.NET.
View state
View state refers to the page-level state management mechanism, utilized by the HTML pages
emitted by ASP.NET applications to maintain the state of the Web form controls and widgets.
The state of the controls is encoded and sent to the server at every form submission in a hidden
field known as __VIEWSTATE. The server sends back the variable so that when the page is re-
rendered, the controls render at their last state. At the server side, the application may change the
view state, if the processing requires a change of state of any control. The states of individual
controls are decoded at the server, and are available for use in ASP.NET pages using the
ViewState collection.
The main use for this is to preserve form information across post backs. View state is turned on
by default and normally serializes the data in every control on the page regardless of whether it is
actually used during a post back. This behavior can (and should) be modified, however, as View
state can be disabled on a per-control, per-page, or server-wide basis.
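For example, a postback counter can be kept in view state; the key name "PostBackCount" and the label lblCount are illustrative:

```csharp
// The counter survives postbacks of this page because it travels in the
// hidden __VIEWSTATE field, not in server memory.
protected void Page_Load(object sender, EventArgs e)
{
    int count = ViewState["PostBackCount"] == null
        ? 0
        : (int)ViewState["PostBackCount"];

    if (IsPostBack)
        count++;

    ViewState["PostBackCount"] = count;
    lblCount.Text = "Postbacks so far: " + count;
}
```

When a control's data is not needed across postbacks, its view state can be switched off with the EnableViewState="false" attribute.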
Developers need to be wary of storing sensitive or private information in the View state of a page
or control, as the base64 string containing the view state data can easily be de-serialized. By
default, View state does not encrypt the __VIEWSTATE value. Encryption can be enabled on a
server-wide (and server-specific) basis, allowing for a certain level of security to be maintained.
Server-side caching
ASP.NET offers a "Cache" object that is shared across the application and can also be used to
store various objects. The "Cache" object holds the data only for a specified amount of time,
after which the entry is automatically removed.
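A sketch of the check-then-populate pattern commonly used with the Cache object; LoadProducts() is a hypothetical helper, and the key name "ProductList" is made up for the example:

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    object products = Cache["ProductList"];
    if (products == null)
    {
        // Cache miss: rebuild the data and keep it for at most ten minutes.
        products = LoadProducts();                          // hypothetical helper
        Cache.Insert("ProductList", products,
            null,                                           // no cache dependency
            DateTime.Now.AddMinutes(10),                    // absolute expiration
            System.Web.Caching.Cache.NoSlidingExpiration);  // no sliding window
    }
}
```

Because entries can be evicted at any time, code should always check for null before using a cached value.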
Other
Other means of state management that are supported by ASP.NET are cookies, caching, and
using the query string.
APPENDIX B:
Visual Studio:
Visual Studio includes a code editor supporting IntelliSense as well as code refactoring. The
integrated debugger works both as a source-level debugger and a machine-level debugger. Other
built-in tools include a forms designer for building GUI applications, web
designer, class designer, and database schema designer. It accepts plug-ins that enhance the
functionality at almost every level including adding support for source-control systems
(like Subversion and Visual SourceSafe) and adding new toolsets like editors and visual
designers for domain-specific languages, or toolsets for other aspects of the software
development lifecycle (like the Team Foundation Server client, Team Explorer).
Visual Studio supports different programming languages by means of language services, which
allow the code editor and debugger to support (to varying degrees) nearly any programming
language, provided a language-specific service exists. Built-in languages include C and
C++ (via Visual C++), VB.NET (via Visual Basic .NET), C# (via Visual C#), and F# (as of Visual
Studio 2010). Support for other languages such as Python, and Ruby among others is available
via language services installed separately. It also supports
XML/XSLT, HTML/XHTML, JavaScript and CSS. Individual language-specific versions of
Visual Studio also exist which provide more limited language services to the user: Microsoft
Visual Basic, Visual C#, and Visual C++.
Visual Studio, like any other IDE, includes a code editor that supports syntax
highlighting and code completion using IntelliSense for not only variables, functions
and methods but also language constructs like loops and queries. IntelliSense is supported for the
included languages, as well as for XML and for Cascading Style Sheets and JavaScript when
developing web sites and web applications. Autocomplete suggestions appear in a
modeless list box overlaid on top of the code editor. In Visual Studio 2008 onwards, it can be
made temporarily semi-transparent to see the code obstructed by it. The code editor is used for
all supported languages.
The Visual Studio code editor also supports setting bookmarks in code for quick navigation.
Other navigational aids include collapsing code blocks and incremental search, in addition to
normal text search and regex search. The code editor also includes a multi-item clipboard and a
task list. The code editor supports code snippets, which are saved templates for repetitive code
and can be inserted into code and customized for the project being worked on. A management
tool for code snippets is built in as well. These tools are surfaced as floating windows which can
be set to automatically hide when unused or docked to the side of the screen. The Visual Studio
code editor also supports code refactoring including parameter reordering, variable and method
renaming, interface extraction and encapsulation of class members inside properties, among
others.
Visual Studio features background compilation (also called incremental compilation). As code is
being written, Visual Studio compiles it in the background in order to provide feedback about
syntax and compilation errors, which are flagged with a red wavy underline. Warnings are
marked with a green underline. Background compilation does not generate executable code,
since it requires a different compiler than the one used to generate executable code. Background
compilation was initially introduced with Microsoft Visual Basic but has now been expanded for
all included languages.
Visual Studio includes a debugger that works both as a source-level debugger and as a machine-
level debugger. It works with both managed code as well as native code and can be used for
debugging applications written in any language supported by Visual Studio. In addition, it can
also attach to running processes and monitor and debug those processes. If source code for the
running process is available, it displays the code as it is being run. If source code is not available,
it can show the disassembly. The Visual Studio debugger can also create memory dumps as well
as load them later for debugging. Multi-threaded programs are also supported. The debugger can
be configured to be launched when an application running outside the Visual Studio environment
crashes.
The debugger allows setting breakpoints (which allow execution to be stopped temporarily at a
certain position) and watches (which monitor the values of variables as the execution
progresses). Breakpoints can be conditional, meaning they get triggered when the condition is
met. Code can be stepped over, i.e., run one line (of source code) at a time. Execution can either
step into a function to debug inside it, or step over it, in which case the function body runs
without being available for manual inspection. The debugger supports Edit and Continue, i.e., it allows code to
be edited as it is being debugged (32 bit only; not supported in 64 bit). When debugging, if the
mouse pointer hovers over any variable, its current value is displayed in a tooltip ("data
tooltips"), where it can also be modified if desired. During coding, the Visual Studio debugger
lets certain functions be invoked manually from the Immediate tool window. The parameters to
the method are supplied at the Immediate window.
Visual Studio includes a host of visual designers to aid in the development of applications. These
tools include:
[Figures: Visual Studio 2005 in Class Designer view; Visual Studio Web Designer in code editor view.]
Web designer/development
Visual Studio also includes a web-site editor and designer that allow web pages to be authored by
dragging and dropping widgets. It is used for developing ASP.NET applications and
supports HTML, CSS and JavaScript. It uses a code-behind model to link with ASP.NET code.
From Visual Studio 2008 onwards, the layout engine used by the web designer is shared
with Microsoft Expression Web. There is also ASP.NET MVC support for MVC technology as a
separate download and ASP.NET Dynamic Data project available from Microsoft.
Class designer
The Class Designer is used to author and edit classes (including their members and access)
using UML modeling. The Class Designer can generate C# and VB.NET code outlines for the
classes and methods. It can also generate class diagrams from hand-written classes.
Data designer
The data designer can be used to graphically edit database schemas, including typed tables,
primary and foreign keys and constraints. It can also be used to design queries from the graphical
view.
APPENDIX C:
SQL Server 2005, released in October 2005, is the successor to SQL Server 2000. It included
native support for managing XML data, in addition to relational data. For this purpose, it defined
an xml data type that could be used either as a data type in database columns or as literals in
queries. XML columns can be associated with XSD schemas; XML data being stored is verified
against the schema. XML is converted to an internal binary data type before being stored in the
database. Specialized indexing methods were made available for XML data. XML data is queried
using XQuery. Common Language Runtime (CLR) integration was a main feature of this
edition, enabling one to write SQL code as managed code executed by the CLR. SQL Server 2005 added
some extensions to the T-SQL language to allow embedding XQuery queries in T-SQL. In
addition, it also defines a new extension to XQuery, called XML DML, that allows query-based
modifications to XML data. SQL Server 2005 also allows a database server to be exposed
over web services using Tabular Data Stream (TDS) packets encapsulated within SOAP
(protocol) requests. When the data is accessed over web services, results are returned as XML.
For relational data, T-SQL has been augmented with error handling features (try/catch) and
support for recursive queries with CTEs (Common Table Expressions). SQL Server 2005 has
also been enhanced with new indexing algorithms, syntax and better error recovery systems.
Data pages are check summed for better error resiliency, and optimistic concurrency support has
been added for better performance. Permissions and access control have been made more
granular and the query processor handles concurrent execution of queries in a more efficient way.
Partitions on tables and indexes are supported natively, so scaling out a database onto a cluster is
easier. SQL CLR was introduced with SQL Server 2005 to let it integrate with the .NET
Framework.
SQL Server 2005 introduced "MARS" (Multiple Active Result Sets), a method of allowing a
single database connection to be used by multiple active commands.
SQL Server 2005 introduced DMVs (Dynamic Management Views), which are specialized views
and functions that return server state information that can be used to monitor the health of a
server instance, diagnose problems, and tune performance.
SQL Server 2005 introduced Database Mirroring, but it was not fully supported until the first
Service Pack release (SP1). In the initial release (RTM) of SQL Server 2005, database mirroring
was available, but unsupported. In order to implement database mirroring in the RTM version,
you had to apply trace flag 1400 at startup. Database mirroring is a high availability option that
provides redundancy and failover capabilities at the database level. Failover can be performed
manually or can be configured for automatic failover. Automatic failover requires a witness
partner and an operating mode of synchronous (also known as high-safety or full safety).
SQL Server Management Studio is a GUI tool included with SQL Server 2005 and later for
configuring, managing, and administering all components within Microsoft SQL Server. The tool
includes both script editors and graphical tools that work with objects and features of the
server. SQL Server Management Studio replaced Enterprise Manager as the primary management
interface for Microsoft SQL Server since SQL Server 2005. A version of SQL Server
Management Studio is also available for SQL Server Express Edition, for which it is known
as SQL Server Management Studio Express (SSMSE).
A central feature of SQL Server Management Studio is the Object Explorer, which allows the
user to browse, select, and act upon any of the objects within the server. It can be used to visually
observe and analyze query plans and optimize database performance, among other tasks. SQL
Server Management Studio can also be used to create a new database, alter any existing database
schema by adding or modifying tables and indexes, or analyze performance. It includes query
windows which provide a GUI-based interface for writing and executing queries.
APPENDIX D:
C# (C SHARP)
The name "C sharp" was inspired by musical notation where a sharp indicates that the written
note should be made a semitone higher in pitch. This is similar to the language name of C++,
where "++" indicates that a variable should be incremented by 1.
Due to technical limitations of display (standard fonts, browsers, etc.) and the fact that the sharp
symbol (U+266F ♯ MUSIC SHARP SIGN) is not present on the standard keyboard, the number
sign (U+0023 # NUMBER SIGN) was chosen to represent the sharp symbol in the written name
of the programming language. This convention is reflected
in the ECMA-334 C# Language Specification. However, when it is practical to do so (for
example, in advertising or in box art), Microsoft uses the intended musical symbol.
The "sharp" suffix has been used by a number of other .NET languages that are variants of
existing languages, including J# (a .NET language also designed by Microsoft that is derived
from Java 1.1), A# (from Ada), and the functional programming language F#. The original
implementation of Eiffel for .NET was called Eiffel#, a name retired now that the full Eiffel
language is supported. The suffix has also been used for libraries, such as Gtk# (a .NET
wrapper for GTK+ and other GNOME libraries), Cocoa# (a wrapper for Cocoa) and Qt# (a .NET
language binding for the Qt toolkit).
By design, C# is the programming language that most directly reflects the underlying Common
Language Infrastructure (CLI). Most of its intrinsic types correspond to value-types implemented
by the CLI framework. However, the language specification does not state the code generation
requirements of the compiler: that is, it does not state that a C# compiler must target a Common
Language Runtime, or generate Common Intermediate Language (CIL), or generate any other
specific format. Theoretically, a C# compiler could generate machine code like traditional
compilers of C++ or Fortran.
Some notable features of C# that distinguish it from C and C++ (and Java, where noted) are:
- It has no global variables or functions. All methods and members must be declared within
classes. Static members of public classes can substitute for global variables and functions.
- Local variables cannot shadow variables of the enclosing block, unlike C and C++.
Variable shadowing is often considered confusing by C++ texts.
- C# supports a strict Boolean data type, bool. Statements that take conditions, such as
while and if, require an expression of a type that implements the true operator, such as the
Boolean type. While C++ also has a Boolean type, it can be freely converted to and from
integers, and expressions such as if(a) require only that a is convertible to bool, allowing
a to be an int or a pointer. C# disallows this "integer meaning true or false" approach, on
the grounds that forcing programmers to use expressions that return exactly bool can
prevent certain types of common programming mistakes in C or C++ such as if (a = b)
(use of assignment = instead of equality ==).
- In C#, memory address pointers can only be used within blocks specifically marked as
unsafe, and programs with unsafe code need appropriate permissions to run. Most object
access is done through safe object references, which always either point to a "live" object
or have the well-defined null value; it is impossible to obtain a reference to a "dead"
object (one that has been garbage collected), or to a random block of memory. An unsafe
pointer can point to an instance of a value type, array, string, or a block of memory
allocated on the stack. Code that is not marked as unsafe can still store and manipulate
pointers through the System.IntPtr type, but it cannot dereference them.
- Multiple inheritance is not supported, although a class can implement any number of
interfaces. This was a design decision by the language's lead architect to avoid
complication and to simplify architectural requirements throughout the CLI.
- C# is more type-safe than C++. The only implicit conversions by default are those that
are considered safe, such as widening of integers. This is enforced at compile time,
during JIT, and, in some cases, at runtime. No implicit conversions occur between
Booleans and integers, nor between enumeration members and integers (except for literal
0, which can be implicitly converted to any enumerated type). Any user-defined
conversion must be explicitly marked as explicit or implicit, unlike C++ copy
constructors and conversion operators, which are both implicit by default. Starting with
version 4.0, C# supports a "dynamic" data type that enforces type checking at runtime
only.
- Checked exceptions are not present in C# (in contrast to Java). This has been a conscious
decision based on the issues of scalability and versionability.
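Two of the points above, the strict bool type and the special status of the literal 0 for enumerations, can be seen in a short self-contained program:

```csharp
using System;

class FeatureDemo
{
    static void Main()
    {
        int a = 5;

        // C# requires a genuine boolean condition; the C idiom `if (a)` does not compile.
        if (a != 0)
            Console.WriteLine("a is non-zero");

        // There is no implicit conversion between integers and booleans...
        bool flag = a != 0;

        // ...and narrowing numeric conversions must be explicit casts.
        long big = 3000000000L;
        int truncated = (int)big;   // `int truncated = big;` would not compile

        // Literal 0 is the one integer implicitly convertible to any enum type.
        DayOfWeek d = 0;            // DayOfWeek.Sunday

        Console.WriteLine(flag);    // True
        Console.WriteLine(d);       // Sunday
        Console.WriteLine(truncated);
    }
}
```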
Several alternative implementations of C# and the Common Language Infrastructure also exist.
The Mono project provides an open-source C# compiler, a complete open-source
implementation of the Common Language Infrastructure including the required
framework libraries as they appear in the ECMA specification, and a nearly complete
implementation of the Microsoft proprietary .NET class libraries up to .NET 3.5. As of
Mono 2.6, no plans exist to implement WPF; WF is planned for a later release; and there
are only partial implementations of LINQ to SQL and WCF.
The DotGNU project also provides an open-source C# compiler, a nearly complete
implementation of the Common Language Infrastructure including the required
framework libraries as they appear in the ECMA specification, and a subset of the
remaining Microsoft proprietary .NET class libraries up to .NET 2.0 (those not
documented or included in the ECMA specification, but included in Microsoft's
standard .NET Framework distribution).
Microsoft's Rotor project (now called the Shared Source Common Language
Infrastructure, licensed for educational and research use only) provides a shared-source
implementation of the CLR runtime and a C# compiler, and a subset of the required
Common Language Infrastructure framework libraries in the ECMA specification (up to
C# 2.0, supported on Windows XP only).