
CONTENTS

ABSTRACT
LIST OF TABLES
LIST OF FIGURES
1 INTRODUCTION
1.1 Problem Description
2 SYSTEM STUDY
2.1 Existing System
2.2 Proposed System
2.3 Need For Computerization
2.4 Data Flow Diagram
2.5 Class Diagram
3 SYSTEM CONFIGURATION
3.1 Hardware Requirements
3.2 Software Requirements
4 OVERVIEW OF THE SOFTWARE
5 DESIGN AND DEVELOPMENT
5.1 Use Case Diagram
5.2 Data Base Design
5.3 E-R Diagram
6 IMPLEMENTATION AND TESTING
7 CONCLUSION
8 APPENDICES
I. Source Code
II. Screenshots
ABSTRACT

This project aims at providing an online platform where the records of a typical
vehicle insurance company are properly and efficiently managed so as to ensure
improvement in the productivity of operations of the company. The proposed vehicle
insurance policy system is a web-based application that aims to develop a complete and fully
functional independent system to manage records of vehicle insurance companies. It is
developed with the intent of providing such insurance companies an online platform for
accurate processing, organized data handling, and efficient retrieval and storage of records.
LIST OF TABLES

LIST OF FIGURES
1. INTRODUCTION

This section deals with the concept of system analysis, which is the primary phase of
software development. The purpose is to identify the new system and establish what the new
system is to accomplish. A brief review of requirement determination, feasibility analysis,
and findings is also recorded here. System analysis is an important activity that takes place
when a new system is being built. It is the central activity of system development and
includes gathering the necessary data and developing a plan for the new system. It is not an
easy task, because many people need to be satisfied and many conflicts have to be resolved.
System analysis should be creative and imaginative in producing new solutions that meet the
user requirements.
1.1 Problem Description

The starting point of any system is to understand what the issues at hand are and therefore
what needs to be done. In fact, the same problem could be handled differently across
different organizations, depending on issues such as organizational culture and resource
constraints. At this stage it was necessary to look at the following.

Problem introduction, or problem stating, is the starting point of the software development
activity. The objective of this statement is to answer: exactly what must the system do? The
software project is initiated by the client's need. This application maintains the following:
insurance details, vehicle details, customer details, payments, and time periods.
2. SYSTEM STUDY

2.1 Existing System

Like many other existing systems, the current vehicle insurance management procedure is
very traditional, involving a great deal of paperwork and manpower. The current system
cannot ensure effective data processing, so it is very insecure.

Disadvantages
• Data is maintained on paper.
• The customer must visit an agent in person to insure a vehicle.
• The system is uneconomical, tedious, and time-consuming.
• Retrieval of data is done manually.
2.2 Proposed System

The proposed vehicle insurance policy system is a web-based application that aims to
develop a complete and fully functional independent system to manage records of vehicle
insurance companies. The overall project is developed taking into account the importance of
a software application for insurance-related data management, report generation, handling
customer information, etc.
Advantages
• Proper management of vehicle insurance details: adding new vehicle details, modifying
existing records, deleting existing records, etc.
• A non-motor insurance option in the insurance category, where customers can provide
their details and personal information.
• Report generation for every data item and record entered into the system database.
• Use of the report generation module to add, modify, and delete records in the
database.
• Easy access to customer information from an organized database.
2.3 Need for Computerization

The main objective of this project is to manage the details of customers, insurance,
vehicles, and vehicle types. The project aims at providing an online platform where the
records of a typical vehicle insurance company are properly and efficiently managed, so as
to ensure improvement in the productivity of its operations. It is developed with the intent
of providing such insurance companies an online platform for accurate processing, organized
data handling, and efficient retrieval and storage of records.

The Vehicle Insurance Management System maintains all records of vehicle insurance,
such as 2-wheeler, 3-wheeler, and heavy vehicles. The project provides a simple application
for insurance companies to manage customers who buy new vehicles and take out insurance
for those vehicles. The system manages payment details, time periods, vehicle details, and
customers' personal details; insurance specifications are updated in the database. In this
system the admin can add, delete, and modify records, and search for old records within a
short time.
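The report does not include code for these admin operations; the following is a minimal
in-memory sketch in Java (the project itself uses ASP.NET and SQL Server, so class and
method names here are hypothetical illustrations of add/modify/delete/search, not the
actual implementation):

```java
import java.util.HashMap;
import java.util.Map;

// Minimal in-memory sketch of the admin's add/modify/delete/search
// operations; the real system would persist records in SQL Server.
public class RecordStore {
    private final Map<String, String> records = new HashMap<>();

    public void add(String insuranceNo, String details) {
        records.put(insuranceNo, details);
    }

    public void modify(String insuranceNo, String details) {
        records.replace(insuranceNo, details);
    }

    public void delete(String insuranceNo) {
        records.remove(insuranceNo);
    }

    public String search(String insuranceNo) {
        return records.get(insuranceNo);
    }

    public static void main(String[] args) {
        RecordStore store = new RecordStore();
        store.add("INS-001", "2 Wheeler, 1 year, paid");
        store.modify("INS-001", "2 Wheeler, 2 years, paid");
        System.out.println(store.search("INS-001"));
        store.delete("INS-001");
        System.out.println(store.search("INS-001")); // null after deletion
    }
}
```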
2.4 Data Flow Diagram

[Figure: Data flow diagram. The admin logs on to the system; credentials are checked, and a
forgot-password option sends an email to the user. Once logged in, the admin manages the
modules: vehicle details, insurance details, vehicle type details, customer details, payment
details, and reports.]
2.5 Class Diagram
3. SYSTEM CONFIGURATION

3.1 Hardware Requirements

Hard disk : 160 GB
RAM : 4 GB
Processor : Core i3
Monitor : 15" Color Monitor

3.2 Software Requirements

Front-End : ASP.NET
Back-End : SQL Server
Operating System : Windows 7
IDE : Visual Studio
4. OVERVIEW OF THE SOFTWARE

Microsoft .Net Framework


The .Net framework is a software development platform developed by Microsoft. The
framework was meant to create applications that would run on the Windows platform.
The first version of the .Net framework was released in 2002 and was called .Net
Framework 1.0. The .Net framework has come a long way since then, and the current version
is 4.7.1.
The .Net framework can be used to create both form-based and web-based applications.
Web services can also be developed using the .Net framework.
The framework also supports various programming languages, such as Visual Basic and C#,
so developers can select the language in which to develop the required application. In this
chapter, you will learn some basics of the .Net framework.
.Net Framework Architecture
The basic architecture of the .Net framework is built from the components described below.
.NET Components
The architecture of the .Net framework is based on the following key components:
1. Common Language Runtime
The Common Language Runtime (CLR) is the platform on which .Net programs are
executed.
The CLR has the following key features:
Exception Handling - Exceptions are errors which occur when the application is executed.
Examples of exceptions are:
If an application tries to open a file on the local machine, but the file is not present.
If the application tries to fetch some records from a database, but the connection to the
database is not valid.
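The file-not-found case above can be sketched with a try/catch block. The example below is
in Java for illustration; C# on the CLR uses the same try/catch structure, and the file name
is a deliberately missing, hypothetical one:

```java
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;

// Opening a file that may not exist: the runtime raises an exception,
// which the application catches and handles instead of crashing.
public class FileOpenDemo {
    public static String readFirstLine(String path) {
        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            return reader.readLine();
        } catch (FileNotFoundException e) {
            // The file is not present on the local machine.
            return "file not present: " + path;
        } catch (IOException e) {
            // Any other I/O failure while reading.
            return "read failed: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        // A hypothetical file that does not exist, triggering the catch branch.
        System.out.println(readFirstLine("no-such-file.txt"));
    }
}
```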
Garbage Collection - Garbage collection is the process of removing unwanted resources
when they are no longer required.
Examples of garbage collection are:
A file handle that is no longer required. If the application has finished all operations on a
file, then the file handle may no longer be required.
A database connection that is no longer required. If the application has finished all
operations on a database, then the database connection may no longer be required.
Working with Various programming languages –
As noted in an earlier section, a developer can develop an application in a variety of .Net
programming languages.
Language - The first level is the programming language itself, the most common ones are
VB.Net and C#.
Compiler – There is a separate compiler for each programming language. So underlying the
VB.Net language there is a separate VB.Net compiler; similarly, for C# you will have
another compiler.
Common Language Runtime – This is the final layer in .Net, used to run a .Net program
developed in any programming language. The corresponding compiler sends the compiled
program to the CLR layer to run the .Net application.
2. Class Library
The .NET Framework includes a set of standard class libraries. A class library is a collection
of methods and functions that serve a common purpose.
For example, there is a class library with methods to handle all file-level operations: there
is a method that can be used to read text from a file and, similarly, a method to write text
to a file.
Most of the methods are split into either the System.* or the Microsoft.* namespaces (the
asterisk is just a reference to all of the methods that fall under the System or Microsoft
namespace).
A namespace is a logical grouping of related methods.
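The read/write file methods described above can be demonstrated with a short example. In
.NET these live in the System.IO namespace; the sketch below uses the equivalent methods
from Java's standard class library for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Writing text to a file and reading it back using standard
// class-library methods (System.IO plays this role in .NET).
public class ClassLibraryDemo {
    public static void main(String[] args) throws IOException {
        // Create a temporary file so the example is self-contained.
        Path file = Files.createTempFile("policy", ".txt");

        // One method writes text to a file...
        Files.writeString(file, "Vehicle insurance record");

        // ...and another reads it back.
        String text = Files.readString(file);
        System.out.println(text);

        Files.delete(file);
    }
}
```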
3. Languages
The .Net framework supports multiple programming languages, and the types of applications
that can be built with it fall broadly into the following categories.
WinForms – This is used for developing Forms-based applications, which would run on an
end user machine. Notepad is an example of a client-based application.
ASP.Net – This is used for developing web-based applications, which are made to run on any
browser such as Internet Explorer, Chrome or Firefox.
The Web application would be processed on a server, which would have Internet Information
Services Installed.
Internet Information Services or IIS is a Microsoft component which is used to execute an
Asp.Net application.
The result of the execution is then sent to the client machines, and the output is shown in the
browser.
ADO.Net – This technology is used to develop applications to interact with Databases such
as Oracle or Microsoft SQL Server.
Microsoft always ensures that .Net frameworks are in compliance with all the supported
Windows operating systems.
.Net Framework Design Principle
The following design principles of the .Net framework are what make it very relevant for
creating .Net based applications.
Interoperability - The .Net framework provides a lot of backward support. Suppose you had
an application built on an older version of the .Net framework, say 2.0, and you tried to
run the same application on a machine with a higher version of the .Net framework, say 3.5;
the application would still work. This is because with every release, Microsoft ensures that
older framework versions gel well with the latest version.
Portability - Applications built on the .Net framework can be made to work on any Windows
platform. In recent times, Microsoft has also been working to make its products run on other
platforms, such as iOS and Linux.
Security - The .NET Framework has a good security mechanism. The inbuilt security
mechanism helps in both validation and verification of applications. Every application can
explicitly define its security mechanism, which is used to grant the user access to the code
or to the running program.
Memory management - The Common Language Runtime does all the work of memory
management. The .Net framework has the capability to see which resources are no longer
used by a running program and release them accordingly. This is done via a component
called the "Garbage Collector", which runs as part of the .Net framework.
The garbage collector runs at regular intervals and keeps on checking which system
resources are not utilized, and frees them accordingly.
Simplified deployment - The .Net framework also has tools that can be used to package
applications built on the .Net framework. These packages can then be distributed to client
machines, where they automatically install the application.
Summary
.Net is a software development platform created by Microsoft. It was designed to build
applications that could run on the Windows platform.
The .Net framework can be used to develop form-based applications, web-based
applications, and web services.
Developers can choose from a variety of programming languages available on the .Net
platform. The most common ones are VB.Net and C#.
SQL Server
Like other RDBMS software, Microsoft SQL Server is built on top of SQL, a standardized
programming language that database administrators (DBAs) and other IT professionals use to
manage databases and query the data they contain. SQL Server is tied to Transact-SQL
(T-SQL), an implementation of SQL from Microsoft that adds a set of proprietary
programming extensions to the standard language.
The original SQL Server code was developed in the 1980s by the former Sybase Inc., which
is now owned by SAP. Sybase initially built the software to run on Unix systems and
minicomputer platforms. Sybase, Microsoft, and Ashton-Tate Corp. (then the leading vendor
of PC databases) teamed up to produce the first version of what became Microsoft SQL
Server, designed for the OS/2 operating system and released in 1989.
Ashton-Tate stepped away after that, but Microsoft and Sybase continued their partnership
until 1994, when Microsoft took over all development and marketing of SQL Server for its
own operating systems. The year before, with the Sybase relationship starting to unravel,
Microsoft had also made the software available on the newly released Windows NT after
modifying the 16-bit OS/2 code base to create a 32-bit implementation with added features; it
focused on the Windows code going forward. In 1996, Sybase renamed its version Adaptive
Server Enterprise, leaving the SQL Server name to Microsoft.
Versions of SQL Server
Between 1995 and 2016, Microsoft released 10 versions of SQL Server. Early versions were
aimed primarily at departmental and workgroup applications, but Microsoft expanded SQL
Server's capabilities in subsequent ones, turning it into an enterprise-class relational DBMS
that could compete with Oracle Database, DB2 and other rival platforms for high-end
database uses. Over the years, Microsoft has also incorporated various data management and
data analytics tools into SQL Server, as well as functionality to support new technologies that
emerged, including the web, cloud computing and mobile devices.
Microsoft SQL Server 2016, which became generally available in June 2016, was developed
as part of a "mobile first, cloud first" technology strategy adopted by Microsoft two years
earlier. Among other things, SQL Server 2016 added new features for performance tuning,
real-time operational analytics, and data visualization and reporting on mobile devices, plus
hybrid cloud support that lets DBAs run databases on a combination of on-premises systems
and public cloud services to reduce IT costs. For example, a SQL Server Stretch Database
technology moves infrequently accessed data from on-premises storage devices to the
Microsoft Azure cloud, while keeping the data available for querying, if needed.
Key components in Microsoft SQL Server
SQL Server 2016 also increased support for big data analytics and other advanced analytics
applications through SQL Server R Services, which enables the DBMS to run analytics
applications written in the open source R programming language, and PolyBase, a
technology that lets SQL Server users access data stored in Hadoop clusters or Azure blob
storage for analysis. In addition, SQL Server 2016 was the first version of the DBMS to run
exclusively on 64-bit servers based on x64 microprocessors. And it added the ability to run
SQL Server in Docker containers, a virtualization technology that isolates applications from
each other on a shared operating system.
Prior versions included SQL Server 2005, SQL Server 2008 and SQL Server 2008 R2, which
was considered a major release despite the follow-up sound of its name. Next to come were
SQL Server 2012 and SQL Server 2014. SQL Server 2012 offered new features, such as
columnstore indexes, which can be used to store data in a column-based format for data
warehousing and analytics applications, and AlwaysOn Availability Groups, a high
availability and disaster recovery technology. (Microsoft changed the spelling of the latter's
name to Always On when it released SQL Server 2016.)
SQL Server 2014 added In-Memory OLTP, which lets users run online transaction
processing (OLTP) applications against data stored in memory-optimized tables instead of
standard disk-based ones. Another new feature in SQL Server 2014 was the buffer pool
extension, which integrates SQL Server's buffer pool memory cache with a solid-state drive,
another feature designed to boost I/O throughput by offloading data from conventional hard
disks.
Microsoft SQL Server ran exclusively on Windows for more than 20 years. But, in 2016,
Microsoft said it planned to also make the DBMS available on Linux, starting with a new
version released as a community technology preview that November and initially dubbed
SQL Server vNext; later, the update was formally named SQL Server 2017, and it became
generally available in October of that year.
The support for running SQL Server on Linux moved the database platform onto an open
source operating system commonly found in enterprises, giving Microsoft potential inroads
with customers that don't use Windows or have mixed server environments. SQL Server
2017 also expanded the Docker support added for Windows systems in the previous release
to include Linux-based containers.
Another notable feature in SQL Server 2017 is support for the Python programming
language, an open source language that is widely used in analytics applications. With its
addition, SQL Server R Services was renamed Machine Learning Services (In-Database) and
expanded to run both R and Python applications. Initially, the machine learning toolkit and a
variety of other features are only available in the Windows version of the database software,
with a more limited feature set supported on Linux.
Inside SQL Server's architecture
Like other RDBMS technologies, SQL Server is primarily built around a row-based table
structure that connects related data elements in different tables to one another, avoiding the
need to redundantly store data in multiple places within a database. The relational model also
provides referential integrity and other integrity constraints to maintain data accuracy; those
checks are part of a broader adherence to the principles of atomicity, consistency, isolation
and durability -- collectively known as the ACID properties and designed to guarantee that
database transactions are processed reliably.
The core component of Microsoft SQL Server is the SQL Server Database Engine, which
controls data storage, processing and security. It includes a relational engine that processes
commands and queries, and a storage engine that manages database files, tables, pages,
indexes, data buffers and transactions. Stored procedures, triggers, views and other database
objects are also created and executed by the Database Engine.
Sitting beneath the Database Engine is the SQL Server Operating System, or SQLOS; it
handles lower-level functions, such as memory and I/O management, job scheduling and
locking of data to avoid conflicting updates. A network interface layer sits above the
Database Engine and uses Microsoft's Tabular Data Stream protocol to facilitate request and
response interactions with database servers. And at the user level, SQL Server DBAs and
developers write T-SQL statements to build and modify database structures, manipulate data,
implement security protections and back up databases, among other tasks.
SQL Server services, tools and editions
Microsoft also bundles a variety of data management, business intelligence (BI) and analytics
tools with SQL Server. In addition to the R Services and now Machine Learning Services
technology that first appeared in SQL Server 2016, the data analysis offerings include SQL
Server Analysis Services, an analytical engine that processes data for use in BI and data
visualization applications, and SQL Server Reporting Services, which supports the creation
and delivery of BI reports.
On the data management side, Microsoft SQL Server includes SQL Server Integration
Services, SQL Server Data Quality Services and SQL Server Master Data Services. Also
bundled with the DBMS are two sets of tools for DBAs and developers: SQL Server Data
Tools, for use in developing databases, and SQL Server Management Studio, for use in
deploying, monitoring and managing databases.
Microsoft offers SQL Server in four primary editions that provide different levels of the
bundled services. Two are available free of charge: a full-featured Developer edition for use
in database development and testing, and an Express edition that can be used to run small
databases with up to 10 GB of disk storage capacity. For larger applications, Microsoft sells
an Enterprise edition that includes all of SQL Server's features, as well as a Standard one
with a partial feature set and limits on the number of processor cores and memory sizes that
users can configure in their database servers.
However, when SQL Server 2016 Service Pack 1 (SP1) was released in late 2016, Microsoft
made some of the features previously limited to the Enterprise edition available as part of the
Standard and Express ones. That included In-Memory OLTP, PolyBase, columnstore
indexes, and partitioning, data compression and change data capture capabilities for data
warehouses, as well as several security features. In addition, the company implemented a
consistent programming model across the different editions with SQL Server 2016 SP1,
making it easier to scale up applications from one edition to another.
Security features in SQL Server
The advanced security features supported in all editions of Microsoft SQL Server starting
with SQL Server 2016 SP1 include three technologies added to the 2016 release: Always
Encrypted, which lets users update encrypted data without having to decrypt it first;
row-level security, which enables data access to be controlled at the row level in database
tables; and dynamic data masking, which automatically hides elements of sensitive data from
users without full access privileges.
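Dynamic data masking is performed inside the SQL Server engine itself, but its effect can be
illustrated with a small sketch. The Java code below uses a hypothetical mask rule (reveal
only the last four characters) purely to show the idea, not SQL Server's actual masking
functions:

```java
// Illustration of the dynamic-data-masking idea: users without full
// access privileges see only the last four characters of a sensitive
// value, while privileged users see the whole value.
public class MaskDemo {
    public static String mask(String value, boolean fullAccess) {
        if (fullAccess || value.length() <= 4) {
            return value;
        }
        // Replace everything except the last four characters with 'X'.
        return "X".repeat(value.length() - 4)
                + value.substring(value.length() - 4);
    }

    public static void main(String[] args) {
        System.out.println(mask("4111222233334444", false)); // masked view
        System.out.println(mask("4111222233334444", true));  // full view
    }
}
```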
Other notable SQL Server security features include transparent data encryption, which
encrypts data files in databases, and fine-grained auditing, which collects detailed
information on database usage for reporting on regulatory compliance. Microsoft also
supports the Transport Layer Security protocol for securing communications between SQL
Server clients and database servers.
Most of those tools and the other features in Microsoft SQL Server are also supported in
Azure SQL Database, a cloud database service built on the SQL Server Database Engine.
Alternatively, users can run SQL Server directly on Azure, via a technology called SQL
Server on Azure Virtual Machines; it configures the DBMS in Windows Server virtual
machines running on Azure. The VM offering is optimized for migrating or extending
on-premises SQL Server applications to the cloud, while Azure SQL Database is designed
for use in new cloud-based applications.
In the cloud, Microsoft also offers Azure SQL Data Warehouse, a data warehousing service
based on a massively parallel processing (MPP) implementation of SQL Server. The MPP
version, originally a stand-alone product called SQL Server Parallel Data Warehouse, is also
available for on-premises use as part of the Microsoft Analytics Platform System, which
combines it with PolyBase and other big data technologies.
5. DESIGN AND DEVELOPMENT

5.1 Use Case Diagram

[Figure: Use case diagram. The admin manages users, registrations, and insurance. The user
logs in to and out of the system, updates the profile, manages the insurance period, and
manages payments.]
5.2 Data Base Design
5.3 E-R Diagram
6. IMPLEMENTATION AND TESTING

Modules

Login Module

Admin and customers can use their respective login forms to log in to the system by
providing security details such as a username and password.
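The report does not show the login code itself; the following is a minimal sketch of the
credential check, written in Java for illustration (the project uses ASP.NET). It assumes a
hashed password comparison, which is a common design, not something the report specifies;
a production system would also add a per-user salt:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Sketch of a login check: only a hash of the password is stored, and
// at login time the hash of the attempt is compared against it.
public class LoginCheck {
    static String sha256(String password) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(password.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    static boolean login(String storedHash, String attempt) throws Exception {
        return storedHash.equals(sha256(attempt));
    }

    public static void main(String[] args) throws Exception {
        // Hash saved when the user registered (hypothetical password).
        String storedHash = sha256("secret123");
        System.out.println(login(storedHash, "secret123")); // correct password
        System.out.println(login(storedHash, "wrong"));     // rejected attempt
    }
}
```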

Existing Records

All existing records in the system database can be viewed, modified, and deleted by
the admin. Further, records of a similar type can be added in the respective sections of the
system.

Vehicle Insurance Records

This module contains vehicle insurance details that are provided by the user while
purchasing the vehicle. The records here contain information regarding Vehicle details, total
years of insurance, amount paid, etc.

Vehicle Search

Based on registration number, insurance number, vehicle number, or customer details,


the administrator can search for records previously stored in the database.

Vehicle Report

In this module, a report containing insurance details can be generated in a file and
sent to the customer in printed form.

Setting Module

Basic system settings such as user id (username and password), layout, profiles, etc.
can be changed by the admin.

Software Testing
The purpose of testing is to discover errors. Testing is the process of trying to
discover every conceivable fault or weakness in a work product. It provides a way to check
the functionality of components, subassemblies, assemblies, and/or a finished product. It is
the process of exercising software with the intent of ensuring that the software system meets
its requirements and user expectations and does not fail in an unacceptable manner. There are
various types of tests, each of which addresses a specific testing requirement.

Types of Tests:
The different types of testing are described below.
Unit Testing:
Unit testing involves the design of test cases that validate that the internal program
logic is functioning properly and that program inputs produce valid outputs. All decision
branches and internal code flow should be validated. It is the testing of individual software
units of the application, and it is done after the completion of an individual unit, before
integration.
This is structural testing that relies on knowledge of the unit's construction and is
invasive. Unit tests perform basic tests at the component level and test a specific business
process, application, and/or system configuration. Unit tests ensure that each unique path of
a business process performs accurately to the documented specifications and contains clearly
defined inputs and expected results.
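A unit test of this kind can be sketched with plain assertions. The example is in Java for
illustration, and the premium rule it tests is entirely hypothetical (the report gives no such
formula); a real .NET project would typically use a framework such as MSTest or NUnit:

```java
// A tiny unit under test (a hypothetical premium rule) and unit tests
// that validate both decision branches with known inputs and outputs.
public class PremiumTest {
    // Unit under test: premium is 4% of vehicle value per year,
    // with a minimum premium of 100.
    static double premium(double vehicleValue, int years) {
        double p = vehicleValue * 0.04 * years;
        return Math.max(p, 100.0);
    }

    public static void main(String[] args) {
        // Each check validates one decision branch of the unit.
        if (Math.abs(premium(50_000, 1) - 2_000.0) > 1e-9) {
            throw new AssertionError("normal case failed");
        }
        if (premium(1_000, 1) != 100.0) {
            throw new AssertionError("minimum-premium branch failed");
        }
        System.out.println("all unit tests passed");
    }
}
```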
Integration Testing:
Integration tests are designed to test integrated software components to determine whether
they actually run as one program. Testing is event driven and is more concerned with the
basic outcome of screens or fields. Integration tests demonstrate that, although the
components were individually satisfactory (as shown by successful unit testing), the
combination of components is correct and consistent. Integration testing is specifically aimed
at exposing the problems that arise from the combination of components.
Functional Test:
Functional tests provide systematic demonstrations that functions tested are available
as specified by the business and technical requirements, system documentation, and user
manuals.
Functional testing is centered on the following items:
Valid Input : identified classes of valid input must be accepted.
Invalid Input : identified classes of invalid input must be rejected.
Functions : identified functions must be exercised.
Output : identified classes of application outputs must be exercised.
Systems/ Procedures : interfacing systems or procedures must be invoked.
Organization and preparation of functional tests is focused on requirements, key functions,
or special test cases. In addition, systematic coverage pertaining to identified business
process flows, data fields, predefined processes, and successive processes must be considered
for testing. Before functional testing is complete, additional tests are identified and the
effective value of current tests is determined.
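The valid-input/invalid-input items above can be demonstrated with a small validation check.
The Java sketch below uses a hypothetical registration-number format (the report does not
specify one) to show how a functional test accepts valid input and rejects invalid input:

```java
// Functional-test style check: identified classes of valid input must
// be accepted and identified classes of invalid input must be rejected.
public class InputValidation {
    // Hypothetical rule: two letters, two digits, two letters, four
    // digits, e.g. "TN07AB1234".
    static boolean isValidRegistration(String reg) {
        return reg != null && reg.matches("[A-Z]{2}\\d{2}[A-Z]{2}\\d{4}");
    }

    public static void main(String[] args) {
        System.out.println(isValidRegistration("TN07AB1234")); // valid input
        System.out.println(isValidRegistration("1234"));       // invalid input
    }
}
```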
System Test:
System testing ensures that the entire integrated software system meets requirements. It tests
a configuration to ensure known and predictable results. An example of system testing is the
configuration oriented system integration test. System testing is based on process
descriptions and flows, emphasizing pre-driven process links and integration points.
White Box Testing:
White box testing is testing in which the software tester has knowledge of the inner
workings, structure, and language of the software, or at least its purpose. It is used to
test areas that cannot be reached from a black box level.
Black Box Testing:
Black box testing is testing the software without any knowledge of the inner workings,
structure, or language of the module being tested. Black box tests, like most other kinds of
tests, must be written from a definitive source document, such as a specification or
requirements document. It is testing in which the software under test is treated as a black
box: you cannot "see" into it. The test provides inputs and responds to outputs without
considering how the software works.
7. CONCLUSION

Effectiveness, efficiency, and reliability are the key aspects that make this web-based
vehicle insurance management system very useful for the vehicle showroom business. The
proposed project is flexible enough to accommodate new modules and features as per user
requirements in the future. It can also be integrated with other systems, such as a vehicle
tracking system, a vehicle information management system, a vehicle registration system, etc.
8. APPENDICES

Source Code

Screenshots
