
CONTENTS

1. INTRODUCTION
2. AR TOOLKIT
3. ADOBE FLASH BUILDER 4.0
4. RANSAC ALGORITHM
5. DATA FLOW DIAGRAMS
6. TOOLS AND PLATFORM
7. MODULE DESCRIPTION
8. LIST OF OUTPUT
9. CONCLUSION
10. REFERENCES

INTRODUCTION
The Augmented Reality in Classroom Learning application can be used by students,
teachers, or anyone else who does not want to rely on traditional approaches to
convey a message and wants to keep pace with the technological advancement that
is constantly making life easier and more comfortable.

A user just needs to show the pattern/marker in front of the webcam, and the message
he or she wants to convey will be displayed on the screen in the form of graphics or
some other form, such as text.
Our project on marker detection in Augmented Reality will identify a given pattern
and use it to display information, but its applications are much broader; some of
them are given below:
Making learning more interactive
Making teaching easier for teachers
New dimensions lead to a better understanding of concepts

AR Toolkit:

ARToolKit is a computer tracking library for the creation of strong augmented reality
applications that overlay virtual imagery on the real world. To do this, it uses video
tracking capabilities that calculate the real camera position and orientation relative
to square physical markers in real time. Once the real camera position is known, a
virtual camera can be positioned at the same point and 3D computer graphics models
can be drawn exactly overlaid on the real marker. ARToolKit thus solves two of the
key problems in Augmented Reality: viewpoint tracking and virtual object interaction.
ARToolKit was originally developed in 1999 by Hirokazu Kato of the Nara Institute of
Science and Technology and was released by the University of Washington HIT Lab.
It is currently maintained as an open source project hosted on SourceForge, with
commercial licenses available from ARToolWorks. ARToolKit is a very widely used
AR tracking library, with over 160,000 downloads since 2004.
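
The pose such a tracker reports is a rigid transform from marker coordinates to camera
coordinates; once it is known, overlaying a virtual model amounts to transforming the
model's points by that pose and projecting them with the camera intrinsics. The sketch
below illustrates only this idea; the function name, the example pose and the intrinsic
values are hypothetical, and it does not use ARToolKit's actual API.

import numpy as np

def project_model(model_points, marker_pose, fx, fy, cx, cy):
    """Project 3D points given in marker coordinates into pixel coordinates,
    using a 3x4 marker-to-camera pose [R|t] (millimetres here) and pinhole
    intrinsics (focal lengths fx, fy and principal point cx, cy)."""
    pixels = []
    for p in model_points:
        # Move the point from marker space into camera space.
        x, y, z = marker_pose @ np.append(p, 1.0)
        # Pinhole projection: divide by depth, then apply the intrinsics.
        pixels.append((fx * x / z + cx, fy * y / z + cy))
    return pixels

# Example: the four corners of a square hovering 50 mm above a marker that the
# tracker reports 300 mm straight in front of the camera (identity rotation).
pose = np.hstack([np.eye(3), [[0.0], [0.0], [300.0]]])
square = [(-25, -25, 50), (25, -25, 50), (25, 25, 50), (-25, 25, 50)]
print(project_model(square, pose, fx=800, fy=800, cx=320, cy=240))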

Adobe Flash Builder 4.0:


Adobe Flash Builder (previously known as Adobe Flex Builder) is an integrated
development environment (IDE) built on the Eclipse platform that speeds
development of rich Internet applications (RIAs) and cross-platform desktop
applications, particularly for the Adobe Flash platform.
Adobe Flash Builder 4 is available in three editions: Standard, Premium and
Educational. The package is available free of charge for non-commercial use by
students and unemployed developers.
Adobe Flash Builder offers built-in code editors for MXML and ActionScript and
a WYSIWYG editor for modifying MXML applications. Adobe Flash Builder
includes an interactive debugger allowing developers to step through code
execution while inspecting variables and watching expressions. The profiling view
displays statistical information about memory use in addition to function call
execution time.
Prior to version 4, this product was known as Flex Builder. The name change is
meant to signify its connection to other products in the Adobe Flash Platform and
to create a clear distinction between the Flex SDK and the IDE.

FLAR Manager:
FLARToolKit is the library we will use to implement AR in our project. It was
created by Saqoosha and is based on NyARToolkit 2.0.0. It is available as open
source under the GNU General Public License.

Papervision3D:
Papervision3D is the library that specifically deals with the augmentation part of the
entire marker detection process. It is a collection of tools and methods that define the
virtual objects mapped onto the planar marker surface, giving the impression of a 3D
object and thus augmenting the real-life marker with an object of particular significance.

Blender:
Blender is a free and open-source 3D computer graphics software product used for
creating animated films, visual effects, interactive 3D applications or video games.
Blender's features include 3D modeling, UV unwrapping, texturing, rigging and
skinning, fluid and smoke simulation, particle simulation, animating, rendering,
video editing and compositing.

RANSAC Algorithm:
RANSAC is an abbreviation for RANdom SAmple Consensus. It is an algorithm for
finding groups of inliers, i.e. points that can be fitted to a line.
The RANSAC algorithm in short (a simplified sketch follows the steps):
1. Randomly choose 2 points from the same region, whose orientations are
compatible with the line joining them, to hypothesize a line.
2. Count the number of points supporting each line. To be considered part of a
line, a point must lie close to it and have an orientation compatible with the line.
3. Steps 1 and 2 are repeated about 25 times before a segment is found.
4. Lines with enough support (at least 4 segments) are considered detected lines.
Steps 1 to 3 are repeated until all such lines have been found.
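
As an illustration of these steps, here is a minimal sketch of RANSAC line fitting. It
keeps only the distance test (the orientation-compatibility check from steps 1 and 2 is
omitted for brevity), and the thresholds are assumed values, so it is a conceptual
outline rather than the exact procedure used by the marker detector.

import math
import random

def ransac_line(points, iterations=25, distance_threshold=2.0, min_support=4):
    """Return the best-supported line as ((x1, y1), (x2, y2), inliers), or None."""
    best = None
    for _ in range(iterations):
        # 1. Hypothesize a line from two randomly chosen points.
        (x1, y1), (x2, y2) = random.sample(points, 2)
        length = math.hypot(x2 - x1, y2 - y1)
        if length == 0:
            continue
        # 2. Count the points supporting this line (those lying close to it).
        inliers = [(px, py) for (px, py) in points
                   if abs((x2 - x1) * (y1 - py) - (x1 - px) * (y2 - y1)) / length
                   <= distance_threshold]
        # 3. Keep the best hypothesis seen over the ~25 random trials.
        if best is None or len(inliers) > len(best[2]):
            best = ((x1, y1), (x2, y2), inliers)
    # 4. Only a line with enough supporting points counts as detected.
    return best if best and len(best[2]) >= min_support else None

# Example: noisy points roughly along y = x, plus two outliers.
pts = [(i, i + random.uniform(-0.5, 0.5)) for i in range(20)] + [(3, 15), (17, 2)]
print(ransac_line(pts))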

DATA FLOW DIAGRAMS
LEVEL 0
The first level of the data flow, level 0, gives the basic outline of the system. The
marker data entered by the user flows into the single process marked 0, Augmented
Reality on Solar System, which the administrator controls and from which the reports
on the entire process are generated. This diagram simply represents the basic
operation performed by the system at this initial level.
LEVEL 1
Level 1 of the data flow diagram explains in detail the system that was marked as 0
in the previous level. At this level the administrator first enters the Admin module,
which passes the marker details to the marker recognition process; the recognised
marker then drives the reports module, which produces the outputs: the 3D image,
the audio playback and the Z+/Z- movements.
LEVEL 2
Level 2 provides a clearer explanation of the admin module. At this level the
administrator first enters the marker data, can update an existing marker, and then
associates a 3D image and an audio file with it; the marker, image and audio records
are stored for later recognition.

DATA FLOW DIAGRAMS

ZERO(0) - LEVEL DIAGRAM

[Diagram: the Marker entity enters the data into process 0, Augmented Reality on
Solar System; the Administrator administrates the entire process and receives the
generated process reports.]

First Level DFD

[Diagram: the Administrator interacts with process 1.0, Admin Module (admin info,
permissions); marker details flow to process 2.0, Marker Recognition, which the
Marker entity feeds; the recognised marker drives process 3.0, Reports Module,
which generates the reports: 3D image, audio play and Z+/Z- movement.]

2nd Level DFD for Admin Module

[Diagram: the Administrator works through process 1.0, Admin Module, which
decomposes into sub-processes 1.1 Data Entry of Marker, 1.2 Update Marker,
1.3 3D Image and 1.4 Entry for Audio File, storing the Marker, Image and Audio
data respectively.]

ER- Diagram for Admin

[Diagram: the Admin entity adds Marker Details (Mid, Name, id, Description,
Container, Image, Audio); Marker Details embed an Audio entity (Marker id, Audio
File, Description, Z+/Z-) and have a 3D Image entity (Iid, Iname, Description, Mid,
Container, Movement); Marker Detection searches the marker against the stored
3D images.]


HARDWARE AND SOFTWARE REQUIREMENTS


HARDWARE REQUIREMENTS
(i) Minimum 1 GB RAM, Pentium 4/Core 2 Duo processor, 80 GB hard disk
(ii) Support for a printer (dot-matrix/DeskJet/Inkjet etc., any will do), i.e. the
appropriate drivers are installed and the printer is connected. A printer will be
required for printing markers and reports.

SOFTWARE REQUIREMENTS
Operating System Used : Windows XP
Language Used         : ActionScript
S/w Used              : Flash Builder 4.0, FLAR Toolkit, Pv3D, Flash Player 10.2, Blender

At least 1 GB RAM and 2 GB of space on the hard disk will be required for running
the application.

MODULE DESCRIPTION
This application will be a window-based, self-contained and independent software
product.

SYSTEM INTERFACES
Adobe Flash Player
Web Cam

USER INTERFACES
The design or layout of every form will be clear and very interactive for the user.
When the user opens the application, the Flash window will appear.
In that window, the image being captured by the webcam will be displayed.
The user shows the marker in front of the camera lens.
If the marker is a valid one, i.e. its pattern exists in the database, the marker is
said to be detected.
In the Flash window we will then see the identified pattern being augmented by a
virtual 3D object.
The user can now interact with that 3D object via the marker and thus get a better
understanding of the concept being depicted.
The user can also make use of two or more virtual objects and make them interact
with one another, creating new concepts and a better understanding. (A simplified
sketch of this flow is given after this list.)
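
As a rough illustration of the flow above, the sketch below mimics the per-frame
decision the application makes. The real project is an ActionScript/Flash application
built on FLARToolKit and Papervision3D; the pattern database and the stubbed
detection step here are hypothetical placeholders used only to show the logic, not
real library calls.

# Hypothetical outline of the application's per-frame logic (not the actual
# ActionScript/FLARToolKit code).
PATTERN_DATABASE = {"solar_system": "sun and planets model", "atom": "atom model"}

def detect_marker(frame):
    """Placeholder for the marker-detection step: return the id of the pattern
    visible in the frame, or None if no known marker is shown."""
    return frame.get("visible_marker")

def augment(frame):
    marker_id = detect_marker(frame)
    if marker_id in PATTERN_DATABASE:
        # The marker is valid (its pattern exists in the database), so the
        # associated virtual 3D object is drawn over the camera image.
        return "camera image + " + PATTERN_DATABASE[marker_id]
    return "camera image only"

print(augment({"visible_marker": "solar_system"}))  # valid marker -> augmented view
print(augment({"visible_marker": None}))            # no marker -> plain video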

LIST OF OUTPUT


CONCLUSION
The document aims at defining the overall software requirements for
AUGMENTED REALITY IN CLASSROOM LEARNING. Efforts have been made
to define the requirements exhaustively and accurately. The final product will have
only the features/functionalities mentioned in this document, and no assumptions
about additional features should be made by any of the parties involved in
developing, testing or implementing the software. In case some additional features
are required, a formal change request will need to be raised and subsequently a new
release of this document or product will be produced.
This specification document describes the functionalities that will be provided by
the software application Augmented Reality in Classroom Learning. It also states
the various constraints by which the system will abide. The intended audience for
this document is the development team, the testing team, analysts and the end users
of the product.
The software product Augmented Reality in Classroom Learning is a Flash
application that will be used for recognising patterns in the marker and then
augmenting them with virtual reality. The Software Requirements Specification
(SRS) captures all the requirements, needs and functionalities in a single document.
The Augmented Reality in Classroom Learning application provides its user a
graphical extravaganza for a simple pattern shown to the camera. The rest of the
SRS document describes the various system requirements, interfaces, features and
functionalities in detail.

REFERENCES
[1] S. Fronz, X. Zhang, and N. Navab. Visual Marker Detection and Decoding in AR
Systems. In Proc. of the IEEE International Symposium on Mixed and Augmented
Reality (ISMAR'02).
[2] ARToolKit. www.hitl.washington.edu/research/shared space/download/.
[3] Martin Hirzer, Graz University of Technology. Marker Detection for Augmented
Reality.
[4] ARVIKA. http://www.arvika.de/www/index.htm.
[5] D. Comaniciu, V. Ramesh, and P. Meer. Real-time tracking of non-rigid objects
using mean shift. In Proc. IEEE Conf. on Computer Vision and Pattern Recognition,
2000.
[6] M. Appel and N. Navab. Registration of technical drawings and calibrated
images for industrial augmented reality. In IEEE Workshop on Applications of
Computer Vision, 2000.

