
UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

APPLE INC.,
Petitioner
v.
IMMERSION CORPORATION,
Patent Owner
U.S. Patent No. 8,659,571
Filing Date: February 21, 2013
Issue Date: February 25, 2014
Title: Interactivity Model for Shared Feedback on Mobile Devices

Inter Partes Review No.: (Unassigned)

PETITION FOR INTER PARTES REVIEW OF U.S. PATENT NO. 8,659,571


UNDER 35 U.S.C. §§ 311-319 AND 37 C.F.R. §§ 42.1-42.100, ET SEQ.

TABLE OF CONTENTS

I. COMPLIANCE WITH FORMAL REQUIREMENTS .................................1
   A. Mandatory Notices Under 37 C.F.R. § 42.8(b)(1)-(4) .......................1
      1. Real Party-In-Interest .................................................1
      2. Related Matters ..........................................................1
      3. Lead and Backup Counsel .........................................2
      4. Service Information ....................................................2
   B. Proof of Service on the Patent Owner ..................................2
   C. Power of Attorney ................................................................2
   D. Standing ................................................................................3
   E. Fees .......................................................................................3
II. STATEMENT OF PRECISE RELIEF REQUESTED ..................................3
III. FULL STATEMENT OF REASONS FOR REQUESTED RELIEF ............4
   A. Technology Background ......................................................4
   B. Summary of the '571 Patent .................................................4
   C. The '571 Patent Prosecution History ...................................5
   D. Person of Ordinary Skill in the Art ......................................6
   E. Apple Products Accused of Infringing the '571 Patent .......6
   F. Domestic Industry Products Alleged to Practice the '571 Patent ........7
   G. Claim Construction ..............................................................7
      1. "gesture signal" ..........................................................8
      2. "dynamic interaction parameter" ...............................9
      3. "vector signal" ............................................................9
      4. "on screen signal" ....................................................10
      5. "generating a dynamic interaction parameter using a physical model" ........10
      6. "generating a dynamic interaction parameter using an animation" ........10
      7. "module" ..................................................................11
   H. Ground 1: Claims 1-7, 12-18, and 23-29 are Obvious Under 35 U.S.C. § 103(a) (pre-AIA) in Light of Burrough ...............................12
   I. Ground 2: Claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29 are Obvious Under 35 U.S.C. § 103(a) (pre-AIA) in Light of Rosenberg '373 ...................................................................32
   J. Ground 3: Claims 3, 14 and 25 are Obvious Under 35 U.S.C. § 103(a) (pre-AIA) in Light of Rosenberg '373 and Rosenberg '846 .....................................................................53
IV. THE GROUNDS OF INVALIDITY ARE NOT REDUNDANT ...............56
V. CONCLUSION .............................................................................................57

EXHIBIT LIST

Exhibit No.  Description

1001  U.S. Patent No. 8,659,571.

1002  Declaration of expert Dr. Patrick Baudisch ("Baudisch Decl.").

1003  File history of U.S. Patent No. 8,659,571.

1004  U.S. Patent No. 5,734,373 to Rosenberg et al. ("Rosenberg '373").

1005  U.S. Patent Application No. 2010/0156818 to Burrough et al. ("Burrough").

1006  U.S. Patent No. 6,429,846 to Rosenberg et al. ("Rosenberg '846").

1007  File history of U.S. Patent App. No. 13/472,698 (the "'698 application").

1008  Excerpts from Barron's Dictionary of Mathematics Terms, 3rd ed. (2009).

1009  Excerpts from The American Heritage Dictionary of the English Language, 5th ed. (2011).

1010  Patent Owner Immersion's disclosure of preliminary claim constructions (Jun. 3, 2016).

1011  Patent Owner Immersion's claim chart regarding alleged infringement of the '571 patent by certain Apple iPhone products (Exhibit 5 to Immersion's supplemental response to Apple's interrogatory no. 19 in the ITC investigation).

1012  Patent Owner Immersion's second claim chart regarding alleged technical domestic industry for the '571 patent (Exhibit 51 to Immersion's ITC Complaint).

Apple Inc. ("Apple" or "Petitioner") hereby petitions for inter partes review of U.S. Patent No. 8,659,571 (the "'571 patent"). Ex. 1001. The '571 patent generally relates to providing dynamic haptic feedback in response to signals representing user gestures on a user interface device, such as a touchscreen or joystick. The claims of the '571 patent challenged in this Petition are invalid in view of Apple's patent application, U.S. Patent Application No. 2010/0156818 to Burrough et al. ("Burrough"). Most of the challenged claims also are invalid in view of Patent Owner Immersion's ("Patent Owner" or "Immersion") earlier patent, U.S. Patent No. 5,734,373 to Rosenberg et al. ("Rosenberg '373"), which issued over a decade before the '571 patent was filed. The remaining claims, directed to "on-screen" gesture signals, are invalid in view of Rosenberg '373 in combination with another of Immersion's patents from the same lead inventor, U.S. Patent No. 6,429,846 to Rosenberg et al. ("Rosenberg '846").
I. COMPLIANCE WITH FORMAL REQUIREMENTS

A. Mandatory Notices Under 37 C.F.R. § 42.8(b)(1)-(4)

1. Real Party-In-Interest

Apple is the real party-in-interest.


2. Related Matters

The '571 patent is subject to the following actions: 1) Certain Mobile Electronic Devices Incorporating Haptics (Including Smartphones and Smartwatches) and Components Thereof, U.S. International Trade Commission Investigation No. 337-TA-990; and 2) Immersion Corporation v. Apple Inc., et al., Case No. 1:16-cv-00077 (D. Del.).
3. Lead and Backup Counsel

Lead counsel is James M. Heintz, Reg. No. 41,828, of DLA Piper LLP (US), 11911 Freedom Drive, Suite 300, Reston, VA 20190; Apple-ImmersionIPRs@dlapiper.com; 703-773-4148 (phone); 703-773-5200 (fax). Backup counsel is Robert Buergi, Reg. No. 58,125, of DLA Piper LLP (US), 2000 University Ave, East Palo Alto, CA 94330; robert.buergi@dlapiper.com; 650-833-2407 (phone); 650-687-1144 (fax).
4. Service Information

Service information for lead and back-up counsel is provided in the designation of lead and back-up counsel above.
B. Proof of Service on the Patent Owner

As identified in the attached Certificate of Service, a copy of this Petition in its entirety is being served on the Patent Owner's attorney of record at the address listed in the USPTO's records by overnight courier pursuant to 37 C.F.R. § 42.6.
C. Power of Attorney

Powers of attorney are being filed with designation of counsel in accordance with 37 C.F.R. § 42.10(b).

D. Standing

In accordance with 37 C.F.R. § 42.104(a), Petitioner certifies that the '571 patent is available for inter partes review and that Petitioner is not barred or estopped from requesting an inter partes review challenging the patent claims on the grounds identified in this Petition.
E. Fees

The undersigned authorizes the Director to charge the fee specified by 37 C.F.R. § 42.15(a) and any additional fees that might be due in connection with this Petition to Deposit Account No. 50-1442.
II. STATEMENT OF PRECISE RELIEF REQUESTED

In accordance with 35 U.S.C. § 311, Petitioner requests cancelation of claims 1-7, 12-18, and 23-29 of the '571 patent in view of the following grounds:

A. Claims 1-7, 12-18, and 23-29 are obvious under 35 U.S.C. § 103(a) (pre-AIA) in light of U.S. Patent Application No. 2010/0156818 to Burrough et al. ("Burrough").

B. Claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29 are obvious under 35 U.S.C. § 103(a) (pre-AIA) in light of U.S. Patent No. 5,734,373 to Rosenberg et al. ("Rosenberg '373").

C. Claims 3, 14, and 25 are obvious under 35 U.S.C. § 103(a) (pre-AIA) in light of Rosenberg '373 and U.S. Patent No. 6,429,846 to Rosenberg et al. ("Rosenberg '846").

III. FULL STATEMENT OF REASONS FOR REQUESTED RELIEF

A. Technology Background

Haptics generally refers to the use of the sense of touch, especially in computer systems. As the '571 patent explains, haptic feedback such as "vibration effects, can provide cues that enhance and simplify the user interface." Ex. 1001 at 1:22-33. Such effects may be useful in providing cues to users of electronic devices "to alert the user to specific events" or to "provide realistic feedback to create greater sensory immersion within a simulated or virtual environment." Id. In electronic devices, vibration effects may be generated using an actuator, a type of motor that converts electricity into motion. Id. at 1:34-41.
B. Summary of the '571 Patent

The '571 patent is titled "Interactivity Model For Shared Feedback On Mobile Devices." Ex. 1001 at cover. The '571 patent states that "[t]raditional architectures that provide haptic feedback only with triggered effects are available," and they "must be carefully designed to make sure the timing of the haptic feedback is correlated to user initiated gestures or system animations." Id. at 1:49-52. However, because these user gestures and system animations have variable timing, "the correlation to haptic feedback may be static and inconsistent and therefore less compelling to the user." Id. at 1:53-56. Further, device sensor information is typically not "used in combination with gestures to produce haptic feedback." Id. at 1:56-57. The '571 patent states that, therefore, "there is a need for an improved system of providing a dynamic haptic effect" that includes multiple gesture signals and device sensor signals. Id. at 1:58-60. To solve these problems, the '571 patent discloses a system for providing dynamic haptic effects based upon gesture signals and/or device sensor signals. Id. at 1:66-2:5. A dynamic haptic effect refers to a haptic effect that evolves over time as it responds to one or more input parameters. Id. at 2:65-67.
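To illustrate the concept (a minimal sketch under our own assumptions, not code from the '571 patent or the record; all names are hypothetical), a dynamic effect can be modeled as a function that is re-evaluated as its input parameter changes, rather than as a fixed, pre-authored waveform:

# Hypothetical illustration of a "dynamic" haptic effect: the output
# magnitude is recomputed from a time-varying input parameter rather
# than played back as a fixed, pre-triggered waveform.

def dynamic_effect_magnitude(input_param: float, gain: float = 1.0) -> float:
    """Return an effect magnitude that tracks the input parameter."""
    return gain * input_param

# As the input parameter (e.g., a gesture-derived value) changes over
# time, the effect magnitude evolves with it.
for t, param in enumerate([0.1, 0.4, 0.9, 0.5]):
    print(f"t={t}: magnitude={dynamic_effect_magnitude(param):.2f}")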
C. The '571 Patent Prosecution History

The claims of the '571 patent originally appeared in a previously filed application, U.S. Patent App. No. 13/472,698 (the "'698 application"). Ex. 1007 at 815-820 (Aug. 24, 2012 Amendment). The Examiner, Grant Sitta, rejected these claims as anticipated or obvious in view of another Immersion patent application, U.S. Patent Pub. 2010/0017489 ("Birnbaum"). Id. at 842-853 (Nov. 28, 2012 Rejection). Immersion eventually abandoned the '698 application after two more rejections, each finding amended claims obvious. Id. at 932-945 (Jun. 8, 2013 Rejection); 995-1013 (Dec. 24, 2013 Rejection); 1044 (Jun. 30, 2014 Notice of Abandonment).

While the '698 application was pending, Immersion filed the '571 patent application with identical claims to those that stood rejected in the '698 application. Compare Ex. 1003 at IMMR-ITC-00001020-25 with Ex. 1007 at 815-820. The '571 patent application was examined by a different Examiner, Stephen Sherman. Ex. 1001 at cover. However, Immersion did not inform Mr. Sherman that identical claims had been rejected in the '698 application. See generally Ex. 1003. Because the claims challenged in this Petition had been found to be anticipated or obvious in view of Birnbaum, they should never have been granted. And, although this Petition does not rely upon Birnbaum, this Petition demonstrates that the challenged claims are also obvious in view of three other references: Burrough, Rosenberg '373, and Rosenberg '846.
D. Person of Ordinary Skill in the Art

A person of ordinary skill in the art ("POSITA") at the time of the alleged invention of the '571 patent would have had a Bachelor's degree in computer science, electrical engineering, or a comparable field of study, plus approximately two to three years of professional experience with software engineering, haptics programming, or other relevant industry experience. Additional graduate education could substitute for professional experience, and significant experience in the field could substitute for formal education. Ex. 1002 ¶ 38.
E. Apple Products Accused of Infringing the '571 Patent

In the ITC investigation, Immersion has alleged that claims 1-7, 12-18, and 23-29 of the '571 patent are practiced by certain Apple iPhone products. Ex. 1011. To support these allegations, Immersion provided claim charts purporting to show how Apple's iPhone 6s and 6s Plus products allegedly practice these claims. Id.

F. Domestic Industry Products Alleged to Practice the '571 Patent

A patent owner is required to show a technical domestic industry in an ITC investigation. To do so, the patent owner must show that it or one of its licensees practices at least one claim of an asserted patent.

In the ITC investigation, Immersion has alleged that at least claims 12 and 14 of the '571 patent are practiced by mobile devices that use Immersion's TouchSense software. Ex. 1012 at 2, 71. To support this allegation, Immersion provided two claim charts purporting to show how its TouchSense software on a mobile device allegedly practices these claims. Ex. 1012 (Immersion's technical domestic industry claim charts).
G. Claim Construction

In accordance with 37 C.F.R. § 42.104(b)(3), Petitioner provides the following statement regarding construction of the '571 patent claims. A claim subject to inter partes review receives the broadest reasonable interpretation ("BRI") in light of the specification. 37 C.F.R. § 42.100(b).

In the ITC investigation referenced above, Immersion has proposed a claim construction for one claim term ("dynamic interaction parameter") of the '571 patent. Ex. 1010 at 2. Immersion also has submitted to the ITC technical domestic industry claim charts showing how Immersion believes that certain claims of the '571 patent encompass aspects of Immersion's technology, and claim charts showing how Immersion believes that the '571 patent's claims allegedly encompass certain of Petitioner Apple's products, as described above. Exs. 1011, 1012. For the purposes of this proceeding, Petitioner respectfully requests that Immersion be held to constructions at least as broad as those set forth by Immersion in these claim charts and in its proposed claim constructions, as discussed below.
1. "gesture signal"

The term "gesture signal" (claims 1-7, 12-18, 23-29) should be broadly construed to encompass a signal indicating user interaction with a user interface device. The '571 patent describes a gesture as "any movement of the body that conveys meaning or user intent." Ex. 1001 at 3:35-36. The patent further describes a gesture as "any form of hand movement recognized by a device having an accelerometer, gyroscope, or other motion sensor, and converted to electronic signals." Id. at 3:56-59. The specification describes various exemplary user interface devices that produce gesture signals, including "a touch sensitive surface, or any other type of user interface such as a mouse, touchpad, minijoystick, scroll wheel, trackball, game pads or game controllers." Id. at 4:59-63. Thus, in the context of the specification, a gesture signal is described as a signal indicating user interaction with a user interface device. Petitioner submits that the BRI of "gesture signal" should encompass these descriptions.

Immersion may argue that a "gesture signal" has a special meaning limited to signals resulting from the interaction of fingers on touchscreens when performing finger movements such as swipes. However, this argument must be rejected because limiting "gesture signal" in this manner is contrary to the broad definition of "gesture" discussed above, and reads out embodiments involving systems that do not include touchscreens but instead use devices such as minijoysticks, mouses, and trackballs as user input devices, as discussed at 4:59-63.
2. "dynamic interaction parameter"

In the ITC Investigation, Immersion has proposed that "dynamic interaction parameter" (claims 1, 4-7, 12, 15-18, 23, 26-29) be construed to mean "an interaction parameter that changes over time or reacts in real time." Ex. 1010 at 2. Although Petitioner disagrees with this construction, Petitioner submits that Immersion should be held to a construction at least as broad as the construction it proposed in the ITC investigation.
3. "vector signal"

The term "vector signal" (claims 2, 13, 24) should be construed to encompass a signal that includes both a magnitude and direction. Ex. 1008 (Barron's Dictionary of Mathematics) at 3 ("vector: a vector is a quantity that has both magnitude and direction."); Ex. 1009 (American Heritage Dictionary) at 3 ("vector 1. Mathematics a. a quantity, such as velocity, completely specified by a magnitude and a direction").


4. "on screen signal"

Based on Immersion's public contentions, Immersion should be held to a construction of this claim limitation (claims 3, 14, 25) that encompasses a signal generated based on interactions with a touch screen. See Ex. 1011 at 78-79 (Immersion contending that gesture signals generated when a user touches the touchscreen of accused Apple products satisfy this limitation).
5. "generating a dynamic interaction parameter using a physical model"

The '571 patent describes a "physical model" as a mathematical model related to a real-world physical effect "such as gravity, acceleration, friction or inertia." Ex. 1001 at 12:38-44. Thus, Petitioner submits that this limitation (claims 5, 16, 27) should encompass generating a dynamic interaction parameter based on such a mathematical model.

Moreover, based on Immersion's public contentions, Immersion should be held to a construction that also encompasses generating a dynamic interaction parameter using a model of properties of a human finger on a touchscreen. See Ex. 1011 at 78-79 (contending that a model of properties of the human finger, including "expected dimensions, behavior, average force of a touch, and electrical properties" on a touchscreen satisfies this limitation).
6. "generating a dynamic interaction parameter using an animation"

Based on Immersion's public contentions, Immersion should be held to a construction of this limitation (claims 6, 17, 28) that encompasses generating a dynamic interaction parameter that is coordinated with an animation. For example, Immersion contends that the accused Apple products generate a dynamic interaction parameter corresponding to the amount of pressure exerted on the touch screen. Ex. 1011 at 38-39. Immersion contends that the first gesture signal is received when a user presses lightly on, e.g., an email (referred to as a "Peek" gesture) and the second gesture signal is received when a user presses deeply to "pop" into the email (referred to as a "Pop" gesture). Id. at 4-6, 22-24. Immersion further contends that the use of animations relevant to "Peek" and "Pop" satisfies this limitation. Ex. 1011 at 83.
7. "module"

The term "module" (claims 12, 15-18) should be construed to encompass a set of instructions executed by a processor. "Module" is used in the larger term "drive module" twice in the '571 patent specification. Ex. 1001 at 4:33; Fig. 1. The patent describes "drive module 22" as instructions that, when executed by a processor 12, generate drive signals for actuator 18. Id. at 4:33-35; see also Fig. 1 (depicting drive module 22 as a part of memory 20). Thus, while "module" is a broad term that could include other things, in light of the specification it must be broad enough to include a set of instructions executed by a processor.


Because the BRI standard is different from that used in district court litigation, see In re Am. Acad. of Sci. Tech. Ctr., 367 F.3d 1359, 1364, 1369 (Fed. Cir. 2004), the interpretation of the claims presented either implicitly or explicitly herein should not be viewed as constituting Petitioner's own interpretation and/or construction of such claims for the purposes of the underlying litigation. Instead, such constructions in this proceeding should be viewed only as constituting an interpretation of the claims under the broadest reasonable construction standard and/or under Immersion's infringement contentions and technical domestic industry contentions in the ITC Investigation.
H. Ground 1: Claims 1-7, 12-18, and 23-29 are Obvious Under 35 U.S.C. § 103(a) (pre-AIA) in Light of Burrough.

Claims 1-7, 12-18, and 23-29 are rendered obvious by U.S. Patent Application No. 2010/0156818 to Burrough et al. ("Burrough"), assigned to Petitioner Apple. Burrough was published on Jun. 24, 2010, more than one year before the earliest possible priority date of the '571 patent (Aug. 23, 2012), and is therefore prior art to the '571 patent under 35 U.S.C. § 102(b) (pre-AIA). Ex. 1005 at cover.

Burrough discloses providing "multi-touch haptic feedback" on a device with a "multi-touch touch based input device," such as a touch screen. Id. at [0010]; [0017]. An example of such a device is shown in Fig. 1B:

[Figure: Burrough Fig. 1B, depicting a multi-touch device.]

Id. at Fig. 1B. Burrough discloses that the touch screen can recognize "at least two substantially simultaneously occurring gestures" using "at least two different fingers or other object." Id. at [0035]. Such gestures include gestures "associated with zooming, panning, scrolling, rotating, enlarging and/or the like." Id. at [0017]. Burrough further discloses providing dynamic haptic feedback in response to gestures on the touch screen. Id. at [0051]. For example, "vibrations can be adjusted based on a change in touch characteristics (i.e., speed, direction, location, etc.)." Id. at [0051].
In one embodiment, Burrough discloses a multi-touch zoom gesture, in which an image can be zoomed in or out by moving two fingers apart or together, respectively. Id. at [0080]; Fig. 11. Burrough discloses that the amount of zooming and the associated haptic effect varies according to "the distance between the two [fingers]." Id. at [0081]. For example, the haptic effect can be, "for example, faster (or slower) or more intense (or less intense) vibration as the distance between the two fingers increases." Id. at [0080].

As illustrated in Figs. 12C and 12D, the haptic effect associated with a multi-touch zoom gesture can be a function of the distance between the two fingers:1

[Figure: Burrough Figs. 12C and 12D, showing the haptic response H(d) at each finger as a function of the distance d between the fingers.]

Id. at Figs. 12C, 12D; [0082]. In these figures, "the magnitude of the haptic response H(d) at each finger is denoted by the size of the circle for each response." In this case, as the distance between the two fingers increases, the haptic effect H for each finger increases with distance d. Id.

1 In its technical domestic industry contentions in the ITC Investigation, Immersion likewise identifies haptic effects associated with a multi-touch zoom gesture as practicing claims 12 and 14. See, e.g., Ex. 1012 at 35-36, 45-47.


As discussed below, Burrough discloses and/or renders obvious all of the
limitations of the challenged claims.
Claim Language: 1.pre. A method of producing a haptic effect comprising:

Burrough: Burrough discloses that the invention relates, "in one embodiment, to an apparatus and method for providing multi-touch haptic feedback." Ex. 1005 at [0010]; [0003]. Burrough further discloses that the described embodiments "generally pertain to gestures and methods of implementing gestures with associated physical feedback with touch sensitive devices." Id. at [0035].

See also Ex. 1002 ¶¶ 55-56.

Claim Language: 1.a. receiving a first gesture signal;

Burrough: Burrough discloses a touch sensitive surface "arranged to receive different types of user touch events each being characterized by an amount of pressure applied on the touch sensitive surface by a user." Ex. 1005 at [0016].

Burrough discloses gestures on the touch sensitive screen, such as "gestures [] associated with zooming, panning, scrolling, rotating, enlarging, floating controls, zooming targets, paging, inertia, keyboarding, wheeling, and/or the like." Id. at [0017].

For example, Burrough discloses a "zoom gesture method 1100" where "the presence of at least a first finger and a second finger are detected on a touch sensitive surface of the surface 126 at about the same time." Id. at [0079]; see also Figs. 11, 12A-H.

Burrough further discloses that a "touch event T is initiated each time an object, such as a user's finger, is placed on upper surface 126 over, or in close proximity to, sensing region 128." "In response to the pressure applied by the user during touch event T, sensing device 124 generates touch signal S1" (and any other signal consistent with a multi-touch event) (gesture signal). Id. at [0046].

See also Ex. 1002 ¶¶ 57-60.

Claim Language: 1.b. receiving a second gesture signal;

Burrough: See limitation 1.a.

As established above, Burrough discloses a multi-touch zoom gesture in which first and second fingers are detected on the touch screen at the same time. Burrough further discloses that sensing device 124 generates signals representing each touch on the touchscreen. Thus, a POSITA would understand that the sensing device generates a first gesture signal representing one of the two fingers on the touchscreen, and a second gesture signal representing the other finger on the touchscreen. Ex. 1002 ¶ 62.
Claim Language: 1.c. generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; and

Burrough: Burrough discloses that "[o]ne of the advantages of the invention lies in the fact that the relationship between a touch event or a class of touch events and corresponding haptic response can be dynamic in nature ... For example, vibrations can be adjusted based on a change in touch characteristics (i.e., speed, direction, location, etc.)." Ex. 1005 at [0051].

As one example, Burrough discloses a "zoom gesture method 1100," in which "the distance between at least the two fingers is compared ... If the distance between the two fingers increases (spread apart) at 1110, a zoom-in signal is generated at 1112, otherwise a zoom out signal is generated at block 1114." The zoom-in signal, in turn, causes the haptic devices associated with the two fingers to provide a zoom-in haptic signal at 1116. "Such a zoom in haptic signal can be, for example, faster (or slower) or more intense (or less intense) vibration as the distance between the two fingers increases." Id. at [0080]; see also [0034].

Burrough discloses that the haptic profiles for each of the fingers "relating the distance d between the two fingers corresponding to the haptic response H(d) experienced at each finger" are shown in Figs. 12A-12H. Id. at [0082].

[Figure: Burrough Figs. 12C and 12D.]

Id. at Figs. 12C-12D; see also Figs. 12A-B, E-H.

Burrough discloses that "the magnitude of the haptic response H(d) at each finger is denoted by the size of the circle for each response. In this case, as the distance between the two fingers increases, the haptic effect H for each finger increases linearly with distance d." Id. at [0082].

Burrough further discloses that "as the zoom factor increases, the haptic profile H(d) can change by, for example, the slope becoming more steep as the resolution of the underlying map increases as shown in FIG. 12G." Id.; see also Fig. 12G.

See also Ex. 1002 ¶¶ 63-70.
Thus, Burrough discloses that the haptic effect corresponding to each finger in a multi-touch zoom gesture is a function of the distance d between the two fingers. Ex. 1002 ¶ 68. For example, in one described embodiment, "the haptic effect H for each finger increases linearly with distance d." Ex. 1005 at [0082]. In other words, the magnitude of the haptic effect increases as the distance increases, and decreases as the distance decreases. Ex. 1002 ¶ 68. The function that defines the relationship between the haptic effect and the distance is referred to as a haptic profile. Ex. 1005 at [0082]. Burrough discloses generating a haptic response H(d) by applying the haptic profile corresponding to each finger to the distance d between the fingers. Ex. 1002 ¶ 68.

Burrough further describes an embodiment in which the haptic profile defining the haptic effect for each finger itself varies as a function of the zoom factor, for example, by increasing the slope as the resolution of the underlying map increases. Id. at [0082]. In other words, the rate at which the magnitude of the haptic effect may change in response to a change in distance between the two fingers can increase as the resolution of the map increases. Ex. 1002 ¶ 69.

The haptic response H(d) is a "dynamic interaction parameter" under Immersion's interpretation of that claim term, as discussed above in Section III.G.2, because it changes over time or reacts in real time based upon the user's interaction with the touchscreen. Specifically, as the user's fingers move apart, the distance between the fingers increases, and the haptic response likewise increases as a function of this distance. Similarly, as the user's fingers move together, the distance between the fingers decreases, and the haptic response likewise decreases as a function of this distance. Ex. 1002 ¶ 70.
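For illustration only, the following is a minimal sketch of a linear haptic profile of the kind described, with the slope steepening as the zoom factor grows. It is not Burrough's implementation; the function and parameter names, and the base slope constant, are hypothetical:

# Hypothetical sketch of a linear haptic profile H(d): the response
# magnitude grows with the finger-to-finger distance d, and the slope
# of the profile steepens as the zoom factor increases (as with the
# map-resolution example in Fig. 12G).

def haptic_response(d: float, zoom_factor: float, base_slope: float = 0.5) -> float:
    """Return H(d) for a linear profile whose slope scales with zoom."""
    slope = base_slope * zoom_factor
    return slope * d

# As the fingers spread apart (d grows), H(d) grows; pinching the
# fingers together shrinks it, so the output reacts to the gesture
# in real time.
for d in (1.0, 2.0, 3.0):
    print(f"d={d}: H={haptic_response(d, zoom_factor=2.0):.2f}")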
Claim Language: 1.d. applying a drive signal to a haptic output device according to the dynamic interaction parameter.

Burrough: Burrough discloses that the touch sensitive input device communicates with "an array of haptic feedback devices ... (also referred to as haptic actuators) each arranged to provide haptic feedback according to a haptic profile in response to a multi-touch event." Ex. 1005 at [0035].

Burrough further discloses "[m]icrocontroller 132 can use touch information Tinfo to query haptic data base 134 that includes a number of predetermined haptic profiles each of which describes a specific haptic response H, in terms of duration of response, type of vibrotactile response, strength of response, etc. A particular haptic profile includes a set of instructions that cause microcontroller 132 to activate at least haptic actuator 136. Haptic actuator 136, in turn, creates haptic response Hx. In this way, the response of haptic actuator 136 can be controlled in real time by microprocessor 132 by establishing the duration, strength, type of vibrotactile response Hx." Id. at [0047].

Burrough discloses that "[h]aptic actuator 300 generates force F directly proportional to voltage V applied to the haptic actuator 300" by the controller (drive signal). Id. at [0056].

See also Ex. 1002 ¶¶ 71-75.

Thus, Burrough discloses that microcontroller 132 supplies a voltage V (a drive signal) to one or more haptic actuators to generate haptic feedback in the form of vibration on the touchscreen. Ex. 1002 ¶ 74. The force of the haptic effect is directly proportional to the supplied voltage. Ex. 1005 at [0056]. Microcontroller 132 uses touch information (e.g., the location or movement of the user's fingers on the touchscreen) and a haptic profile (e.g., defining a haptic response as a function of the touch information) to control the haptic actuators. Ex. 1005 at [0047]. Thus, in the case of the zoom gesture described above, a POSITA would appreciate that haptic effects of increasing or decreasing strength would be implemented by supplying the haptic actuator with increasing or decreasing voltages, respectively. Ex. 1002 ¶ 74. In other words, Burrough discloses applying voltages (i.e., drive signals) to the haptic actuator (i.e., haptic output device) according to the change in haptic response H(d) (dynamic interaction parameter) as the user's fingers move together and apart. Id.

If the Board finds that Burrough does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation based on Burrough's disclosure above, including Burrough's disclosure that haptic actuator 136 "can be controlled in real time by microprocessor 132 by establishing the duration, strength, type of vibrotactile response Hx." Ex. 1005 at [0047]. It would have been obvious to a POSITA to generate a drive signal for the haptic actuators according to the haptic response determined in the zoom gesture algorithm, for example, to generate haptic effects that vary as a function of the distance between the user's fingers. Ex. 1002 ¶ 75. Motivation to do so comes from, for example, Burrough's disclosure of generating a "haptic effect H for each finger [that] increases linearly with distance" d, the distance between the user's fingers. Ex. 1005 at [0082].
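The proportional relationship can be illustrated with a short sketch (our own assumption-laden illustration, not Burrough's code; the actuator constant and supply limit are invented for the example):

# Hypothetical sketch: converting a computed haptic response H into a
# drive voltage for an actuator whose force is directly proportional
# to the applied voltage (F proportional to V), per the description
# quoted above.

K_FORCE_PER_VOLT = 0.8   # assumed actuator constant (illustrative)
V_MAX = 5.0              # assumed supply limit (illustrative)

def drive_voltage(haptic_response: float) -> float:
    """Map the desired response magnitude to a clamped drive voltage."""
    v = haptic_response / K_FORCE_PER_VOLT
    return min(max(v, 0.0), V_MAX)

# A larger H(d) yields a larger voltage and, proportionally, a larger
# actuator force.
print(drive_voltage(2.0))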
Claim Language: 2. The method of claim 1 wherein the first or second gesture signal comprises a vector signal.

Burrough: Burrough discloses that "vibrations can be adjusted based on a change in touch characteristics (i.e., speed, direction, location, etc.)." Ex. 1005 at [0051].

Burrough further discloses that "the corresponding haptic response H, can vary depending upon the location on surface 126 of touch event T (i.e., T(x)) [or] any finger motion (dT/dx, dT/dy)." Id. at [0051].

See also Ex. 1002 ¶¶ 76-78.

A POSITA would understand that motion of a user's finger represented by dT/dx (i.e., the change in position of the finger along the x axis) and/or dT/dy (i.e., the change in position of the finger along the y axis) would have both a magnitude (i.e., the difference between the current and previous position) and a direction (i.e., in the positive or negative direction along the axis). Ex. 1002 ¶ 78. Further, in the case of the zoom gesture discussed above, Burrough discloses a haptic effect that varies based upon whether the distance between the two fingers (magnitude) is increasing or decreasing (direction). Ex. 1005 at [0082]; Ex. 1002 ¶ 78. Thus, in each of these embodiments, Burrough discloses gesture signals that comprise vector signals. Id.
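A brief sketch may help visualize why such motion components form a vector. This is an illustrative, assumption-based example, not code from Burrough; the function name is hypothetical:

# Hypothetical sketch of a gesture signal as a vector: successive touch
# positions yield dT/dx and dT/dy components, which together carry both
# a magnitude (how far the finger moved) and a direction.
import math

def motion_vector(prev: tuple[float, float], curr: tuple[float, float]):
    dx = curr[0] - prev[0]   # dT/dx component (signed: direction on x axis)
    dy = curr[1] - prev[1]   # dT/dy component (signed: direction on y axis)
    magnitude = math.hypot(dx, dy)
    direction = math.atan2(dy, dx)  # angle in radians
    return magnitude, direction

print(motion_vector((10.0, 10.0), (13.0, 14.0)))  # magnitude 5.0, direction ~0.93 rad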
Claim Language: 3. The method of claim 1 wherein the first or second gesture signal comprises an on-screen signal.

Burrough: Burrough discloses that "the touch sensitive surface can be a touch screen, and the GUI object can be displayed on the touch screen ... As a result, when the fingers are moved apart, the zoom-in signal can be used to increase the size of the embedded features in the GUI object and when the fingers are pinched together, the zoom-out signal can be used to decrease the size of embedded features in the object." Ex. 1005 at [0081]; see also Figs. 12A-H.

See also Ex. 1002 ¶¶ 79-80.

A POSITA would understand that the signals representing the user's fingers (first and second gesture signals) are on-screen signals, because the signals represent one or more touches on the touch screen. Ex. 1002 ¶ 80.
Claim Language: 4. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.

Burrough: Burrough discloses that "[f]ollowing block 1106, the zoom gesture method 1100 proceeds to block 1108 where the distance between at least the two fingers is compared. The distance may be from finger to finger or from each finger to some other reference point as for example the centroid." Ex. 1005 at [0080].

Burrough further discloses: "For instance, as the fingers spread apart or closes together, the object zooms in or zooms out at the same time and the corresponding haptic effect will change." Id. at [0081].

See also Ex. 1002 ¶¶ 81-84.

A POSITA would appreciate that the distance between the user's fingers is calculated by a difference between the two position signals (gesture signals). Ex. 1002 ¶ 83. And, as established in connection with limitation 1.c, the haptic response H(d) (dynamic interaction parameter) is generated as a function of this distance.
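For illustration (a hypothetical sketch, not the reference's code), the difference-based distance calculation amounts to:

# Hypothetical sketch: deriving the finger-to-finger distance from the
# difference between two position (gesture) signals, as in the zoom
# gesture; the haptic response H(d) is then generated from this value.
import math

def finger_distance(p1: tuple[float, float], p2: tuple[float, float]) -> float:
    """Euclidean distance computed from the difference of two positions."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

print(finger_distance((0.0, 0.0), (30.0, 40.0)))  # 50.0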
If the Board finds that Burrough does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation. It would have been obvious to a POSITA that the distance between the user's fingers can be calculated by taking the difference between the position of the fingers. Ex. 1002 ¶ 84. Motivation to do so comes from, for example, Burrough's disclosure that touch signals may indicate the location of the finger on the touch sensitive screen. Ex. 1005 at [0042].
Claim Language: 5. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.

Burrough: Burrough discloses that multi-touch devices monitor a sensing surface for "a touch or near touch," and "when a touch occurs determines the distinct areas of contact and identifies the contacts via their geometric features and geometric arrangement. Once identified or classified, the contacts are monitored for various motions, actions or events." Ex. 1005 at [0005]; see also [0054] (describing placement of sensing regions based upon size of a hand); [0063]-[0070] (describing equation modeling pressure of finger on the touchscreen).

See also Ex. 1002 ¶¶ 85-87.

As established in connection with claim 1, Burrough discloses generating a dynamic interaction parameter using gesture signals generated by a user's fingers touching a multi-touch touchscreen. The multi-touch touch screen determines "the distinct areas of contact" and identifies the contacts based upon a physical model of a finger, such as "their geometric features and geometric arrangement." Ex. 1005 at [0005]. Thus, Burrough discloses this limitation under Immersion's interpretation of the limitation discussed above in Section III.G.5. Ex. 1002 ¶ 86.

If the Board finds that Burrough does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation in view of Burrough's disclosures. For example, it would have been obvious to a POSITA to use the geometric properties of a human finger to, for example, determine the center location of each finger when calculating the distance between two fingers. Ex. 1002 ¶ 87. Motivation to do so comes from, for example, Burrough's disclosure that the location of the user's fingers will associate or lock the fingers to a particular GUI object being displayed, which object is zoomed in or out based on the movement of the user's fingers. Ex. 1005 at [0081]; Ex. 1002 ¶ 87.
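As an illustration of the kind of geometric finger model at issue (a sketch under our own simplifying assumption that a finger contact registers as a patch of sensor cells; this is not Burrough's implementation, and the function name is hypothetical):

# Hypothetical sketch: a crude geometric "physical model" of a finger as
# a contact patch. The contact center is estimated as the centroid of
# the touched sensor cells, and that center can then feed the distance
# calculation between two fingers.

def contact_center(cells: list[tuple[float, float]]) -> tuple[float, float]:
    """Centroid of the sensor cells a finger covers (assumed model)."""
    n = len(cells)
    return (sum(x for x, _ in cells) / n, sum(y for _, y in cells) / n)

print(contact_center([(1, 1), (2, 1), (1, 2), (2, 2)]))  # (1.5, 1.5)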
Claim Language: 6. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.

Burrough: Burrough discloses that vibrations "can be mapped to animation effects occurring on display 112 (rubber band, bounce etc.)." Ex. 1005 at [0051].

Burrough also discloses that "zooming typically can occur substantially simultaneously with the motion of the objects. For instance, as the fingers spread apart or closes together, the object zooms in or zooms out at the same time and the corresponding haptic effect will change." Id. at [0081].

See also Ex. 1002 ¶¶ 88-91.

A POSITA would understand that haptic effects mapped to animation effects are haptic effects coordinated with the displayed animation. Ex. 1005 at [0051]; Ex. 1002 ¶ 90. Likewise, a POSITA would understand that haptic effects coordinated with zoom animations on the display, such as zooming in or out of a graphical object, also are haptic effects coordinated with the displayed animation. Ex. 1005 at [0081]; Ex. 1002 ¶ 90. Thus, Burrough discloses this limitation under Immersion's interpretation of the limitation, discussed above in Section III.G.6. Id.

If the Board finds that Burrough does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation in view of Burrough's disclosures. For example, it would have been obvious to a POSITA to use an animation, such as a zoom in or zoom out animation, to generate the haptic response (dynamic interaction parameter), for example, to create haptic effects coordinated with the displayed animation. Ex. 1002 ¶ 91. Motivation to do so comes from, for example, Burrough's disclosure that "the object zooms in or zooms out at the same time and the corresponding haptic effect will change." Id.; Ex. 1005 at [0081].
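By way of illustration only (hypothetical names; not Burrough's code), coordinating a haptic response with an animation can be as simple as scaling the response by the animation's progress:

# Hypothetical sketch: coordinating the haptic magnitude with a zoom
# animation so the effect changes at the same time as the object zooms.
# The animation progress (0..1) scales the base response.

def animated_haptic(base_response: float, animation_progress: float) -> float:
    """Scale the haptic response by the current animation progress."""
    return base_response * animation_progress

# As the zoom animation plays, the haptic output ramps with it.
for progress in (0.0, 0.5, 1.0):
    print(animated_haptic(2.0, progress))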
Claim Language: 7. The method of claim 1 further comprising: receiving a first device sensor signal;

Burrough: Burrough discloses that the touch sensitive surface is "arranged to receive different types of user touch events each being characterized by an amount of pressure applied on the touch sensitive surface by a user." Ex. 1005 at [0016].

Burrough further discloses that "haptic device 300 can be used as a pressure sensor simply by sensing a voltage Vp generated by the displacement dY of member 306 caused by force F applied to the surface of surface 126. In this way, by monitoring voltage Vp, haptic device 300 can be configured to act as an integrated haptic actuator/pressure sensor arranged to change operational modes (passive to active, and vice versa)." Id. at [0070]; see also Fig. 4.

Burrough further discloses that in the "zoom gesture method 1100," the nature of the multi-touch event can be determined based upon either "the presence of at least two fingers indicating that the touch is gestural (i.e. multi-touch) rather than a tracking touch based on one finger" and/or by the pressure asserted by the fingers on the surface 126. The pressure asserted by the fingers on the touch screen can be determined by monitoring the voltage Vp described above. Id. at [0079].

See also Ex. 1002 ¶¶ 92-94.

Claim Language: receiving a second device sensor signal; and

Burrough: See limitation 7.a.

Thus, Burrough discloses that haptic actuator 300 can detect the pressure applied by each finger on a touchscreen by monitoring a voltage Vp (first and second device sensor signals). Ex. 1002 ¶ 96.
Claim Language: wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.

Burrough: Burrough discloses that "pressure information can be linked with haptic feedback. For example, vibro-tactile sensation can be increased with increasing pressure, and vice versa. Accordingly, when a user exerts increased pressure (i.e., presses harder) on a surface, the amount of vibration felt by the user increases thereby informing the user that they are pressing harder." Ex. 1005 at [0071].

Burrough further discloses that in the zoom gesture embodiment, "[i]f it is determined at block 1104 that the presence of the two fingers represents a gesture, then the haptic devices nearest the touch point are set to active mode in order to provide a vibro-tactile response at 1106 to each of the fingers during the gesture. In the described embodiment, the vibro-tactile response provided to each finger can have the same profile or different profiles. For example, if the pressure applied by one finger is substantially greater than that applied by the other finger, then the vibro-tactile response for the two fingers can be different due to the varying pressure applied by each finger." Id. at [0079].

See also Ex. 1002 ¶¶ 97-101.

As established in connection with limitation 1.c, Burrough discloses that in the zoom embodiment, the haptic effect for each finger is dependent upon a haptic profile that varies the haptic effect as a function of the distance between fingers (calculated from the first and second gesture signals). Burrough further discloses that the two fingers can have different profiles based upon the pressure applied by each finger (first and second device sensor signals). Ex. 1005 at [0079]. A POSITA would therefore understand that the haptic response H(d) for each finger (dynamic interaction parameters) is generated using first and second gesture signals (representing positions of fingers) and first and second device sensor signals (representing pressures applied by fingers). Ex. 1002 ¶ 99.

If the Board finds that Burrough does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation in view of Burrough's disclosures. For example, it would have been obvious to a POSITA to generate the haptic response (dynamic interaction parameter) based upon both the distance between the user's fingers in a zoom gesture and based upon the pressure applied by each finger, for example, to create a haptic effect for each finger that varies both as a function of distance and as a function of pressure. Ex. 1002 ¶ 100. Motivation to do so comes from, for example, Burrough's disclosures that the haptic effect for each finger in a zoom gesture "increases linearly with distance d" based on a haptic profile for the finger (Ex. 1005 at [0082]) and that the haptic response provided to each finger can have "different profiles" (id. at [0079]). Ex. 1002 ¶ 101.
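For illustration, a hypothetical per-finger response combining both inputs (the slope and gain constants are invented for the example; this is not Burrough's implementation):

# Hypothetical sketch: a per-finger haptic response generated from both
# the gesture signals (finger positions reduced to a distance d) and the
# device sensor signals (per-finger pressure), with a per-finger profile.

def per_finger_response(d: float, pressure: float,
                        slope: float = 0.5, pressure_gain: float = 0.3) -> float:
    """H grows with distance and with the pressure this finger applies."""
    return slope * d + pressure_gain * pressure

# Two fingers at the same distance but with different pressures receive
# different responses, i.e., effectively different haptic profiles.
print(per_finger_response(4.0, pressure=1.0))  # lighter touch
print(per_finger_response(4.0, pressure=3.0))  # harder press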
Claim Language: 12.pre. A haptic effect enabled system comprising:

Burrough: Burrough discloses that the invention relates, "in one embodiment, to an apparatus and method for providing multi-touch haptic feedback." Ex. 1005 at [0010]; [0003]; see also Fig. 1.

Burrough further discloses that the described embodiments "generally pertain to gestures and methods of implementing gestures with associated physical feedback with touch sensitive devices." Id. at [0035].

See also Ex. 1002 ¶¶ 102-103.

Claim Language: 12.a. a haptic output device;

Burrough: Burrough discloses that the touch sensitive input device communicates with "an array of haptic feedback devices ... (also referred to as haptic actuators) each arranged to provide haptic feedback according to a haptic profile in response to a multi-touch event." Ex. 1005 at [0035].

See also Ex. 1002 ¶ 104.

Claim Language: 12.b. a drive module electronically coupled to the haptic output device for receiving a first gesture signal, receiving a second gesture signal, and generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; and

Burrough: Burrough discloses that the invention "is preferably implemented by hardware, software or a combination of hardware and software." Ex. 1005 at [0085].

Burrough further discloses an electronic device comprising a processor 106 that "can operate (in conjunction with an operating system) to execute computer code and produce and use data." Id. at [0038]; see also Figs. 1A-1E.

Also see discussion of receiving first and second gesture signals and generating a dynamic interaction parameter using these signals in claims 1.a, 1.b, and 1.c.

See also Ex. 1002 ¶¶ 105-108.

As established in connection with claim 1, Burrough discloses a method for receiving first and second gesture signals and generating a dynamic interaction parameter. Burrough further discloses that the method is implemented in an electronic device comprising a processor configured to execute computer code. A POSITA would recognize that this computer code is a drive module, i.e., a set of instructions executed by the processor. Ex. 1002 ¶ 108.
Claim Language: 12.c. a drive circuit electronically coupled to the drive module and the haptic output device for applying a drive signal to the haptic output device according to the dynamic interaction parameter.

Burrough: Burrough discloses a micro-controller 132 (drive circuit) which "can use touch information Tinfo to query haptic data base 134 that includes a number of predetermined haptic profiles each of which describes a specific haptic response H, in terms of duration of response, type of vibro-tactile response, strength of response, etc. ... In this way, the response of haptic actuator 136 can be controlled in real time by microprocessor 132 by establishing the duration, strength, type of vibro-tactile response Hx." Ex. 1005 at [0047].

Also see discussion of applying a drive signal to the haptic output device according to the dynamic interaction parameter in claims 1.a, 1.b, and 1.d.

See also Ex. 1002 ¶¶ 109-111.

A POSITA would understand that microcontroller 132 is the claimed drive circuit, because it is comprised of circuitry for applying drive signals in the form of voltages to the haptic actuators (haptic output device). Ex. 1002 ¶ 111.
Claim Language: 13. The system of claim 12 wherein the first or second gesture signal comprises a vector signal.
Burrough: See claim 2.

Claim Language: 14. The system of claim 12 wherein the first or second gesture signal comprises an on-screen signal.
Burrough: See claim 3.

Claim Language: 15. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.
Burrough: See claim 4. Also see discussion of drive module in claim 12.b.

Claim Language: 16. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.
Burrough: See claim 5. Also see discussion of drive module in claim 12.b.

Claim Language: 17. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.
Burrough: See claim 6. Also see discussion of drive module in claim 12.b.

Claim Language: 18.a. The system of claim 12 wherein the drive module comprises a drive module for receiving a first device sensor signal,
Burrough: See limitation 7.a. Also see discussion of drive module in claim 12.b.

Claim Language: 18.b. receiving a second device sensor signal, and
Burrough: See limitation 7.b.

Claim Language: 18.c. generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.
Burrough: See limitation 7.c.

Claim Language: 23.pre. A non-transitory computer readable medium having instructions stored thereon that, when executed by a processor, causes the processor to produce a haptic effect, the instructions comprising:

Burrough: Burrough discloses that the invention "is preferably implemented by hardware, software or a combination of hardware and software. The software can also be embodied as computer readable code on a computer readable medium." Ex. 1005 at [0085].

Burrough further discloses that the computer code and data "can reside within a memory 108 that can be operatively coupled to processor 106. By way of example, memory 108 can include Read-Only Memory (ROM), Random-Access Memory (RAM), flash memory, hard disk drive and/or the like." Id. at [0038].

Also see discussion of producing haptic effects in limitation 1.pre.

See also Ex. 1002 ¶¶ 120-123.

A POSITA would appreciate that memory, such as, e.g., ROM, flash memory, and hard disk drives, each comprise non-transitory computer readable media, and that computer code stored on the computer readable media comprises instructions executable by a processor. Ex. 1002 ¶ 123.
Claim Language: 23.a. receiving a first gesture signal;
Burrough: See limitation 1.a.

Claim Language: 23.b. receiving a second gesture signal;
Burrough: See limitation 1.b.

Claim Language: 23.c. generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; and
Burrough: See limitation 1.c.

Claim Language: 23.d. applying a drive signal to a haptic output device according to the dynamic interaction parameter.
Burrough: See limitation 1.d.

Claim Language: 24. The non-transitory computer readable medium of claim 23, wherein the first or second gesture signal comprises a vector signal.
Burrough: See claim 2.

Claim Language: 25. The non-transitory computer readable medium of claim 23, wherein the first or second gesture signal comprises an on-screen signal.
Burrough: See claim 3.

Claim Language: 26. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.
Burrough: See claim 4.

Claim Language: 27. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.
Burrough: See claim 5.

Claim Language: 28. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.
Burrough: See claim 6.

Claim Language: 29.a. The non-transitory computer readable medium of claim 23, further comprising: receiving a first device sensor signal;
Burrough: See limitation 7.a.

Claim Language: 29.b. receiving a second device sensor signal; and
Burrough: See limitation 7.b.

Claim Language: 29.c. wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.
Burrough: See limitation 7.c.

I. Ground 2: Claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29 are Obvious Under 35 U.S.C. § 103(a) (pre-AIA) in Light of Rosenberg '373.

Claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29 are rendered obvious by U.S. Patent No. 5,734,373 to Rosenberg et al. ("Rosenberg '373"), which is an Immersion patent dating to 1998. Rosenberg '373 is prior art to the '571 patent under 35 U.S.C. § 102(b) (pre-AIA) because it issued on March 31, 1998, more than one year before the earliest possible priority date of the '571 patent (Aug. 23, 2012). Ex. 1004 at cover.

Rosenberg '373 discloses a system directed to "controlling and providing force feedback to a user operating a human/computer interface device," such as "a joystick, mouse, simulated medical instrument, stylus, or other object." Ex. 1004 at 3:25-27; 3:50-53. An example of such a system is shown in Fig. 1:

[Figure: Rosenberg '373 Fig. 1, depicting a force feedback interface system.]

Ex. 1004 at Fig. 1. As illustrated in Fig. 1, user 22 can manipulate and move the user interface device (user object 34) "to interface with the host application program the user is viewing on display screen 20." Id. at 13:50-53. Sensors 28 sense the position, motion, and other characteristics of the user interface device, and generate sensor data which can include "position values, velocity values, and/or acceleration values" in one or more degrees of freedom. Id. at 10:10-14; 15:50-60.
Rosenberg '373 further discloses a "reflex process" or "force sensation process" for providing force feedback to the user through the user interface device. Id. at 17:2-11; 4:50-56. The force feedback can be based on parameters, such as the received sensor data and timing data. Id. at 17:6-11. Rosenberg '373 discloses various algorithms for calculating a "force value" representing the force feedback to be provided to the user. Id. at 17:6-21. For example, the force value can vary linearly or nonlinearly with the position, velocity, or acceleration of the user object. Id. at 17:11-21. The force value may be provided to microprocessor 26, which converts the force value into an appropriate form usable by haptic actuators 30, which "transmit forces to user object 34 of the interface device 14 in response to signals received from microprocessor 26." Id. at 21:65-66; 11:54-57. Thus, Rosenberg '373 discloses generating a dynamic interaction parameter based on signals corresponding to a user's gestures and using it to generate dynamic haptic effects.
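For illustration only (a sketch under our own assumptions, not Rosenberg '373's code; the names and constants are hypothetical), a force algorithm of this kind derives velocity from stored position and timing data and maps it linearly to a force value:

# Hypothetical sketch of the kind of "force algorithm" described above:
# velocity is estimated from a history of sensed positions and times,
# and the output force value varies linearly with that velocity.

def velocity_from_history(positions: list[float], times: list[float]) -> float:
    """Estimate velocity from the two most recent position samples."""
    return (positions[-1] - positions[-2]) / (times[-1] - times[-2])

def force_value(velocity: float, damping: float = 0.2) -> float:
    """A force value varying linearly with the user object's velocity."""
    return damping * velocity

v = velocity_from_history([0.0, 1.5, 3.5], [0.00, 0.01, 0.02])
print(force_value(v))  # 0.2 * 200.0 = 40.0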
As discussed below, Rosenberg '373 discloses and/or renders obvious all of the limitations of challenged claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29.
Claim Language: 1.pre. A method of producing a haptic effect comprising:

Rosenberg '373: Rosenberg '373 discloses that "the present invention relates generally to interface devices between humans and computers, and more particularly to computer interface devices that provide force feedback to the user." Ex. 1004 at 1:22-25; see also 3:25-27.

See also Ex. 1002 ¶¶ 142-143.

Claim Language: 1.a. receiving a first gesture signal;

Rosenberg '373: Rosenberg '373 discloses a user object such as "a joystick, mouse, simulated medical instrument, stylus, or other object" that "is preferably grasped and moved by the user" in one or more degrees of freedom (gestures). Ex. 1004 at 3:50-53; see also 13:44-55.

Rosenberg '373 further discloses that the user 22 can "manipulate and move the object along provided degrees of freedom to interface with the host application program the user is viewing on display screen 20." Id. at 13:50-53.

Rosenberg '373 discloses that "[s]ensors 28 sense the position, motion, and/or other characteristic of a user object 34 of interface device 14 along one or more degrees of freedom and provide signals ... including information representative of those characteristics." Id. at 10:10-14.

Rosenberg '373 discloses sensor data, which can include "position values, velocity values, and/or acceleration values" derived from sensors 28 which detect motion of object 34 in one or more degrees of freedom, and can include "a history of values, such as position values recorded previously and stored in order to calculate a velocity" (a first gesture signal). Id. at 15:50-60; see also 20:61-21:18.

Rosenberg '373 discloses that the local processor 26 "continually receives signals from sensors 28, processes the raw data, and sends processed sensor data to host computer 12," or "[a]lternatively, local processor 26 sends raw data directly to host computer system 12." Id. at 15:46-50.

See also Ex. 1002 ¶¶ 144-148.

Claim Language: 1.b. receiving a second gesture signal;

Rosenberg '373: See limitation 1.a.

Rosenberg 373 discloses that microprocessor 26 continually receives
signals from sensors 28, as the user manipulates user object 34, which is a user
interface device such as a joystick and thus corresponds to the user interface
described in the 571 patent. Ex. 1004 at 15:46-50; Ex. 1002, 150. A POSITA
would understand that sensor data from sensors 28 are gesture signals, because
they indicate user interaction with user object 34. Id.
Because the gesture signals are continually received from sensors 28 as the
user manipulates the user object 34, multiple (i.e. at least first and second) gesture
signals are received from any single sensor 28 by local microprocessor 26. Id.
Rosenberg 373 further explains that the raw data representing these gesture
signals can also be received by host computer 12. Ex. 1004 at 15:46-50.
Claim Language: 1.c. generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; and

Rosenberg 373: Rosenberg 373 discloses that [t]he sensor data read in step 78 informs the host computer 12 how the user is interacting with the application program. From the position of object 34 sensed over time, the host computer system 12 can determine when forces should be applied to the object. Ex. 1004 at 16:21-25.

Rosenberg 373 discloses a reflex process or force sensation process for providing force commands dependent on other parameters, such as sensor data. Id. at 17:2-5; see also 4:50-56.

Rosenberg 373 discloses that force sensation processes can include a force algorithm to calculate a force value based on sensor and timing data. Id. at 17:6-11.

For example, Rosenberg 373 discloses [a]lgorithms in which force varies linearly (or nonlinearly) with the velocity of object 34 and [a]lgorithms in which force varies with the acceleration of object 34 (generating a dynamic interaction parameter using the first gesture signal and the second gesture signal). Id. at 17:14-21.

Rosenberg 373 further discloses calculating a velocity and/or acceleration from a number of past position values (using the first gesture signal and the second gesture signal). Id. at 17:37-45; see also 15:59-60; 18:10-12.

See also Ex. 1002, 151-157.
Rosenberg 373 discloses that a force value representing the force of the
haptic effect to be applied to user object 34 can be calculated based upon force
algorithms which define the force value as a function (e.g. linear or nonlinear) of
velocity or acceleration. Ex. 1002, 155. Rosenberg 373 further discloses that the
velocity or acceleration of the user object can be calculated based upon the change
in position of the user object 34 over time. Ex. 1004 at 17:37-45. Specifically,
Rosenberg 373 teaches that the host computer may store past position values,
and the times associated with those past position values and use that stored
information to calculate velocity and acceleration. Id. at 17:37-45; 15:59-60;
18:10-12; Ex. 1002, 155. Rosenberg 373 notes that velocity and acceleration
can be computed using the stored position data and timing data as is well known to
those skilled in the art. Ex. 1004 at 21:20-22.
Thus, Rosenberg 373 discloses generating a force value as a function of
velocity or acceleration calculated using multiple position values (first and second
gesture signals). The force value is a dynamic interaction parameter at least
under Immersions construction discussed above in Section III.G.2, because it is
an interaction parameter that changes over time or reacts in real time, based upon
the users interaction with the user interface device. Ex. 1002, 156.
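
To make the calculation concrete, the following Python sketch (illustrative only; not from the record, with hypothetical names and sample values) computes velocity and acceleration from stored position values and their recorded times, in the manner described above:

```python
# Hypothetical sketch; positions[i] is the sensed position at times[i].
# Velocity needs two stored samples; acceleration needs three.

def velocity_and_acceleration(positions, times):
    v1 = (positions[-1] - positions[-2]) / (times[-1] - times[-2])
    v0 = (positions[-2] - positions[-3]) / (times[-2] - times[-3])
    accel = (v1 - v0) / (times[-1] - times[-2])
    return v1, accel

# Example: three position samples recorded 10 ms apart.
v, a = velocity_and_acceleration([0.0, 0.5, 1.5], [0.00, 0.01, 0.02])
# v ~ 100.0 units/s; a ~ 5000.0 units/s^2
```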
If the Board finds that Rosenberg 373 does not disclose this limitation, it
nonetheless would have been obvious to a POSITA to practice this limitation based
on the Rosenberg 373 disclosures above. Ex. 1002, 157. It would have been
obvious to a POSITA that velocity and acceleration could be calculated from two
or more position values (first and second gesture signals) and timing signals. Id.
Motivation to do so comes from, for example, Rosenbergs disclosure of
embodiments in which velocity and acceleration are calculated from a number of
past position values. Id.; Ex. 1004 at 17:37-45; 15:59-60; 18:10-12.
Claim Language: 1.d. applying a drive signal to a haptic output device according to the dynamic interaction parameter.

Rosenberg 373: Rosenberg 373 discloses force feedback interface device 14 that includes actuator 30 (haptic output device). Ex. 1004 at Figs. 1, 3.

Rosenberg 373 further discloses that [a]ctuators 30 transmit forces to user object 34 of the interface device 14 in one or more directions along one or more degrees of freedom in response to signals received from microprocessor 26. Id. at 11:54-57; see also 11:57-12:24.

Rosenberg 373 discloses that a low-level force command determined in step 82 is output to microprocessor 26 over bus 24. This force command typically includes a force value that was determined in accordance with the parameters described above. Id. at 19:64-20:1.

Rosenberg 373 discloses that the force command can be output as an actual force signal that is merely relayed to an actuator 30 by microprocessor 26; or, the force command can be converted to an appropriate form by microprocessor 26 before being sent to actuator 30. Id. at 20:1-5; see also 21:65-66; 25:18-19.

Rosenberg 373 further discloses an actuator interface 38 that can be optionally connected between actuators 30 and microprocessor 26. Interface 38 converts signals from microprocessor 26 into signals appropriate to drive actuators 30. Id. at 12:31-38.

See also Ex. 1002, 158-165.
Rosenberg 373 discloses that host computer 12 provides a force
command to microprocessor 26. The force command may include a force value
determined in accordance with the parameters described above. Ex. 1004 at
19:65-20:1. Microprocessor 26 may relay the force command to the haptic
actuator 30, or convert the force command into an appropriate form for driving
the actuator before sending it to the actuator. Ex. 1004 at 20:1-5. In either case,
microprocessor 26 applies a signal to the haptic actuator (drive signal) to cause the
actuator to generate force feedback that may be based upon a force value
received from the host computer. Ex. 1002, 163. In other words, Rosenberg 373
discloses a microprocessor that applies a drive signal to the haptic actuator (haptic
output device) according to the force value (dynamic interaction parameter). Id.
Rosenberg 373 also discloses an alternative embodiment in which actuator
interface 38 is connected between microprocessor 26 and haptic actuator 30. Ex.
1004 at 12:31-38. Microprocessor 26 may send signals based upon the received
force value to an actuator interface 38, which converts signals from the
microprocessor into signals appropriate to drive actuators 30 (drive signal). Id.;
Ex. 1002, 164. In this embodiment, actuator interface 38 applies a drive signal to
the haptic actuators (haptic output device) to generate force feedback based upon
the force value (dynamic interaction parameter). Id.
If the Board finds that Rosenberg 373 does not disclose this limitation, it
nonetheless would have been obvious to a POSITA to practice this limitation based
on Rosenbergs disclosures above. It would have been obvious to a POSITA that a
drive signal could be applied by the microprocessor 26 or actuator interface 38 to
the haptic actuators according to the force value (dynamic interaction parameter)
calculated by the host computer, for example, to generate the force feedback
specified by the force value. Ex. 1002, 165. Motivation to do so comes from, for
example, Rosenbergs disclosure of output[ing] a force on the user object by
sending the computed force value to the actuators. Id.; Ex. 1004 at 4:50-56.
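
As a purely illustrative sketch of the kind of conversion step described here (the function name, scaling constant, and voltage limit are assumptions, not anything disclosed in Rosenberg 373), converting a force value into a bounded drive signal might look like:

```python
# Hypothetical sketch; the scaling constant and voltage limit are
# illustrative assumptions, not values from Rosenberg 373.

MAX_VOLTS = 5.0             # assumed actuator drive limit
VOLTS_PER_UNIT_FORCE = 2.5  # assumed conversion factor

def to_drive_signal(force_value):
    """Scale a force value to a voltage and clamp it to the actuator's range."""
    volts = force_value * VOLTS_PER_UNIT_FORCE
    return max(-MAX_VOLTS, min(MAX_VOLTS, volts))
```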
Claim Language: 2. The method of claim 1 wherein the first or second gesture signal comprises a vector signal.

Rosenberg 373: Rosenberg 373 discloses that the sensor data read by host computer 12 in step 78 can include position data, velocity data, and acceleration data. Ex. 1004 at 17:30-35.

Rosenberg 373 discloses that velocity sensors and/or accelerometers can be used to directly measure velocities and accelerations on object 34. Id. at 11:19-24.

Rosenberg 373 further discloses an embodiment where, if the user is controlling a simulated race car, the position of the user object joystick determines if the race car is moving into a wall and thus if a collision force should be generated on the joystick. In addition, the velocity and/or acceleration of the user object can influence whether a change in force on the object is required. Id. at 16:28-34.

See also Ex. 1002, 166-171.
Rosenberg 373 discloses embodiments in which both position data (first
gesture signal) and velocity (second gesture signal) are used to generate a force
value (dynamic interaction parameter). For example, in the simulated car race
embodiment the position of the user object joystick determines if the race car is
moving into a wall, and the velocity and/or acceleration of the user object can
influence whether a change in force on the object is required. Id. at 16:28-34; Ex.
1002, 169. A POSITA would understand that a signal from a velocity sensor
representing the velocity of user object 34 is a vector signal, because velocity has
both a magnitude and a direction. Id.; Ex. 1008 (Barrons Dictionary) (velocity:
the velocity vector represents the rate of change of position of an object. To
specify a velocity it is necessary to specify both a speed and a direction); Ex.
1009 (American Heritage Dictionary) (velocity 2. Physics a vector quantity
whose magnitude is a bodys speed and whose direction is the bodys direction of
motion). Even in embodiments which employ user interface devices having
velocity sensors corresponding to a single degree of freedom (e.g., Ex. 1004 at
11:19-24), a gesture signal from such a single sensor is a vector as it represents
both a speed and a direction (positive or negative) along that degree of freedom.
Ex. 1002, 170. Similarly, a POSITA would understand that a signal from an
acceleration sensor representing acceleration of user object 34 is a vector signal
because it includes both a magnitude and a direction of the measured acceleration.
Id.
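
A minimal Python sketch (illustrative only; the function name and sample value are hypothetical) of why a single-degree-of-freedom velocity reading carries both a magnitude and a direction:

```python
# Hypothetical sketch: a one-degree-of-freedom velocity reading carries
# both a magnitude (speed) and a direction (its sign), i.e., a vector.

def speed_and_direction(velocity_signal):
    speed = abs(velocity_signal)
    direction = 1 if velocity_signal >= 0 else -1  # sense along the degree of freedom
    return speed, direction

speed, direction = speed_and_direction(-3.2)  # speed 3.2, direction -1
```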
If the Board finds that Rosenberg 373 does not disclose this limitation, it
nonetheless would have been obvious to a POSITA to practice this limitation based
on Rosenbergs disclosures above. It would have been obvious to a POSITA that a
force value could be generated based upon a sensed position and a sensed velocity,
for example, to implement the car race embodiment described above. Ex. 1002,
171. Motivation to do so comes from, for example, Rosenbergs disclosure of an
embodiment in which force feedback varies based upon both the position and
velocity of the user object, and Rosenbergs disclosure of velocity sensors that can
sense the velocity of the user object. Id.; Ex. 1004 at 16:28-34; 11:19-24.
Claim Language: 4. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.

Rosenberg 373: Rosenberg 373 discloses that [i]n an alternate embodiment, the sensor data read in step 78 includes position data and no velocity or acceleration data, so that host computer 12 is required to calculate the velocity and acceleration from the position data. This can be accomplished by recording a number of past position values, recording the time when each such position value was received using the system clock 18, and calculating a velocity and/or acceleration from such data. Ex. 1004 at 17:37-45.

Rosenberg 373 further discloses that the host computer recalls the previous position of user object 34 (along a particular degree of freedom), examine[s] the current position of the user object, and calculate[s] the difference in position. Id. at 17:61-64; see also 18:22-26.

See also Ex. 1002, 172-174.

Thus, Rosenberg 373 discloses that the force value (dynamic interaction
parameter) can be generated based on the velocity of the user object, calculated
from the difference between the position of the user object at two points in time
(first and second gesture signals). Ex. 1002, 174.
Claim Language: 5. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.

Rosenberg 373: Rosenberg 373 discloses that the reflex processes can be used to provide a variety of haptic sensations to the user through the user object 34 to simulate many different types of tactile events, such as virtual damping, virtual obstruction and virtual texture. Ex. 1004 at 19:39-46.

As one example, Rosenberg 373 discloses a kinematic equation which calculates force based on the velocity of the user object multiplied by a damping constant which can simulate motion of object 34 along one degree of freedom through a fluid or similar material. Id. at 17:46-51.

Rosenberg 373 also discloses various conditions that set up a basic physical model or background sensations about the user object including simulated stiffness, simulated damping, simulated inertias, deadbands where simulated forces diminish, and directional constraints dictating the physical model's functionality. Id. at 31:17-23; see also 17:56-18:4 (movement through liquid); 40:16-41:12 (paddle and ball); 39:45-49; 40:22-25 (gravity).

See also Ex. 1002, 175-178.

Thus, Rosenberg 373 discloses that the force algorithms used to generate
force values (dynamic interaction parameter) can use both sensor data (first and
second gesture signals) and mathematical equations for simulating physical effects
(physical model), such as motion through liquid, inertia or gravity. Ex. 1004 at
17:46-51; 31:17-23; 39:45-47; Ex. 1002, 178.
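
For illustration, a Python sketch of the kinematic damping relationship described above (force proportional to velocity, simulating motion through a fluid); the damping constant is an assumed value, not one from the record:

```python
# Hypothetical sketch of the damping relationship described above;
# the damping constant is an assumed value.

def damping_force(velocity, damping_constant=0.6):
    """Force opposing motion, proportional to velocity (simulated fluid drag)."""
    return -damping_constant * velocity
```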
Claim Language: 6. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.

Rosenberg 373: Rosenberg 373 discloses that force feedback can be accurately coordinated with other supplied feedback, such as images on the video screen, and with user inputs such as movement of the object. Ex. 1004 at 2:15-18.

Rosenberg 373 further discloses that the host computer 12 preferably synchronizes any appropriate visual feedback with the application of forces on user object 34. For example, in a video game application, the onset or start of visual events, such as an object colliding with the user on display screen 20 should be synchronized with the onset or start of forces felt by the user which correspond to or complement those visual events. Id. at 20:23-32; see also 19:46-53 (simulating virtual obstruction); 33:17-21 (same); 37:26-28 (paddle and ball).

See also Ex. 1002, 179-182.

Thus, Rosenberg 373 discloses haptic effects that are coordinated with
displayed animations. Ex. 1004 at 10:28-31; Ex. 1002, 181.
If the Board finds that Rosenberg 373 does not disclose this limitation, it
nonetheless would have been obvious to a POSITA to practice this limitation in
view of Rosenbergs disclosures. For example, it would have been obvious to a
POSITA to use an animation, such as an object colliding with the user on the
display screen, to generate a force value (dynamic interaction parameter), for
example, to create haptic effects coordinated with the displayed animation. Ex.
1002, 182. Motivation to do so comes from, for example, Rosenbergs disclosure
that the host computer 12 preferably synchronizes any appropriate visual
feedback with the application of forces on user object 34. Id.; Ex. 1004 at
20:23-32.
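
As a hypothetical sketch (not from the record; the frame counts and peak force are assumptions) of generating a force value using an animation, the force envelope below tracks the frames of a collision animation so that the haptic onset coincides with the visual onset:

```python
# Hypothetical sketch; the frame counts and peak force are assumptions.

def collision_force(frame, total_frames, peak_force=2.0):
    """Force value tied to a collision animation: strongest at the visual
    onset (frame 0) and decaying to zero as the animation completes."""
    remaining = max(0, total_frames - frame)
    return peak_force * remaining / total_frames
```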
Claim Language: 7. The method of claim 1 further comprising: receiving a first device sensor signal;

Rosenberg 373: Rosenberg 373 discloses that velocity sensors and/or accelerometers can be used to directly measure velocities and accelerations on object 34. Analog sensors can provide an analog signal representative of the position/velocity/acceleration of the user object in a particular degree of freedom. Ex. 1004 at 11:19-24; see also 21:3-7.

Rosenberg 373 discloses that [t]ypically, a sensor 28 is provided for each degree of freedom along which object 34 can be moved. Id. at 10:17-18.

Rosenberg 373 further discloses [o]bject 34 is shown in FIG. 3 as a joystick having a grip portion 126 for the user to grasp. A user can move the joystick about axes A and B. Id. at 28:25-27.

See also Ex. 1002, 183-185.

Claim Language: receiving a second device sensor signal; and

Rosenberg 373: See limitation 7.a.

Rosenberg 373 discloses that sensors 28 can include velocity sensors and/or
accelerometers that generate signals representing the velocity or
acceleration of the user object along a particular degree of freedom. Ex. 1004 at
11:19-24; see also 21:3-7. Thus, a POSITA would understand that in a user object
with more than one degree of freedom, such as a joystick which can move along
both the x and y axes (e.g., id. at 28:25-27), the velocity and/or acceleration
sensors would generate separate signals for each degree of freedom (first and
second device sensor signals) representing the velocity and/or acceleration of the
object along the degree of freedom. Ex. 1002, 187.
If the Board finds that Rosenberg 373 does not disclose this limitation, it
nonetheless would have been obvious to a POSITA to practice this limitation in
view of Rosenbergs disclosures. For example, it would have been obvious to a
POSITA to include a velocity sensor for each axis of a joystick, for example, to
separately sense the velocity of the joystick along each axis. Ex. 1002, 188.
Motivation to do so comes from, for example, Rosenbergs disclosure that
[t]ypically, a sensor 28 is provided for each degree of freedom along which object
34 can be moved. Id.; Ex. 1004 at 10:17-18.
Claim Language: wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.

Rosenberg 373: Rosenberg 373 discloses various conditions that set up a basic physical model or background sensations about the user object including simulated stiffness, simulated damping, simulated inertias, deadbands where simulated forces diminish, and directional constraints dictating the physical model's functionality. Ex. 1004 at 31:17-24; 24:54-57 (reflex processes may include conditions).

Rosenberg 373 discloses a restoring spring condition, in which force varies linearly over an appreciable portion of the user object's displacement, and is proportional to the object 34's distance from the origin position O. Id. at 33:28-37.

Rosenberg 373 also discloses a sluggish condition in which the force creates a damping force on user object 34 having a magnitude proportional to the velocity of the user object when moved by the user. The degree of "viscosity" of the sluggish force can be specified by a viscous damping coefficient. Id. at 33:53-67.

Rosenberg 373 discloses that [m]ultiple conditions may be specified in a single command to effectively superpose condition forces. Id. at 31:22-24; see also 49:56-63 (illustrative application applying both sluggish and spring conditions); Fig. 23.

See also Ex. 1002, 189-195.

It would have been obvious to a POSITA to practice this limitation in view
of Rosenbergs disclosures. Rosenberg 373 discloses that haptic effects can be
specified as a combination of various conditions, such as sluggish and spring
conditions. Ex. 1004 at 31:22-24; 49:56-61; Ex. 1002, 193. Rosenberg 373
further discloses that the sluggish condition applies a force along one degree of
freedom whose magnitude varies with the velocity of user object along that degree
of freedom (id. at 33:53-67) and that the spring condition applies a force along one
degree of freedom whose magnitude varies with position of the user object along
that degree of freedom. Id. at 33:28-37.
Rosenberg 373 discloses an embodiment in which sluggish and spring
conditions are combined along one degree of freedom (the x axis) to provide both
sluggish and spring haptic effects:

[Fig. 23 of Ex. 1004]

Ex. 1004 at Fig. 23; 49:56-61. As illustrated in Fig. 23, the sluggish condition
includes parameters to specify a sluggish force along either the x-axis (Bx
parameters) or y-axis (By parameters). Likewise, the spring condition includes
parameters to specify a spring force along either the x-axis (Kx parameters) or
y-axis (Ky parameters). Thus, it would have been obvious to a POSITA to apply a
sluggish condition along both the x and y axes, such that the force feedback varies
as a function of velocity along both axes (first and second device sensor signals),
as well as a spring condition along both the x and y axes, such that the force
feedback also varies as a function of position along both axes (first and second
gesture signals). Ex. 1002, 195. Motivation to do so comes from, for example,
Rosenbergs teaching that a condition command can be used for each provided
degree of freedom of user object 34. Id.; Ex. 1004 at 34:48-50.
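
For illustration, a Python sketch (with hypothetical names and assumed coefficients, not values from the record) of superposing a spring condition and a sluggish condition along both axes, in the manner of the Kx/Ky and Bx/By parameters discussed above:

```python
# Hypothetical sketch; the Kx/Ky and Bx/By coefficients are assumed values.

def condition_force(x, y, vx, vy, kx=3.0, ky=3.0, bx=0.5, by=0.5):
    """Superpose a spring condition (position-dependent) and a sluggish
    condition (velocity-dependent) along both the x and y axes."""
    fx = -kx * x - bx * vx
    fy = -ky * y - by * vy
    return fx, fy
```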


Claim Language: 12.pre. A haptic effect enabled system comprising:

Rosenberg 373: Rosenberg 373 discloses that the present invention is directed to controlling and providing force feedback to a user operating a human/computer interface device. Ex. 1004 at 3:25-27; see also 6:18-20; Fig. 1.

See also Ex. 1002, 196.

Claim Language: 12.a. a haptic output device;

Rosenberg 373: Rosenberg 373 discloses force feedback interface device 14 that includes actuator 30 (haptic output device). Ex. 1004 at Figs. 1, 3.

Rosenberg 373 further discloses that [a]ctuators 30 transmit forces to user object 34 of the interface device 14 in one or more directions along one or more degrees of freedom in response to signals received from microprocessor 26. Id. at 11:54-57; see also 11:57-12:24.

See also Ex. 1002, 197-198.


Claim Language: 12.b. a drive module electronically coupled to the haptic output device for receiving a first gesture signal, receiving a second gesture signal, and generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; and

Rosenberg 373: Rosenberg 373 discloses that [h]ost computer system 12 preferably includes a host microprocessor 16, random access memory (RAM) 17, read-only memory (ROM) 19. Ex. 1004 at 6:47-51; see also 6:55-57 (the microprocessor preferably retrieves and stores instructions in RAM 17 and ROM 19).

Rosenberg 373 further discloses that the reflex process or force sensation process, as referred to herein, is a set of instructions for providing force commands dependent on other parameters, such as sensor data read in step 78 and timing data from clock 18. Id. at 17:2-6.

Also see discussion of receiving first and second gesture signals and generating a dynamic interaction parameter using these signals in claim 1.a, 1.b and 1.c.

See also Ex. 1002, 199-202.
As established in connection with claim 1, Rosenberg 373 discloses a
method for receiving first and second gesture signals and generating a dynamic
interaction parameter. Rosenberg 373 further discloses that the method is
implemented using a set of instructions executed on microprocessor 16. Ex.
1004 at 6:47-51. A POSITA would recognize that this set of instructions is a drive
module, i.e. a set of instructions executed by the processor. Ex. 1002, 202.
Claim Language: 12.c. a drive circuit electronically coupled to the drive module and the haptic output device for applying a drive signal to the haptic output device according to the dynamic interaction parameter.

Rosenberg 373: Rosenberg 373 discloses that the force command can be output as an actual force signal that is merely relayed to an actuator 30 by microprocessor 26; or, the force command can be converted to an appropriate form by microprocessor 26 before being sent to actuator 30. Id. at 20:1-5; see also 21:65-66; 25:18-19.

Rosenberg 373 further discloses that [a]ctuator interface 38 can be optionally connected between actuators 30 and microprocessor 26. Interface 38 converts signals from microprocessor 26 into signals appropriate to drive actuators 30 (drive signals). Id. at 12:31-38; see also Fig. 2.

Also see discussion of applying a drive signal to the haptic output device according to the dynamic interaction parameter in claim 1.d.

See also Ex. 1002, 203-206.

Rosenberg 373 describes several alternative embodiments for controlling
the haptic actuators. In one embodiment, microprocessor 26 applies drive signals
to drive the haptic actuators. Ex. 1004 at 20:1-5. A POSITA would understand
that microprocessor 26 is the claimed drive circuit, because it is comprised of
circuitry for applying drive signals in the form of voltages to the haptic actuators
(haptic output device). Ex. 1002, 206. Rosenberg 373 also discloses an
embodiment in which actuator interface 38 applies drive signals to the haptic
actuators. Ex. 1004 at 12:31-38. A POSITA would understand that actuator
interface 38 also satisfies the drive circuit limitation, because it is comprised of
circuitry for applying drive signals in the form of voltages to the haptic actuators
(haptic output device). Id.; see also Fig. 2 (illustrating circuitry for actuator
interface 38); Ex. 1002, 206.
Claim Language: 13. The system of claim 12 wherein the first or second gesture signal comprises a vector signal.

Rosenberg 373: See claim 2.

Claim Language: 15. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.

Rosenberg 373: See claim 4. Also see discussion of drive module in claim 12.b.

Claim Language: 16. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.

Rosenberg 373: See claim 5. Also see discussion of drive module in claim 12.b.

Claim Language: 17. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.

Rosenberg 373: See claim 6. Also see discussion of drive module in claim 12.b.

Claim Language: 18.a. The system of claim 12 wherein the drive module comprises a drive module for receiving a first device sensor signal,

Rosenberg 373: See limitation 7.a. Also see discussion of drive module in claim 12.b.

Claim Language: 18.b. receiving a second device sensor signal, and

Rosenberg 373: See limitation 7.b.

Claim Language: 18.c. generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.

Rosenberg 373: See limitation 7.c.

Claim Language: 23.pre. A non-transitory computer readable medium having instructions stored thereon that, when executed by a processor, causes the processor to produce a haptic effect, the instructions comprising:

Rosenberg 373: Rosenberg 373 discloses host computer system 12 which preferably includes a host microprocessor 16, random access memory (RAM) 17, read-only memory (ROM) 19. Ex. 1004 at 6:47-51; see also 6:55-57 (the microprocessor preferably retrieves and stores instructions in RAM 17 and ROM 19).

See limitation 1.pre for discussion of method for producing a haptic effect.

See also Ex. 1002, 214-217.

ROM memory is a non-transitory computer readable medium, which may
include instructions executable by microprocessor 16. Ex. 1002, 217.
Claim Language: 23.a. receiving a first gesture signal;
Rosenberg 373: See limitation 1.a.

Claim Language: 23.b. receiving a second gesture signal;
Rosenberg 373: See limitation 1.b.

Claim Language: 23.c. generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; and
Rosenberg 373: See limitation 1.c.

Claim Language: 23.d. applying a drive signal to a haptic output device according to the dynamic interaction parameter.
Rosenberg 373: See limitation 1.d.

Claim Language: 24. The non-transitory computer readable medium of claim 23, wherein the first or second gesture signal comprises a vector signal.
Rosenberg 373: See claim 2.

Claim Language: 26. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.
Rosenberg 373: See claim 4.

Claim Language: 27. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.
Rosenberg 373: See claim 5.

Claim Language: 28. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.
Rosenberg 373: See claim 6.

Claim Language: 29.a. The non-transitory computer readable medium of claim 23, further comprising: receiving a first device sensor signal;
Rosenberg 373: See limitation 7.a.

Claim Language: 29.b. receiving a second device sensor signal; and
Rosenberg 373: See limitation 7.b.

Claim Language: 29.c. wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.
Rosenberg 373: See limitation 7.c.

J. Ground 3: Claims 3, 14 and 25 are Obvious Under 35 U.S.C. 103(a) (pre-AIA) in Light of Rosenberg 373 and Rosenberg 846.

Claims 3, 14 and 25 are rendered obvious by Rosenberg 373 in view of U.S.
Patent No. 6,429,846 to Rosenberg et al. (Rosenberg 846). Rosenberg 846 is
another patent assigned to Patent Owner Immersion, and shares the same lead
inventor as Rosenberg 373, Louis B. Rosenberg. Ex. 1004 at cover; Ex. 1006 at
cover. Rosenberg 846 is prior art to the 571 patent under 35 U.S.C. 102(a)
(pre-AIA) because it issued on Aug. 6, 2002, before the earliest possible priority
date of the 571 patent (Aug. 23, 2012). Ex. 1006 at cover.
Rosenberg 846 discloses a system for provid[ing] haptic feedback to a
planar touch control device of a computer, such as a touchpad or a touch screen.
Id. at 2:54-56. The touch control includes a touch input device operative to
input a position signal to a processor of said computer based on a location of user
contact on the touch surface. Id. at 2:7-12. Rosenberg 846 further discloses
providing haptic effects at least in part based on the location of the finger on the
pad or dependent upon the current velocity of the users finger (or other object)
on the touchpad. Id. at 5:14-16; 11:59-62.
In addition to the specific proposed combinations of Rosenberg 373 and
Rosenberg 846 discussed below, it generally would have been obvious to a
POSITA to combine the references, at least because (a) each is assigned to the
same assignee, Immersion (Ex. 1004 at cover; Ex. 1006 at cover.), (b) each has the
same lead inventor, Louis B. Rosenberg (id.), (c) each focuses on methods of
providing haptic feedback in an electronic device (Ex. 1004 at 3:25-31; Ex. 1006 at
1:18-22.); (d) each discloses a user interface device that provides an indication of
position and/or velocity (Ex. 1004 at 3:44-47; 15:50-53; Ex. 1006 at 6:26-29;
4:43-46.); and (e) each discloses generating a haptic effect that varies based upon
changes in position and/or velocity (Ex. 1004 at 17:2-17; Ex. 1006 at 5:14-16;
11:59-62.). Ex. 1002, 231.
As discussed below, claims 3, 14 and 25 are obvious in view of Rosenberg
373 and Rosenberg 846.
Claim Language: 3. The method of claim 1 wherein the first or second gesture signal comprises an on-screen signal.

Rosenberg 846: Rosenberg 846 discloses that the present invention relates to a haptic feedback touch control for inputting signals to a computer and for outputting forces to a user of the touch control. Ex. 1006 at 2:6-8.

Rosenberg 846 further discloses that the touch input device can be included in a display screen of the computer as a touch screen. The user contacts the touch surface with a finger, a stylus, or other object. Id. at 2:22-26.

Rosenberg 846 further discloses that the touch control includes a touch input device including an approximately planar touch surface operative to input a position signal to a processor of said computer based on a location of user contact on the touch surface. Id. at 2:7-12.

Rosenberg 846 further discloses presenting a vibration to a user, the vibration being dependent upon the current velocity of the user's finger (or other object) on the touchpad. Id. at 11:59-62.

See also Ex. 1002, 233-238.

A POSITA would understand that signals representing the users finger on
the touchscreen (first or second gesture signals) are on-screen signals, because the
signals represent one or more touches on the touchscreen. Ex. 1002, 237.
It would be obvious to a POSITA to combine the touch screen disclosed by
Rosenberg 846 with the system described by Rosenberg 373. Id., 238.
Rosenberg 373 discloses that its system can be implemented through the use of
a human-computer interface device, such as a stylus and tablet. Ex. 1004 at
1:36-41. A POSITA would recognize that a stylus and tablet may provide the
same types of input signals as the touchscreen described in Rosenberg 846. Ex.
1002, 238. Indeed, Rosenberg 846 likewise discloses that the touch input
device can be a touchpad or touch screen, and that the user can contact the
touch surface with a finger, a stylus, or other object. Ex. 1006 at 2:22-26.
Motivation to do so arises at least from the fact that both patents describe user
interface devices having a planar touch sensitive surface. Ex. 1004 at 1:36-41; Ex.
1006 at 2:22-26; Ex. 1002, 238. Additional motivation to do so arises from the
fact that both patents disclose user interface devices capable of generating signals
representing position. Ex. 1004 at 3:44-47; Ex. 1006 at 6:26-29. Thus, a POSITA
would recognize that the Rosenberg 846 touchscreen could be used as the user
interface device (user object 34) in the Rosenberg 373 system in the manner
described above in connection with claim 1. Ex. 1002, 238.
Claim Language: 14. The system of claim 12 wherein the first or second gesture signal comprises an on-screen signal.
Rosenberg 846: See claim 3.

Claim Language: 25. The non-transitory computer readable medium of claim 23, wherein the first or second gesture signal comprises an on-screen signal.
Rosenberg 846: See claim 3.
IV. THE GROUNDS OF INVALIDITY ARE NOT REDUNDANT

The grounds of invalidity raised herein are not redundant over one another

because ground 1, based on Burrough, and grounds 2-3, based upon Rosenberg 373,
disclose the claims differently. For example, Burrough discloses gestures using a
multi-touch touchscreen, while Rosenberg 373 discloses gestures on other types of
user interface devices, such as joysticks. Burrough further discloses generating a
dynamic interaction parameter using gesture signals representing the position of
two different fingers simultaneously touching the touch screen, while Rosenberg
373 discloses generating a dynamic interaction parameter using gesture signals
representing the position of the user interface device at two points in time.
Moreover, the limited grounds presented in this petition do not impede the
just, speedy, and inexpensive resolution of [this] proceeding, as required by 37
C.F.R. 42.1(b). Because Petitioner has limited its petition to just two primary
prior art references and three grounds, Petitioner respectfully requests that the
Board institute trial on all grounds presented in this Petition in order to avoid
prejudicing Petitioner. If the Board nonetheless institutes fewer than the limited
number of presented grounds, Petitioner respectfully requests that the Board
institute at least Ground 1.
V. CONCLUSION
For the foregoing reasons, Petitioner requests that the Board institute trial

and cancel claims 1-7, 12-18, and 23-29 of the 571 patent.

Dated: July 7, 2016

Respectfully Submitted,
/James M. Heintz 41828/
James M. Heintz
Reg. No. 41,828
DLA Piper LLP (US)
11911 Freedom Drive, Suite 300
Reston, VA 20190
Apple-Immersion-IPRs@dlapiper.com
Phone: 703-773-4148
Fax: 703-773-5200
Robert Buergi
Reg. No. 58,125
DLA Piper LLP (US)
2000 University Ave
East Palo Alto, CA 94303
robert.buergi@dlapiper.com
Phone: 650-833-2407
Fax: 650-687-1144
Attorneys for Petitioner Apple Inc.

CERTIFICATION UNDER 37 CFR 42.24(d)


Under the provisions of 37 CFR 42.24(d), the undersigned hereby certifies
that the word count for the foregoing Petition for Inter Partes Review totals
13,875, as calculated by Microsoft Word, which is less than the 14,000 allowed
under 37 CFR 42.24(a)(1)(i).

Dated: July 7, 2016

Respectfully Submitted,
/James M. Heintz/
James M. Heintz
Reg. No. 41,828
DLA Piper LLP (US)
11911 Freedom Drive, Suite 300
Reston, VA 20190
Apple-Immersion-IPRs@dlapiper.com
Phone: 703-773-4148
Fax: 703-773-5200
Robert Buergi
Reg. No. 58,125
DLA Piper LLP (US)
2000 University Ave
East Palo Alto, CA 94303
robert.buergi@dlapiper.com
Phone: 650-833-2407
Fax: 650-687-1144
Attorneys for Petitioner Apple Inc.

CERTIFICATE OF SERVICE
The undersigned hereby certifies that a copy of the foregoing Petition for
Inter Partes Review and all Exhibits and other documents filed together with the
petition were served on July 7, 2016, via United Parcel Service, directed to the
attorneys of record for the patent at the following address:
Immersion Corporation
50 Rio Robles
San Jose, CA 95134

Dated: July 7, 2016

Respectfully Submitted,
/James M. Heintz/
James M. Heintz
Reg. No. 41,828
DLA Piper LLP (US)
11911 Freedom Drive, Suite 300
Reston, VA 20190
Apple-Immersion-IPRs@dlapiper.com
Phone: 703-773-4148
Fax: 703-773-5200
Robert Buergi
Reg. No. 58,125
DLA Piper LLP (US)
2000 University Ave
East Palo Alto, CA 94303
robert.buergi@dlapiper.com
Phone: 650-833-2407
Fax: 650-687-1144
Attorneys for Petitioner Apple Inc.
