APPLE INC.,
Petitioner
v.
IMMERSION CORPORATION,
Patent Owner
U.S. Patent No. 8,659,571
Filing Date: February 21, 2013
Issue Date: February 25, 2014
Title: Interactivity Model for Shared Feedback on Mobile Devices
EXHIBIT LIST

Exhibit No.  Description

1001  U.S. Patent No. 8,659,571 (the "'571 patent")
1002  Declaration of Petitioner's expert in support of this Petition
1003
1004  U.S. Patent No. 5,734,373 to Rosenberg et al. ("Rosenberg '373")
1005  U.S. Patent Application Publication No. 2010/0156818 to Burrough et al. ("Burrough")
1006  U.S. Patent No. 6,429,846 to Rosenberg et al. ("Rosenberg '846")
1007
1008
1009
1010
1011  Claim charts served by Immersion in the ITC investigation
1012  Claim charts and proposed claim constructions served by Immersion in the ITC investigation
Apple Inc. ("Apple" or "Petitioner") hereby petitions for inter partes review of U.S. Patent No. 8,659,571 (the "'571 patent"). Ex. 1001. The '571 patent generally relates to providing dynamic haptic feedback in response to signals representing user gestures on a user interface device, such as a touchscreen or joystick. The claims of the '571 patent challenged in this Petition are invalid in view of Apple's patent, U.S. Patent Application Publication No. 2010/0156818 to Burrough et al. ("Burrough"). Most of the challenged claims are also invalid in view of Patent Owner Immersion's ("Patent Owner" or "Immersion") earlier patent, U.S. Patent No. 5,734,373 to Rosenberg et al. ("Rosenberg '373"), which issued over a decade before the '571 patent was filed. The remaining claims, directed to "on-screen" gesture signals, are invalid in view of Rosenberg '373 in combination with another of Immersion's patents from the same lead inventor, U.S. Patent No. 6,429,846 to Rosenberg et al. ("Rosenberg '846").
I. Mandatory Notices

A. Real Party-In-Interest

B. Related Matters

C. Lead and Back-Up Counsel

Lead counsel is James M. Heintz, Reg. No. 41,828, of DLA Piper LLP (US), 11911 Freedom Drive, Suite 300, Reston, VA 20190; Apple-Immersion-IPRs@dlapiper.com; 703-773-4148 (phone); 703-773-5200 (fax). Backup counsel is Robert Buergi, Reg. No. 58,125, of DLA Piper LLP (US), 2000 University Ave, East Palo Alto, CA 94303; robert.buergi@dlapiper.com; 650-833-2407 (phone); 650-687-1144 (fax).

D. Service Information

E. Power of Attorney

F. Standing

G. Fees
II.

Petitioner requests inter partes review and cancellation of claims 1-7, 12-18, and 23-29 of the '571 patent in view of the following grounds:

A. Claims 1-7, 12-18, and 23-29 are obvious under 35 U.S.C. § 103(a) (pre-AIA) in light of Burrough.

B. Claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29 are obvious under 35 U.S.C. § 103(a) (pre-AIA) in light of Rosenberg '373.

C. Claims 3, 14, and 25 are obvious under 35 U.S.C. § 103(a) (pre-AIA) in light of Rosenberg '373 and U.S. Patent No. 6,429,846 to Rosenberg et al. ("Rosenberg '846").
III.

A. Technology Background

The '571 patent identifies a need for "an improved system of providing a dynamic haptic effect that includes multiple gesture signals and device sensor signals." Ex. 1001 at 1:58-60. To solve these problems, the '571 patent discloses a system for providing dynamic haptic effects based upon gesture signals and/or device sensor signals. Id. at 1:66-2:5. A dynamic haptic effect refers to "a haptic effect that evolves over time as it responds to one or more input parameters." Id. at 2:65-67.
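For illustration only, the following is a minimal sketch of this concept; the function names, values, and mapping are hypothetical and do not appear in the '571 patent:

```python
# Illustrative sketch of a "dynamic" haptic effect: the output magnitude is
# recomputed over time from an input parameter (here, gesture speed),
# rather than being a fixed, pre-authored waveform. Names and the mapping
# are hypothetical, not taken from the '571 patent.
import math

def dynamic_haptic_magnitude(gesture_speed: float, t: float) -> float:
    """Return a vibration magnitude in [0, 1] at time t (seconds)."""
    base = min(1.0, gesture_speed / 10.0)  # stronger effect for faster gestures
    decay = math.exp(-0.5 * t)             # effect evolves (fades) over time
    return base * decay

for step in range(5):
    t = step * 0.1
    print(f"t={t:.1f}s magnitude={dynamic_haptic_magnitude(6.0, t):.3f}")
```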
C. Prosecution History

The '571 patent issued following examination by Mr. Sherman. Ex. 1001 at cover. However, Immersion did not inform Mr. Sherman that identical claims had been rejected in the '698 application. See generally Ex. 1003. Because the claims challenged in this Petition had been found to be anticipated or obvious in view of Birnbaum, they should never have been granted. And, although this Petition does not rely upon Birnbaum, this Petition demonstrates that the challenged claims are also obvious in view of three other references: Burrough, Rosenberg '373, and Rosenberg '846.
D.
A person of ordinary skill in the art (POSITA) at the time of the alleged
invention of the 571 patent would have had a Bachelors degree in computer
science, electrical engineering, or a comparable field of study, plus approximately
two to three years of professional experience with software engineering, haptics
programming, or other relevant industry experience. Additional graduate
education could substitute for professional experience and significant experience in
the field could substitute for formal education. Ex. 1002, 38.
E.

In the ITC investigation, Immersion has alleged that claims 1-7, 12-18, and 23-29 of the '571 patent are practiced by certain Apple iPhone products. Ex. 1011. To support these allegations, Immersion provided claim charts purporting to show how Apple's iPhone 6s and 6s Plus products allegedly practice these claims. Id.
G. Claim Construction

In the ITC investigation, Immersion has also provided claim charts showing how Immersion believes that the '571 patent's claims allegedly encompass certain of Petitioner Apple's products, as described above. Exs. 1011, 1012. For the purposes of this proceeding, Petitioner respectfully requests that Immersion be held to constructions at least as broad as those set forth by Immersion in these claim charts and in its proposed claim constructions as discussed below.
1. "gesture signal"

The term "gesture signal" (claims 1-7, 12-18, 23-29) should be broadly construed to encompass a signal indicating user interaction with a user interface device. The '571 patent describes a gesture as "any movement of the body that conveys meaning or user intent." Ex. 1001 at 3:35-36. The patent further describes a gesture as any form of hand movement "recognized by a device having an accelerometer, gyroscope, or other motion sensor, and converted to electronic signals." Id. at 3:56-59. The specification describes various exemplary user interface devices that produce gesture signals, including "a touch sensitive surface" or "any other type of user interface such as a mouse, touchpad, minijoystick, scroll wheel, trackball, game pads or game controllers." Id. at 4:59-63. Thus, in the context of the specification, a gesture signal is described as a signal indicating user interaction with a user interface device. Petitioner submits that the BRI of "gesture signal" should encompass these descriptions.

Immersion may argue that a "gesture signal" has a special meaning limited to signals resulting from the interaction of fingers on touchscreens when performing finger movements such as swipes. However, this argument must be rejected because limiting "gesture signal" in this manner is contrary to the broad definition of "gesture" discussed above, and reads out embodiments involving systems that do not include touchscreens but instead use devices such as minijoysticks, mice, and trackballs as user input devices, as discussed at 4:59-63.
2. "vector signal"

3. "on-screen signal"

"animation"
Based on Immersion's public contentions, Immersion should be held to a construction of this limitation (claims 6, 17, 28) that encompasses generating a dynamic interaction parameter that is coordinated with an animation. For example, Immersion contends that the accused Apple products generate a dynamic interaction parameter corresponding to the amount of pressure exerted on the touch screen. Ex. 1011 at 38-39. Immersion contends that the first gesture signal is received when a user presses lightly on, e.g., an email (referred to as a "Peek" gesture) and the second gesture signal is received when a user presses deeply to "pop" into the email (referred to as a "Pop" gesture). Id. at 4-6, 22-24. Immersion further contends that the use of animations relevant to Peek and Pop satisfies this limitation. Ex. 1011 at 83.
7. "module"

H. Ground 1: Claims 1-7, 12-18, and 23-29 are Obvious Under 35 U.S.C. § 103(a) (pre-AIA) in Light of Burrough

Claims 1-7, 12-18, and 23-29 are rendered obvious by U.S. Patent Application Publication No. 2010/0156818 to Burrough et al. ("Burrough"), assigned to Petitioner Apple. Burrough was published on June 24, 2010, more than one year before the earliest possible priority date of the '571 patent (Aug. 23, 2012), and is therefore prior art to the '571 patent under 35 U.S.C. § 102(b) (pre-AIA). Ex. 1005 at cover.
Burrough discloses providing "multi-touch haptic feedback" on a device with a "multi-touch touch based input device," such as a touch screen. Id. at [0010]; [0017]. An example of such a device is shown in Fig. 1B. Id. at Fig. 1B. Burrough discloses that the touch screen can recognize at least two "substantially simultaneously occurring gestures" using "at least two different fingers or other object." Id. at [0035]. Such gestures include gestures associated with "zooming, panning, scrolling, rotating, enlarging and/or the like." Id. at [0017]. Burrough further discloses providing dynamic haptic feedback in response to gestures on the touch screen. Id. at [0051]. For example, "vibrations can be adjusted based on a change in touch characteristics (i.e., speed, direction, location, etc.)." Id. at [0051].
In one embodiment, Burrough discloses a multi-touch zoom gesture, in which an image can be zoomed in or out by moving two fingers apart or together, respectively. Id. at [0080]; Fig. 11. Burrough discloses that the amount of zooming and the associated haptic effect varies "according to the distance between the two [fingers]." Id. at [0081]. For example, the haptic effect can be, "for example, faster (or slower) or more intense (or less intense) vibration as the distance between the two fingers increases." Id. at [0080].
As illustrated in Burrough's Figs. 12C and 12D, the haptic effect associated with a multi-touch zoom gesture can be a function of the distance between the two fingers. Id. at Figs. 12C, 12D; [0082]. In these figures, the magnitude of the haptic response H(d) at each finger is denoted by the size of the circle for each response. In this case, as the distance between the two fingers increases, the haptic effect H increases, and decreases as the distance decreases. Ex. 1002, ¶ 68. The function that defines the relationship between the haptic effect and the distance is referred to as a "haptic profile." Ex. 1005 at [0082]. Burrough discloses generating a haptic response H(d) by applying the haptic profile corresponding to each finger to the distance d between the fingers. Ex. 1002, ¶ 68.
Burrough further describes an embodiment in which the haptic profile defining the haptic effect for each finger itself varies as a function of the zoom factor, for example, by increasing the slope as the resolution of the underlying map increases. Ex. 1005 at [0082]. In other words, the rate at which the magnitude of the haptic effect changes in response to a change in distance between the two fingers can increase as the resolution of the map increases. Ex. 1002, ¶ 69.
The haptic response H(d) is a dynamic interaction parameter under
Immersions interpretation of that claim term, as discussed above in Section
III.G.2, because it changes over time or reacts in real time based upon the users
interaction with the touchscreen. Specifically, as the users fingers move apart, the
distance between the fingers increases, and the haptic response likewise increases
as a function of this distance. Similarly, as users fingers move together, the
distance between the fingers decreases, and the haptic response likewise decreases
as a function of this distance. Ex. 1002, 70.
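For illustration only, a minimal sketch of the relationship Burrough describes; the profile shape and constants are hypothetical, as Burrough provides no code:

```python
# Illustrative sketch: a haptic response H(d) obtained by applying a
# per-finger "haptic profile" to the distance d between two fingers; the
# profile's slope grows with the zoom factor (e.g., map resolution).
# Profile shape and constants are hypothetical.
import math

def finger_distance(p1, p2):
    """Euclidean distance between two finger positions (x, y)."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def haptic_profile(d: float, zoom_factor: float = 1.0) -> float:
    """Linear profile H(d) whose slope increases with the zoom factor."""
    slope = 0.005 * zoom_factor
    return min(1.0, slope * d)  # clamp the vibration magnitude to [0, 1]

d = finger_distance((100, 200), (220, 260))
for zoom in (1.0, 2.0, 4.0):
    print(f"d={d:.1f} zoom={zoom}: H(d)={haptic_profile(d, zoom):.3f}")
```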
Claim Language
1.pre. A method of producing a haptic effect comprising:
Burrough
Burrough discloses that the invention relates, in one embodiment, "to an apparatus and method for providing multi-touch haptic feedback." Ex. 1005 at [0010]; [0003].
Burrough further discloses that the described embodiments "generally pertain to gestures and methods of implementing gestures with associated physical feedback with touch sensitive devices." Id. at [0035].
See also Ex. 1002, ¶¶ 55-56.

Claim Language
1.b. receiving a second gesture signal;
Burrough
Burrough discloses that "[o]ne of the advantages of the invention lies in the fact that the relationship between a touch event or a class of touch events and corresponding haptic response can be dynamic in nature … For example, vibrations can be adjusted based on a change in touch characteristics (i.e., speed, direction, location, etc.)." Ex. 1005 at [0051].
As one example, Burrough discloses a zoom gesture method 1100, in which "the distance between at least the two fingers is compared … If the distance between the two fingers increases (spread apart) at 1110, a zoom-in signal is generated at 1112, otherwise a zoom out signal is generated at block 1114." The zoom-in signal, in turn, causes the haptic devices associated with the two fingers to provide a zoom-in haptic signal at 1116. Such a zoom-in haptic signal can be, for example, "faster (or slower) or more intense (or less intense) vibration as the distance between the two" fingers increases. Id. at [0080].

Claim Language
1.d. applying a drive signal to a haptic output device according to the dynamic interaction parameter.
Burrough
Burrough discloses that the touch sensitive input device …
Burrough discloses that "vibrations can be adjusted based on a change in touch characteristics (i.e., speed, direction, location, etc.)." Ex. 1005 at [0051].
Claim Language
2. The method of claim 1 wherein the first or second gesture signal comprises a vector signal.

Claim Language
3. The method of claim 1 wherein the first or second gesture signal comprises an on-screen signal.
Burrough
Burrough discloses that "the touch sensitive surface can be a touch screen, and the GUI object can be displayed on the touch screen … As a result, when the fingers are moved apart, the zoom-in signal can be used to increase the size of the embedded features in the GUI object and when the fingers are pinched together, the zoom-out signal can be used to decrease the size of embedded features in the object." Ex. 1005 at [0081]; see also Figs. 12A-H.
See also Ex. 1002, ¶¶ 79-80.

A POSITA would understand that the signals representing the user's fingers (first and second gesture signals) are on-screen signals, because the signals represent one or more touches on the touch screen. Ex. 1002, ¶ 80.
Claim Language
4. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.
Burrough
Burrough discloses that "[f]ollowing block 1106, the zoom gesture method 1100 proceeds to block 1108 where the distance between at least the two fingers is compared. The distance may be from finger to finger or from each finger to some other reference point as for example the centroid." Ex. 1005 at [0080].
Burrough further discloses: "For instance, as the fingers spread apart or closes together, the object zooms in or zooms out at the same time and the corresponding haptic effect will change." Id. at [0081].
See also Ex. 1002, ¶¶ 81-84.
A POSITA would appreciate that the distance between the user's fingers is calculated by a difference between the two position signals (gesture signals). Ex. 1002, ¶ 83. And, as established in connection with limitation 1.c, the haptic response H(d) (dynamic interaction parameter) is generated as a function of this distance.

If the Board finds that Burrough does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation. It would have been obvious to a POSITA that the distance between the user's fingers can be calculated by taking the difference between the positions of the fingers. Ex. 1002, ¶ 84. Motivation to do so comes from, for example, Burrough's disclosure that touch signals may indicate the location of the finger on the touch sensitive screen. Ex. 1005 at [0042].
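For illustration only, a minimal sketch of that calculation with hypothetical coordinates:

```python
# Illustrative sketch: the distance between the user's fingers computed as
# a difference between two position signals (the first and second gesture
# signals). Coordinates are hypothetical.
first_gesture = (100, 200)   # (x, y) of finger 1 on the touch screen
second_gesture = (220, 260)  # (x, y) of finger 2

dx = second_gesture[0] - first_gesture[0]
dy = second_gesture[1] - first_gesture[1]
d = (dx * dx + dy * dy) ** 0.5  # distance derived from the two signals
print(f"distance between fingers: {d:.1f} units")
```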
Claim Language
5. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.
Burrough
Burrough discloses that multi-touch devices "monitor a sensing surface for a touch or near touch, and when a touch occurs determines the distinct areas of contact and identifies the contacts via their geometric features and geometric arrangement. Once identified or classified, the contacts are monitored for various motions, actions or events." Ex. 1005 at [0005]; see also [0054] (describing placement of sensing regions based upon size of a hand); [0063]-[0070] (describing equation modeling pressure of finger on the touchscreen).
See also Ex. 1002, ¶¶ 85-87.
If the Board finds that Burrough does not disclose this limitation, it nonetheless would have been obvious to a POSITA to use Burrough's geometric model of the finger contacts to determine the center location of each finger when calculating the distance between two fingers. Ex. 1002, ¶ 87. Motivation to do so comes from, for example, Burrough's disclosure that the location of the user's fingers will associate or lock the fingers to a particular GUI object being displayed, which object is zoomed in or out based on the movement of the user's fingers. Ex. 1005 at [0081]; Ex. 1002, ¶ 87.
Claim Language
6. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.
Burrough
Burrough discloses that vibrations "can be mapped to animation effects occurring on display 112 (rubber band, bounce etc.)." Ex. 1005 at [0051].
Burrough also discloses that zooming typically "can occur substantially simultaneously with the motion of the objects. For instance, as the fingers spread apart or closes together, the object zooms in or zooms out at the same time and the corresponding haptic effect will change." Id. at [0081].
See also Ex. 1002, ¶¶ 88-91.
If the Board finds that Burrough does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation in view of Burrough's disclosures. For example, it would have been obvious to a POSITA to use an animation, such as a zoom-in or zoom-out animation, to generate the haptic response (dynamic interaction parameter), for example, to create haptic effects coordinated with the displayed animation. Ex. 1002, ¶ 91. Motivation to do so comes from, for example, Burrough's disclosure that as the object zooms in or zooms out, "the corresponding haptic effect will change." Id.; Ex. 1005 at [0081].
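For illustration only, a minimal sketch of haptic output coordinated with an animation; the frame loop and values are hypothetical:

```python
# Illustrative sketch: a haptic value generated "using ... an animation".
# Here the vibration magnitude tracks the progress of a zoom-in animation
# frame by frame. The animation model and numbers are hypothetical.
frames = 10
start_scale, end_scale = 1.0, 2.0  # zoom-in animation endpoints

for frame in range(frames + 1):
    progress = frame / frames                            # 0.0 .. 1.0
    scale = start_scale + (end_scale - start_scale) * progress
    haptic = 0.2 + 0.8 * progress                        # grows with the animation
    print(f"frame {frame:2d}: scale={scale:.2f} haptic={haptic:.2f}")
```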
Claim Language
7. The method of claim 1 further comprising: receiving a first device sensor signal;
Burrough
Burrough discloses that the touch sensitive surface is arranged to "receive different types of user touch events each being characterized by an amount of pressure applied on the touch sensitive surface by a user." Ex. 1005 at [0016].
Burrough further discloses that "haptic device 300 can be used as a pressure sensor simply by sensing a voltage Vp generated by the displacement dY of member 306 caused by force F applied to the surface of surface 126. In this way, by monitoring voltage Vp, haptic device 300 can be configured to act as an integrated haptic actuator/pressure sensor arranged to change operational modes (passive to active, and vice versa)." Id. at [0070]; see also Fig. 4.
Burrough further discloses that in the zoom gesture method 1100, "the nature of the multi-touch event can be determined based upon either the presence of at least two fingers indicating that the touch is gestural (i.e. multi-touch) rather than a tracking touch based on one finger and/or by the pressure asserted by the fingers on the surface 126." The pressure asserted by the fingers on the touch screen can be determined by monitoring the voltage Vp described above. Id. at [0079].
See also Ex. 1002, ¶¶ 92-94.
receiving a second device sensor signal; and

Thus, Burrough discloses that haptic actuator 300 can detect the pressure applied by each finger on a touchscreen by monitoring a voltage Vp (first and second device sensor signals). Ex. 1002, ¶ 96.
Claim Language
wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.
Burrough
Burrough discloses that "pressure information can be linked with haptic feedback. For example, vibro-tactile sensation can be increased with increasing pressure, and vice versa. Accordingly, when a user exerts increased pressure (i.e., presses harder) on a surface, the amount of vibration felt by the user increases thereby informing the user that they are pressing harder." Ex. 1005 at [0071].
Burrough further discloses that in the zoom gesture embodiment, "[i]f it is determined at block 1104 that the presence of the two fingers represents a gesture, then the haptic devices nearest the touch point are set to active mode in order to provide a vibro-tactile response at 1106 to each of the fingers during the gesture. In the described embodiment, the vibro-tactile response provided to each finger can have the same profile or different profiles. For example, if it the pressure applied by one finger is substantially greater than that applied by the other finger, then the vibro-tactile response for the two fingers can be different due to the varying pressure applied by each finger." Id. at [0079].
See also Ex. 1002, ¶¶ 97-101.
As discussed above in connection with limitation 1.c, Burrough discloses a haptic profile that varies the haptic effect as a function of the distance between fingers (calculated from the first and second gesture signals). Burrough further discloses that the two fingers can have different profiles based upon the pressure applied by each finger (first and second device sensor signals). Ex. 1005 at [0079]. A POSITA would therefore understand that the haptic response H(d) for each finger (dynamic interaction parameters) is generated using first and second gesture signals (representing positions of fingers) and first and second device sensor signals (representing pressures applied by fingers). Ex. 1002, ¶ 99.
If the Board finds that Burrough does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation in view of Burrough's disclosures. For example, it would have been obvious to a POSITA to generate the haptic response (dynamic interaction parameter) based upon both the distance between the user's fingers in a zoom gesture and the pressure applied by each finger, for example, to create a haptic effect for each finger that varies both as a function of distance and as a function of pressure. Ex. 1002, ¶ 100. Motivation to do so comes from, for example, Burrough's disclosures that the haptic effect for each finger in a zoom gesture increases linearly with distance d based on a haptic profile for the finger (Ex. 1005 at [0082]) and that the haptic response provided to each finger can have different profiles (id. at [0079]). Ex. 1002, ¶ 101.
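For illustration only, a minimal sketch of such a combined mapping; the weighting function is hypothetical:

```python
# Illustrative sketch: a per-finger haptic response computed from both the
# distance between the fingers (gesture signals) and each finger's pressure
# (device sensor signals). The combining function is hypothetical.
def per_finger_haptic(d: float, pressure: float) -> float:
    """Vibration magnitude in [0, 1] for one finger."""
    from_distance = min(1.0, 0.005 * d)       # linear haptic profile in d
    from_pressure = min(1.0, pressure / 5.0)  # more pressure, more vibration
    return min(1.0, 0.5 * from_distance + 0.5 * from_pressure)

d = 140.0               # distance between the two fingers
pressures = (1.0, 3.5)  # a different pressure sensed at each finger
for i, p in enumerate(pressures, start=1):
    print(f"finger {i}: H={per_finger_haptic(d, p):.2f}")
```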
Claim Language
12.pre. A haptic effect enabled system comprising:
Burrough
Burrough discloses that the invention relates, in one embodiment, "to an apparatus and method for providing multi-touch haptic feedback." Ex. 1005 at [0010]; [0003]; see also Fig. 1.
Burrough further discloses that the described embodiments "generally pertain to gestures and methods of implementing gestures with associated physical feedback with touch sensitive devices." Id. at [0035].
12.a. a haptic output device;

12.b. a drive module electronically coupled to the haptic output device for receiving a first gesture signal, receiving a second gesture signal, and generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; and
Burrough
Burrough discloses a micro-controller 132 (drive circuit) which can use touch information Tinfo "to query haptic data base 134 that includes a number of predetermined haptic profiles each of which describes a specific haptic response H, in terms of duration of response, type of vibro-tactile response, strength of response, etc. In this way, the response of haptic actuator 136 can be controlled in real time by microprocessor 132 by establishing the duration, strength, type of vibro-tactile response Hx." Ex. 1005 at [0047].
Also see discussion of applying a drive signal to the haptic output device according to the dynamic interaction parameter in limitations 1.a, 1.b, and 1.d.
See also Ex. 1002, ¶¶ 109-111.
Claim Language | Burrough
13. The system of claim 12 wherein the first or second gesture signal comprises a vector signal. | See claim 2.
14. The system of claim 12 wherein the first or second gesture signal comprises an on-screen signal. | See claim 3.
15. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal. | See claim 4. Also see discussion of drive module in claim 12.b.
16. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model. | See claim 5. Also see discussion of drive module in claim 12.b.
17. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation. | See claim 6. Also see discussion of drive module in claim 12.b.
18.a. The system of claim 12 wherein the drive module comprises a drive module for receiving a first device sensor signal, | See limitation 7.a. Also see discussion of drive module in claim 12.b.
18.b. receiving a second device sensor signal, and | See limitation 7.b.
18.c. generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal. | See limitation 7.c.

Claim Language
23.pre. A non-transitory computer readable medium having instructions stored thereon that, when executed by a processor, causes the processor to produce a haptic effect, the instructions comprising:
Burrough
Burrough discloses that the invention "is preferably implemented by hardware, software or a combination of hardware and software. The software can also be embodied as computer readable code on a computer readable medium." Ex. 1005 at [0085].
Burrough further discloses that the computer code and data can reside within "a memory 108 that can be operatively coupled to processor 106. By way of example, memory 108 can include Read-Only Memory (ROM), Random-Access Memory (RAM), flash memory, hard disk drive and/or the like." Id. at [0038].
Also see discussion of producing haptic effects in limitation 1.pre.
See also Ex. 1002, ¶¶ 120-123.
A POSITA would appreciate that memory such as, e.g., ROM, flash memory, and hard disk drives each comprises a non-transitory computer readable medium, and that computer code stored on the computer readable media comprises instructions executable by a processor. Ex. 1002, ¶ 123.
Claim Language | Burrough
23.a. receiving a first gesture signal; | See limitation 1.a.
23.b. receiving a second gesture signal; | See limitation 1.b.
23.c. generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; and | See limitation 1.c.
23.d. applying a drive signal to a haptic output device according to the dynamic interaction parameter. | See limitation 1.d.
24. The non-transitory computer readable medium of claim 23, wherein the first or second gesture signal comprises a vector signal. | See claim 2.
25. The non-transitory computer readable medium of claim 23, wherein the first or second gesture signal comprises an on-screen signal. | See claim 3.
26. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal. | See claim 4.
27. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model. | See claim 5.
28. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation. | See claim 6.
29.a. The non-transitory computer readable medium … | See limitation 7.a.
I. Ground 2: Claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29 are Obvious Under 35 U.S.C. § 103(a) (pre-AIA) in Light of Rosenberg '373
Claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29 are rendered obvious by U.S. Patent No. 5,734,373 to Rosenberg ("Rosenberg '373"), which is an Immersion patent dating to 1998. Rosenberg '373 is prior art to the '571 patent under 35 U.S.C. § 102(b) (pre-AIA) because it issued on March 31, 1998, more than one year before the earliest possible priority date of the '571 patent (Aug. 23, 2012). Ex. 1004 at cover.

Rosenberg '373 discloses a system directed to "controlling and providing force feedback to a user operating a human/computer interface device," such as a joystick, mouse, simulated medical instrument, stylus, or other object. Ex. 1004 at 3:25-27; 3:50-53. An example of such a system is shown in Fig. 1.
Ex. 1004 at Fig. 1. As illustrated in Fig. 1, user 22 can manipulate and move the user interface device (user object 34) to interface with the host application program the user is viewing on display screen 20. Id. at 13:50-53. Sensors 28 sense "the position, motion and other characteristics" of the user interface device, and generate sensor data which can include position values, velocity values, and/or acceleration values in one or more degrees of freedom. Id. at 10:10-14; 15:50-60.

Rosenberg '373 further discloses a "reflex process" or "force sensation process" for providing force feedback to the user through the user interface device. Id. at 17:2-11; 4:50-56. The force feedback can be based on parameters, such as the received sensor data and timing data. Id. at 17:6-11. Rosenberg '373 discloses various algorithms for calculating a "force value" representing the force feedback to be provided to the user. Id. at 17:6-21. For example, the force value can vary linearly or nonlinearly with the position, velocity, or acceleration of the user object. Id. at 17:11-21. The force value may be provided to microprocessor 26, which converts the force value into an appropriate form usable by haptic actuators 30, which transmit forces to user object 34 of the interface device 14 in response to signals received from microprocessor 26. Id. at 21:65-66; 11:54-57. Thus, Rosenberg '373 discloses generating a dynamic interaction parameter based on signals corresponding to a user's gestures and using it to generate dynamic haptic effects.
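For illustration only, a rough sketch of the kind of force algorithm described; the coefficients and functional form are hypothetical:

```python
# Illustrative sketch: a "force value" computed from sensor data, varying
# linearly or nonlinearly with position, velocity, or acceleration of the
# user object. Coefficients and the particular combination are hypothetical.
def force_value(position: float, velocity: float, acceleration: float) -> float:
    k_p, k_v, k_a = 2.0, 0.5, 0.1
    linear_term = k_p * position                     # linear in displacement
    nonlinear_term = k_v * velocity * abs(velocity)  # an example nonlinear term
    inertial_term = k_a * acceleration
    return linear_term + nonlinear_term + inertial_term

print(force_value(position=0.2, velocity=1.5, acceleration=0.3))
```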
As discussed below, Rosenberg '373 discloses and/or renders obvious all of the limitations of challenged claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29.
Claim Language
1.pre. A method of producing a haptic effect comprising:
Rosenberg '373
Rosenberg '373 discloses that "the present invention relates generally to interface devices between humans and computers, and more particularly to computer interface devices that provide force feedback to the user." Ex. 1004 at 1:22-25; see also 3:25-27.

Claim Language
1.b. receiving a second gesture signal;
Local microprocessor 26 receives signals from sensors 28 as the user manipulates user object 34, which is a user interface device such as a joystick and thus corresponds to the user interface described in the '571 patent. Ex. 1004 at 15:46-50; Ex. 1002, ¶ 150. A POSITA would understand that sensor data from sensors 28 are gesture signals, because they indicate user interaction with user object 34. Id.

Because the gesture signals are continually received from sensors 28 as the user manipulates the user object 34, multiple (i.e., at least first and second) gesture signals are received from any single sensor 28 by local microprocessor 26. Id. Rosenberg '373 further explains that the raw data representing these gesture signals can also be received by host computer 12. Ex. 1004 at 15:46-50.
Claim Language
1.c. generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; and
Rosenberg '373
Rosenberg '373 discloses that "[t]he sensor data read in step 78 informs the host computer 12 how the user is interacting with the application program. From the position of object 34 sensed over time, the host computer system 12 can determine when forces should be applied to the object." Ex. 1004 at 16:21-25.
Rosenberg '373 discloses a "reflex process" or "force sensation process" for providing force commands dependent on other parameters, such as sensor data. Id. at 17:2-5; see also 4:50-56.
Rosenberg '373 discloses that force sensation processes can include a force algorithm to calculate a force value based on sensor and timing data. Id. at 17:6-11. For example, Rosenberg '373 discloses "[a]lgorithms in which force varies linearly (or nonlinearly) with the velocity of object 34" and "[a]lgorithms in which force varies …" Id. at 17:11-21.
Claim Language
1.d. applying a drive signal to a haptic output device according to the dynamic interaction parameter.
Rosenberg '373
Rosenberg '373 discloses force feedback interface device 14 that includes actuator 30 (haptic output device). Ex. 1004 at Figs. 1, 3.
Rosenberg '373 further discloses that "[a]ctuators 30 transmit forces to user object 34 of the interface device 14 in one or more directions along one or more degrees of freedom in response to signals received from microprocessor 26." Id. at 11:54-57; see also 11:57-12:24.
Rosenberg '373 discloses that "a low-level force command determined in step 82 is output to microprocessor 26 over bus 24. This force command typically includes a force value that was determined in accordance with the parameters described above." Id. at 19:64-20:1.
Rosenberg '373 discloses that the force command can be "output as an actual force signal that is merely relayed to an actuator 30 by microprocessor 26," or converted "to an appropriate form by microprocessor 26 before being sent to actuator 30." Id. at 20:1-5.
Rosenberg '373 also discloses that "[a]ctuator interface 38 can be optionally connected between actuators 30 and microprocessor 26." Ex. 1004 at 12:31-38. Microprocessor 26 may send signals based upon the received force value to actuator interface 38, which converts signals from the microprocessor "into signals appropriate to drive actuators 30" (drive signal). Id.; Ex. 1002, ¶ 164. In this embodiment, actuator interface 38 applies a drive signal to the haptic actuators (haptic output device) to generate force feedback based upon the force value (dynamic interaction parameter). Id.

If the Board finds that Rosenberg '373 does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation based on Rosenberg's disclosures above. It would have been obvious to a POSITA that a drive signal could be applied by the microprocessor 26 or actuator interface 38 to the haptic actuators according to the force value (dynamic interaction parameter) calculated by the host computer, for example, to generate the force feedback specified by the force value. Ex. 1002, ¶ 165. Motivation to do so comes from, for example, Rosenberg's disclosure of "output[ing]" a force on the user object by sending the computed force value to the actuators. Id.; Ex. 1004 at 4:50-56.
Claim Language
2. The method of claim 1 wherein the first or second gesture signal comprises a vector signal.
Rosenberg '373
Rosenberg '373 discloses that the sensor data read by host computer 12 in step 78 can include position data, velocity data, and acceleration data. Ex. 1004 at 17:30-35.
Rosenberg '373 discloses that "velocity sensors and/or accelerometers can be used to directly measure velocities and accelerations on object 34." Id. at 11:19-24.
Rosenberg '373 further discloses an embodiment where, if …
A POSITA would understand that a signal from a velocity sensor representing the velocity of user object 34 is a vector signal, because it includes both a speed and a direction (positive or negative) along that degree of freedom. Ex. 1002, ¶ 170. Similarly, a POSITA would understand that a signal from an acceleration sensor representing acceleration of user object 34 is a vector signal because it includes both a magnitude and a direction of the measured acceleration. Id.

If the Board finds that Rosenberg '373 does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation based on Rosenberg's disclosures above. It would have been obvious to a POSITA that a force value could be generated based upon a sensed position and a sensed velocity, for example, to implement the car race embodiment described above. Ex. 1002, ¶ 171. Motivation to do so comes from, for example, Rosenberg's disclosure of an embodiment in which force feedback varies based upon both the position and velocity of the user object, and Rosenberg's disclosure of velocity sensors that can sense the velocity of the user object. Id.; Ex. 1004 at 16:28-34; 11:19-24.
Claim Language
4. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.
Rosenberg '373
Rosenberg '373 discloses that "[i]n an alternate embodiment, the sensor data read in step 78 includes position data and no velocity or acceleration data, so that host computer 12 is required to calculate the velocity and acceleration from the position data. This can be accomplished by recording a number of past position values, recording the time when each such position value was received using the system clock 18, and calculating a velocity and/or acceleration from such data." Ex. 1004 at 17:37-45.
Thus, Rosenberg '373 discloses that the force value (dynamic interaction parameter) can be generated based on the velocity of the user object, calculated from the difference between the position of the user object at two points in time (first and second gesture signals). Ex. 1002, ¶ 174.
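For illustration only, a minimal sketch of that calculation with hypothetical samples:

```python
# Illustrative sketch: velocity derived from the difference between two
# recorded positions (first and second gesture signals) and the times at
# which they were received. Sample values are hypothetical.
samples = [(0.00, 1.0), (0.01, 1.3)]  # (time in seconds, position) pairs

(t1, x1), (t2, x2) = samples
velocity = (x2 - x1) / (t2 - t1)      # difference quotient over the samples
print(f"velocity = {velocity:.1f} units/s")
```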
Claim Language
5. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.
Rosenberg '373
Rosenberg '373 discloses that the reflex processes can be used "to provide a variety of haptic sensations to the user through the user object 34 to simulate many different types of tactile events," such as virtual damping, virtual obstruction, and virtual texture. Ex. 1004 at 19:39-46.
As one example, Rosenberg '373 discloses a kinematic equation which calculates force based on the velocity of the user object multiplied by a damping constant, which can simulate motion of object 34 along one degree of freedom "through a fluid or similar material." Id. at 17:46-51.
Rosenberg '373 also discloses various "conditions" that set up "a basic physical model or background sensations" about the user object, including simulated stiffness, simulated damping, simulated inertias, deadbands where simulated forces diminish, and directional constraints dictating "the physical model's functionality." Id. at 31:17-23; see also 17:56-18:4 (movement through liquid); 40:16-41:12 (paddle and ball); 39:45-49; 40:22-25 (gravity).
See also Ex. 1002, ¶¶ 175-178.
Thus, Rosenberg '373 discloses that the force algorithms used to generate force values (dynamic interaction parameter) can use both sensor data (first and second gesture signals) and mathematical equations for simulating physical effects (physical model), such as motion through liquid, inertia, or gravity. Ex. 1004 at 17:46-51; 31:17-23; 39:45-47; Ex. 1002, ¶ 178.
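For illustration only, a minimal sketch of the damping model; the constant and sign convention are hypothetical:

```python
# Illustrative sketch of a damping "physical model": force equals velocity
# multiplied by a damping constant, simulating motion through a fluid. The
# velocity is derived from two position samples (the two gesture signals).
B = 0.8  # hypothetical damping constant

def damping_force(x1: float, x2: float, dt: float) -> float:
    velocity = (x2 - x1) / dt  # from the first and second gesture signals
    return -B * velocity       # assumed sign: force opposes the motion

print(damping_force(x1=1.0, x2=1.3, dt=0.01))
```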
Claim Language
6. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.
Rosenberg '373
Rosenberg '373 discloses that force feedback can be "accurately coordinated with other supplied feedback, such as images on the video screen, and with user inputs such as movement of the object." Ex. 1004 at 2:15-18.
Rosenberg '373 further discloses that "the host computer 12 preferably synchronizes any appropriate visual feedback with the application of forces on user object 34. For example, in a video game application, the onset or start of visual events, such as an object colliding with the user on display screen 20," should be synchronized with the onset or start of forces felt by the user which correspond to or complement those visual events. Id. at 20:23-32; see also 19:46-53 (simulating virtual obstruction); 33:17-21 (same); 37:26-28 (paddle and ball).
See also Ex. 1002, ¶¶ 179-182.

Thus, Rosenberg '373 discloses haptic effects that are coordinated with displayed animations. Ex. 1004 at 10:28-31; Ex. 1002, ¶ 181.

If the Board finds that Rosenberg '373 does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation in view of Rosenberg's disclosures. For example, it would have been obvious to a POSITA to use an animation, such as an object colliding with the user on the display screen, when generating the force value (dynamic interaction parameter), for example, to create force feedback coordinated with the displayed animation.
Claim Language
7. The method of claim 1 further comprising: receiving a first device sensor signal;
Rosenberg '373
Rosenberg '373 discloses that "velocity sensors and/or accelerometers can be used to directly measure velocities and accelerations on object 34. Analog sensors can provide an analog signal representative of the position/velocity/acceleration of the user object in a particular degree of freedom." Ex. 1004 at 11:19-24; see also 21:3-7.
Rosenberg '373 discloses that "[t]ypically, a sensor 28 is provided for each degree of freedom along which object 34 can be moved." Id. at 10:17-18.
Rosenberg '373 further discloses: "[o]bject 34 is shown in FIG. 3 as a joystick having a grip portion 126 for the user to grasp. A user can move the joystick about axes A and B." Id. at 28:25-27.

receiving a second device sensor signal; and
Rosenberg '373 discloses velocity sensors and accelerometers that can directly measure the velocity and acceleration of user object 34. Ex. 1004 at 11:19-24; see also 21:3-7. Thus, a POSITA would understand that in a user object with more than one degree of freedom, such as a joystick which can move along both the x and y axes (e.g., id. at 28:25-27), the velocity and/or acceleration sensors would generate separate signals for each degree of freedom (first and second device sensor signals) representing the velocity and/or acceleration of the object along that degree of freedom. Ex. 1002, ¶ 187.

If the Board finds that Rosenberg '373 does not disclose this limitation, it nonetheless would have been obvious to a POSITA to practice this limitation in view of Rosenberg's disclosures. For example, it would have been obvious to a POSITA to include a velocity sensor for each axis of a joystick, for example, to separately sense the velocity of the joystick along each axis. Ex. 1002, ¶ 188. Motivation to do so comes from, for example, Rosenberg's disclosure that "[t]ypically, a sensor 28 is provided for each degree of freedom along which object 34 can be moved." Id.; Ex. 1004 at 10:17-18.
Claim Language
wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.
Rosenberg '373
Rosenberg '373 discloses various "conditions" that set up "a basic physical model or background sensations" about the user object, including simulated stiffness, simulated damping, simulated inertias, deadbands where simulated forces diminish, and directional constraints dictating "the physical model's functionality." Ex. 1004 at 31:17-24; 24:54-57 (reflex processes may include "conditions").
Rosenberg '373 discloses a "restoring spring" condition, in which force varies linearly over an appreciable portion of the user object's displacement, and is proportional to the …

As illustrated in Fig. 23, the sluggish condition includes parameters to specify a sluggish force along either the x-axis (Bx parameters) or y-axis (By parameters). Ex. 1004 at Fig. 23; 49:56-61. Likewise, the spring condition includes parameters to specify a spring force along either the x-axis (Kx parameters) or y-axis (Ky parameters). Thus, it would have been obvious to a POSITA to apply a sluggish condition along both the x and y axes, such that the force feedback varies as a function of velocity along both axes (first and second device sensor signals), as well as a spring condition along both the x and y axes, such that the force feedback also varies as a function of position along both axes (first and second gesture signals). Ex. 1002, ¶ 195. Motivation to do so comes from, for example, Rosenberg's teaching that a condition command can be used for each provided degree of freedom.
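For illustration only, a compact sketch of spring and sluggish conditions applied along both axes; the constants are hypothetical:

```python
# Illustrative sketch: a spring condition (force proportional to position,
# constants Kx/Ky) and a sluggish condition (force proportional to
# velocity, constants Bx/By) applied independently along the x and y axes.
Kx, Ky = 3.0, 3.0  # spring stiffness per axis (hypothetical)
Bx, By = 0.6, 0.6  # sluggish/damping coefficient per axis (hypothetical)

def condition_force(x, y, vx, vy):
    """Return (Fx, Fy): spring plus sluggish force along each axis."""
    fx = -Kx * x - Bx * vx
    fy = -Ky * y - By * vy
    return fx, fy

print(condition_force(x=0.1, y=-0.05, vx=0.4, vy=0.0))
```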
Claim Language
12.pre. A haptic effect enabled system comprising:
Rosenberg '373
Rosenberg '373 discloses that "the present invention is directed to controlling and providing force feedback to a user operating a human/computer interface device." Ex. 1004 at 3:25-27; see also 6:18-20; Fig. 1.
See also Ex. 1002, ¶ 196.

Claim Language
12.a. a haptic output device;
Rosenberg '373
Rosenberg '373 discloses force feedback interface device 14 that includes actuator 30 (haptic output device). Ex. 1004 at Figs. 1, 3.
Rosenberg '373 further discloses that "[a]ctuators 30 transmit forces to user object 34 of the interface device 14 in one or more directions along one or more degrees of freedom in response to signals received from microprocessor 26." Id. at 11:54-57; see also 11:57-12:24.

As discussed above in connection with claim 1, Rosenberg '373 discloses a method for receiving first and second gesture signals and generating a dynamic interaction parameter. Rosenberg '373 further discloses that the method is implemented using a set of instructions executed on microprocessor 16. Ex. 1004 at 6:47-51. A POSITA would recognize that this set of instructions is a drive module, i.e., a set of instructions executed by the processor. Ex. 1002, ¶ 202.
Claim Language
12.c. a drive circuit electronically coupled to the drive module and the haptic output device for applying a drive signal to the haptic output device according to the dynamic interaction parameter.
Rosenberg '373
Rosenberg '373 discloses that the force command can be "output as an actual force signal that is merely relayed to an actuator 30 by microprocessor 26; or, the force command can be converted to an appropriate form by microprocessor 26 before being sent to actuator 30." Ex. 1004 at 20:1-5; see also 21:65-66; 25:18-19.
Rosenberg '373 further discloses that "[a]ctuator interface 38 can be optionally connected between actuators 30 and microprocessor 26. Interface 38 converts signals from microprocessor 26 into signals appropriate to drive actuators 30" (drive signals). Id. at 12:31-38; see also Fig. 2.
Also see discussion of applying a drive signal to the haptic output device according to the dynamic interaction parameter in claim 1.d.
See also Ex. 1002, ¶¶ 203-206.
Thus, a POSITA would understand that microprocessor 26 satisfies the drive circuit limitation, because it relays or converts force commands into signals that drive actuator 30 (haptic output device). Ex. 1002, ¶ 206. Rosenberg '373 also discloses an embodiment in which actuator interface 38 applies drive signals to the haptic actuators. Ex. 1004 at 12:31-38. A POSITA would understand that actuator interface 38 also satisfies the drive circuit limitation, because it is comprised of circuitry for applying drive signals in the form of voltages to the haptic actuators (haptic output device). Id.; see also Fig. 2 (illustrating circuitry for actuator interface 38); Ex. 1002, ¶ 206.
Claim Language | Rosenberg '373
13. The system of claim 12 wherein the first or second gesture signal comprises a vector signal. | See claim 2.
15. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal. | See claim 4. Also see discussion of drive module in claim 12.b.
16. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model. | See claim 5. Also see discussion of drive module in claim 12.b.
17. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation. | See claim 6. Also see discussion of drive module in claim 12.b.
18.a. The system of claim 12 wherein the drive module comprises a drive module for receiving a first device sensor signal, | See limitation 7.a. Also see discussion of drive module in claim 12.b.
18.b. receiving a second device sensor signal, and | See limitation 7.b.
18.c. generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal. | See limitation 7.c.
Claim Language
23.pre. A non-transitory computer readable medium having instructions stored thereon that, when executed by a processor, causes the processor to produce a haptic effect, the instructions comprising:
Rosenberg '373
Rosenberg '373 discloses host computer system 12, which preferably includes a host microprocessor 16, random access memory (RAM) 17, and read-only memory (ROM) 19. Ex. 1004 at 6:47-51; see also 6:55-57 (the microprocessor preferably "retrieves and stores instructions" in RAM 17 and ROM 19).
See limitation 1.pre for discussion of the method for producing a haptic effect.
See also Ex. 1002, ¶¶ 214-217.
J. Ground 3: Claims 3, 14, and 25 are Obvious Under 35 U.S.C. § 103(a) (pre-AIA) in Light of Rosenberg '373 and Rosenberg '846

Rosenberg '846 discloses a touch control implemented on a touchpad or touch screen of a computer device. Ex. 1006 at 2:54-56. The touch control includes a touch input device operative "to input a position signal to a processor of said computer based on a location of user contact on the touch surface." Id. at 2:7-12. Rosenberg '846 further discloses providing haptic effects at least in part "based on the location of the finger on the pad" or "dependent upon the current velocity of the user's finger (or other object) on the touchpad." Id. at 5:14-16; 11:59-62.

In addition to the specific proposed combinations of Rosenberg '373 and Rosenberg '846 discussed below, it generally would have been obvious to a POSITA to combine the references, at least because (a) each is assigned to the same assignee, Immersion (Ex. 1004 at cover; Ex. 1006 at cover); (b) each has the same lead inventor, Louis B. Rosenberg (id.); (c) each focuses on methods of providing haptic feedback in an electronic device (Ex. 1004 at 3:25-31; Ex. 1006 at 1:18-22); (d) each discloses a user interface device that provides an indication of position and/or velocity (Ex. 1004 at 3:44-47; 15:50-53; Ex. 1006 at 6:26-29; 4:43-46); and (e) each discloses generating a haptic effect that varies based upon changes in position and/or velocity (Ex. 1004 at 17:2-17; Ex. 1006 at 5:14-16; 11:59-62). Ex. 1002, ¶ 231.

As discussed below, claims 3, 14, and 25 are obvious in view of Rosenberg '373 and Rosenberg '846.
Claim Language
3. The method of claim 1 wherein the first or second gesture signal comprises an on-screen signal.
Rosenberg '846
Rosenberg '846 discloses that the present invention relates …

A POSITA would have found it obvious to use Rosenberg '846's touch input device as the user interface device in Rosenberg '373's system. Ex. 1002, ¶ 238. Indeed, Rosenberg '846 likewise discloses that the touch input device "can be a touchpad" or "touch screen," and that the user can contact the touch surface with "a finger, a stylus, or other object." Ex. 1006 at 2:22-26. Motivation to do so arises at least from the fact that both patents describe user interface devices having a planar touch sensitive surface. Ex. 1004 at 1:36-41; Ex. 1006 at 2:22-26; Ex. 1002, ¶ 238. Additional motivation to do so arises from the fact that both patents disclose user interface devices capable of generating signals representing position. Ex. 1004 at 3:44-47; Ex. 1006 at 6:26-29. Thus, a POSITA would recognize that the Rosenberg '846 touchscreen could be used as the user interface device (user object 34) in the Rosenberg '373 system in the manner described above in connection with claim 1. Ex. 1002, ¶ 238.
Claim Language | Rosenberg '846
14. The system of claim 12 wherein the first or second gesture signal comprises an on-screen signal. | See claim 3.
25. The non-transitory computer readable medium of claim 23, wherein the first or second gesture signal comprises an on-screen signal. | See claim 3.
IV.

The grounds presented in this Petition are not redundant, because Ground 1, based on Burrough, and Grounds 2 and 3, based upon Rosenberg '373, disclose the claims differently. For example, Burrough discloses gestures using a multi-touch touchscreen, while Rosenberg '373 discloses gestures on other types of user interface devices, such as joysticks. Burrough further discloses generating a …
V. CONCLUSION

For the foregoing reasons, Petitioner requests that the Board institute trial and cancel claims 1-7, 12-18, and 23-29 of the '571 patent.
Respectfully Submitted,
/James M. Heintz 41828/
James M. Heintz
Reg. No. 41,828
DLA Piper LLP (US)
11911 Freedom Drive, Suite 300
Reston, VA 20190
Apple-Immersion-IPRs@dlapiper.com
Phone: 703-773-4148
Fax: 703-773-5200
Robert Buergi
Reg. No. 58,125
DLA Piper LLP (US)
2000 University Ave
East Palo Alto, CA 94303
robert.buergi@dlapiper.com
Phone: 650-833-2407
Fax: 650-687-1144
Attorneys for Petitioner Apple Inc.
CERTIFICATE OF SERVICE
The undersigned hereby certifies that a copy of the foregoing Petition for
Inter Partes Review and all Exhibits and other documents filed together with the
petition were served on July 7, 2016, via United Parcel Service, directed to the
attorneys of record for the patent at the following address:
Immersion Corporation
50 Rio Robles
San Jose, CA 95134
Respectfully Submitted,
/James M. Heintz/
James M. Heintz
Reg. No. 41,828
DLA Piper LLP (US)
11911 Freedom Drive, Suite 300
Reston, VA 20190
Apple-Immersion-IPRs@dlapiper.com
Phone: 703-773-4148
Fax: 703-773-5200
Robert Buergi
Reg. No. 58,125
DLA Piper LLP (US)
2000 University Ave
East Palo Alto, CA 94303
robert.buergi@dlapiper.com
Phone: 650-833-2407
Fax: 650-687-1144
Attorneys for Petitioner Apple Inc.