Raymond Quan
Seattle, WA

Raymond Quan Phones & Addresses

Seattle, WA

Mentions for Raymond Quan

Career records & work history

Medicine Doctors

Raymond Quan

Specialties:
Radiology, Vascular & Interventional Radiology
Work:
Kaiser Permanente Medical Group, Kaiser South Bay Medical Center
25825 Vermont Ave, Harbor City, CA 90710
(310) 325-5111 (phone) (310) 257-6499 (fax)
Education:
Medical School
University of California, Davis School of Medicine
Graduated: 1980
Languages:
English
Description:
Dr. Quan graduated from the University of California, Davis School of Medicine in 1980. He works in Harbor City, CA, and specializes in Radiology and Vascular & Interventional Radiology. Dr. Quan is affiliated with South Bay Medical Center.

Raymond Quan resumes & CV records

Resumes

Software Development Engineer

Location:
Seattle, WA
Industry:
Computer Software
Work:
Microsoft
Software Development Engineer
Intel Corporation Aug 2005 - Sep 2008
Component Design Engineer
Intel Corporation Aug 2000 - Aug 2005
System Validation Engineer
Education:
University of Washington 1995 - 2000
Bachelor of Science, Computer Engineering

Software Development Engineer at Microsoft

Position:
Software Development Engineer at Microsoft
Location:
Greater Seattle Area
Industry:
Computer Software
Work:
Microsoft since Oct 2008
Software Development Engineer
Intel Aug 2005 - Sep 2008
Component Design Engineer
Intel Corporation Aug 2000 - Aug 2005
System Validation Engineer
Education:
University of Washington 1995 - 2000
BS, Computer Engineering

Publications & IP owners

Us Patents

Navigational Aid For A Hinged Device Via Semantic Abstraction

US Patent:
2018021, Aug 2, 2018
Filed:
Jan 30, 2017
Appl. No.:
15/419287
Inventors:
- Redmond WA, US
Raymond Quan - Shoreline WA, US
Christian Klein - Duvall WA, US
Assignee:
Microsoft Technology Licensing, LLC - Redmond WA
International Classification:
G06F 3/0483
G06F 1/16
G06F 3/0488
G06F 17/21
Abstract:
Techniques for navigational aid for a hinged device via semantic abstraction are described. Generally, the techniques described herein improve a user experience when the user is navigating through content, such as user-generated content in an electronic document. For example, the techniques described herein semantically abstract authored content in an electronic document to provide abstracted content. In implementations, abstracted content includes abstracted pages that each represent a different section of the authored content. When a user scans through an electronic document, rather than scanning page by page, techniques described herein instead navigate through displays of the abstracted pages. In addition, a hinge between different displays can be used as an input mechanism to control a speed of navigating through the abstracted pages to allow a user to more easily locate specific sections of the authored content.
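The abstract above describes two pieces: reducing a document to one abstracted page per section, and mapping the hinge angle to a navigation speed. A minimal sketch of that idea, with all names and the angle-to-speed mapping being illustrative assumptions rather than the patented implementation:

```python
# Hypothetical sketch of the navigation idea in the abstract: a document is
# reduced to one abstracted page per section, and the hinge angle scales how
# fast the user scrubs through those pages. Names are illustrative only.

def abstract_pages(sections):
    """One abstracted page (here, a title placeholder) per content section."""
    return [f"Section: {title}" for title, _body in sections]

def pages_per_tick(hinge_angle, max_angle=180, max_speed=5):
    """Map the hinge angle to a navigation speed (pages advanced per tick)."""
    return max(1, int(hinge_angle / max_angle * max_speed))

doc = [("Intro", "..."), ("Methods", "..."), ("Results", "...")]
pages = abstract_pages(doc)   # 3 abstracted pages, one per section
speed = pages_per_tick(90)    # half-open hinge -> moderate scrub speed (2)
```

The key design point from the abstract is that scrubbing moves over the abstracted pages, not the raw pages, so a wider hinge angle skips sections faster.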

Input Based On Interactions With A Physical Hinge

US Patent:
2018011, Apr 26, 2018
Filed:
Oct 25, 2016
Appl. No.:
15/333814
Inventors:
- Redmond WA, US
Raymond Quan - Seattle WA, US
Ricardo A. Espinoza Reyes - Redmond WA, US
Gregg Robert Wygonik - Duvall WA, US
Assignee:
Microsoft Technology Licensing, LLC - Redmond WA
International Classification:
G06F 3/0338
G06F 1/16
G06F 3/038
G06F 3/0346
Abstract:
Techniques for input based on interactions with a physical hinge are described. Generally, a new class of interactions involves user manipulation of a physical hinge in order to provide input to a computing device. These hinge-based interactions provide input to a computing system that can be leveraged to initiate one or more system-level commands or operations, initiate transitions between discrete views of content, interact with content displayed via one or more display devices, and so on. In an example, a sequence of two or more consecutive hinge angle changes is recognized as a hinge gesture to perform a particular operation, such as a transition between a single-tasking state and a multitasking state.
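The core mechanism in this abstract is recognizing "a sequence of two or more consecutive hinge angle changes" as a gesture. A hedged sketch of that recognition step, where the function names, threshold, and gesture table are all assumptions for illustration:

```python
# Hypothetical sketch of the hinge-gesture idea in the abstract: raw hinge
# angle samples are reduced to a sequence of direction changes, and that
# sequence is matched against registered gestures. Not Microsoft's actual
# implementation; all names and values are illustrative.

def detect_hinge_gesture(angle_samples, min_delta=15):
    """Reduce raw hinge-angle samples to a sequence of direction changes."""
    directions = []
    for prev, cur in zip(angle_samples, angle_samples[1:]):
        delta = cur - prev
        if abs(delta) < min_delta:
            continue  # ignore jitter below the threshold
        direction = "open" if delta > 0 else "close"
        if not directions or directions[-1] != direction:
            directions.append(direction)  # record only changes of direction
    return directions

# Per the abstract, a recognized sequence can map to a system operation,
# e.g. transitioning between single-tasking and multitasking states.
GESTURES = {("close", "open"): "toggle_multitasking"}

samples = [180, 150, 120, 150, 180]  # fold the device partway, then reopen
gesture = tuple(detect_hinge_gesture(samples))
print(GESTURES.get(gesture))  # -> toggle_multitasking
```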

Gesture Language For A Device With Multiple Touch Surfaces

US Patent:
2018006, Mar 8, 2018
Filed:
Sep 6, 2016
Appl. No.:
15/257179
Inventors:
- Redmond WA, US
Gregg Robert Wygonik - Duvall WA, US
Ricardo A. Espinoza Reyes - Redmond WA, US
Raymond Quan - Seattle WA, US
Sophors Khut - Seattle WA, US
Assignee:
Microsoft Technology Licensing, LLC - Redmond WA
International Classification:
G06F 3/0488
G06F 3/0346
G06F 1/16
G06F 3/0484
G06F 3/041
Abstract:
A gesture language for a device with multiple touch surfaces is described. Generally, a series of new touch input models is described that includes touch input interactions on two disjoint touch-sensitive surfaces. For example, a mobile device can include a primary display on a “front” side of the device, and a secondary display or touch-sensitive surface on the “back” side of the device, such as a surface that is opposite the primary display. Accordingly, the gesture language can include a series of “back touch” interactions with the touch-sensitive surface on the backside of the device. Example interactions include direct and indirect touch input on the back side, as well as simultaneous touch input on both sides of the device.
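The gesture language above hinges on which surface an input arrives on: front only, back only, or both simultaneously. A minimal sketch of that routing, assuming hypothetical names (the patent does not publish an API):

```python
# Illustrative sketch of the two-surface gesture classes in the abstract:
# touches are tagged by surface, and simultaneous input on both disjoint
# touch-sensitive surfaces forms its own gesture class. All names are
# assumptions, not the patented implementation.

def classify_touch(front_touch, back_touch):
    """Map which surfaces are touched to a gesture class."""
    if front_touch and back_touch:
        return "dual-surface"   # simultaneous input on both sides
    if back_touch:
        return "back-touch"     # direct or indirect input on the rear surface
    if front_touch:
        return "front-touch"    # ordinary input on the primary display
    return "none"

print(classify_touch(front_touch=True, back_touch=True))  # -> dual-surface
```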

Hover Controlled User Interface Element

US Patent:
2016002, Jan 28, 2016
Filed:
Oct 8, 2015
Appl. No.:
14/878153
Inventors:
- Redmond WA, US
Dan Hwang - New Castle WA, US
Bo-June Hsu - Woodinville WA, US
Raymond Quan - Seattle WA, US
Eric Badger - Redmond WA, US
Jose Rodriguez - Seattle WA, US
Peter Gregory Davis - Kirkland WA, US
International Classification:
G06F 3/0489
G06F 3/0488
Abstract:
Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state. Selectively controlling the activation, display, and deactivation of the user interface elements includes allocating display space on the input/output interface to the user interface elements when they are needed for an operation on the apparatus and selectively reclaiming space on the input/output interface allocated to the user interface elements when they are not needed for an operation on the apparatus.
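The abstract describes a decision made "as a function of the characterization data and interface state": display a UI element (allocating space) when it is needed, reclaim the space when it is not. A hedged sketch of that decision, with the state names and characterization keys invented for illustration:

```python
# Illustrative sketch of the hover-controlled UI pattern in the abstract:
# characterization data about an object in the hover-space (independent of
# its position) plus the current interface state decide whether a UI
# element's display space is allocated or reclaimed. Names are hypothetical.

def update_ui_element(hover_detected, characterization, interface_state):
    """Return the next state for a hover-activated UI element."""
    needed = (interface_state == "editing"
              and characterization.get("object") == "finger")
    if hover_detected and needed:
        return "displayed"   # allocate display space for the element
    return "reclaimed"       # reclaim the space when the element isn't needed

state = update_ui_element(
    hover_detected=True,
    characterization={"object": "finger"},
    interface_state="editing",
)
print(state)  # -> displayed
```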

Hover Controlled User Interface Element

US Patent:
2015008, Mar 19, 2015
Filed:
Sep 16, 2013
Appl. No.:
14/027533
Inventors:
- Redmond WA, US
Dan Hwang - New Castle WA, US
Paul Hsu - Woodinville WA, US
Raymond Quan - Seattle WA, US
Eric Badger - Redmond WA, US
Jose Rodriguez - Seattle WA, US
Peter Gregory Davis - Kirkland WA, US
Assignee:
Microsoft Corporation - Redmond WA
International Classification:
G06F 3/0488
US Classification:
715/767; 715/773
Abstract:
Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state. Selectively controlling the activation, display, and deactivation of the user interface elements includes allocating display space on the input/output interface to the user interface elements when they are needed for an operation on the apparatus and selectively reclaiming space on the input/output interface allocated to the user interface elements when they are not needed for an operation on the apparatus.

NOTICE: You may not use PeopleBackgroundCheck or the information it provides to make decisions about employment, credit, housing or any other purpose that would require Fair Credit Reporting Act (FCRA) compliance. PeopleBackgroundCheck is not a Consumer Reporting Agency (CRA) as defined by the FCRA and does not provide consumer reports.