Shane F Duffy, Age 56, Topanga, CA

Shane Duffy Phones & Addresses

Topanga, CA

Los Angeles, CA

Mentions for Shane F Duffy

Career records & work history

Lawyers & Attorneys


Shane Duffy - Lawyer

ISLN:
1000970216
Admitted:
2020

Publications & IP owners

US Patents

Elastomeric Enteral Feeding Pump And Filling System

US Patent:
2023003, Feb 2, 2023
Filed:
Jul 30, 2021
Appl. No.:
17/389624
Inventors:
- Alpharetta GA, US
Shane A. Duffy - Irvine CA, US
Donald J. McMichael - Roswell GA, US
Hilton M. Kaplan - New York NY, US
International Classification:
A61J 15/00
Abstract:
A non-electrically driven enteral feeding pump including an expandable elastomeric bladder is provided. The enteral feeding pump includes a fluid delivery tube. The fluid delivery tube controls the flow rate of fluid from the pump. The present invention further describes an enteral feeding pump assembly including the portable enteral feeding pump and a peristaltic pump. The peristaltic pump can be configured to be operatively coupled to the elastomeric bladder of the enteral feeding pump for transferring fluid from a reservoir external to the elastomeric bladder into a chamber of the elastomeric bladder.

System And Method For Identifying And Navigating Anatomical Objects Using Deep Learning Networks

US Patent:
2020013, May 7, 2020
Filed:
May 17, 2018
Appl. No.:
16/618836
Inventors:
- Alpharetta GA, US
Shane A. Duffy - Irvine CA, US
International Classification:
A61B 8/00
A61B 8/08
G06N 3/08
G06N 20/00
G06F 3/01
Abstract:
A method for scanning, identifying, and navigating at least one anatomical object of a patient via an imaging system includes scanning the anatomical object via a probe of the imaging system, identifying the anatomical object via the probe, and navigating the anatomical object via the probe. Further, the method includes collecting data relating to operation of the probe during the scanning, identifying, and navigating steps. Moreover, the method includes inputting the collected data into a deep learning network configured to learn the scanning, identifying, and navigating steps. In addition, the method includes generating a probe visualization guide for an operator based on the deep learning network. Thus, the method also includes displaying the probe visualization guide to the operator via a user display of the imaging system, wherein the probe visualization guide instructs the operator how to maneuver the probe so as to locate the anatomical object. In addition, the method also includes using haptic feedback in the probe to guide the operator to the anatomical object of the patient.
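The abstract above describes a collect-data, train-network, guide-operator loop. Below is a minimal illustrative sketch of how such a probe-guidance pipeline might be wired up, assuming a PyTorch-style model; the class, feature layout, and training data here are hypothetical and not taken from the patent.

```python
# Illustrative sketch only (not the patented implementation): a small model
# that maps recorded probe telemetry to a suggested probe adjustment,
# mirroring the "collect data -> train deep network -> guide operator" flow.
import torch
import torch.nn as nn

class ProbeGuidanceNet(nn.Module):
    """Maps probe state features to a suggested 6-DOF probe motion."""
    def __init__(self, n_features=16, n_outputs=6):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_outputs),
        )

    def forward(self, x):
        return self.layers(x)

def train(model, features, expert_motions, epochs=100, lr=1e-3):
    """Fit the model to motions recorded during expert scans."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(features), expert_motions)
        loss.backward()
        opt.step()
    return model

if __name__ == "__main__":
    # Synthetic stand-in data: 256 recorded probe states and expert motions.
    feats = torch.randn(256, 16)
    motions = torch.randn(256, 6)
    net = train(ProbeGuidanceNet(), feats, motions)
    # At scan time the predicted motion would drive the on-screen visualization
    # guide (and, per the abstract, haptic feedback in the probe).
    print(net(feats[:1]))
```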

Articulating Arm For Analyzing Anatomical Objects Using Deep Learning Networks

US Patent:
2020002, Jan 30, 2020
Filed:
Mar 12, 2018
Appl. No.:
16/500456
Inventors:
- Alpharetta GA, US
Shane A. Duffy - Irvine CA, US
International Classification:
A61B 8/00
G06N 3/08
G06N 3/04
G16H 50/20
Abstract:
The present invention is directed to a method for scanning, identifying, and navigating anatomical object(s) of a patient via an articulating arm of an imaging system. The method includes scanning the anatomical object via a probe of the imaging system, identifying the anatomical object, and navigating the anatomical object via the probe. The method also includes collecting data relating to the anatomical object during the scanning, identifying, and navigating steps. Further, the method includes inputting the collected data into a deep learning network configured to learn the scanning, identifying, and navigating steps relating to the anatomical object. Moreover, the method includes controlling the probe via the articulating arm based on the deep learning network.
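As a rough sketch of the "controlling the probe via the articulating arm" step described above, the snippet below shows one simple way a suggested probe pose could be turned into small, clamped arm motions. The pose representation, gains, and limits are assumptions for illustration, not details from the patent.

```python
# Illustrative sketch only: a proportional closed-loop step that moves an
# articulating arm toward a probe pose suggested by a learned model.
import numpy as np

def arm_step(current_pose, target_pose, gain=0.2, max_step=0.01):
    """Step the arm toward the target pose.

    Poses are 6-vectors: x, y, z (meters) and roll, pitch, yaw (radians).
    Each step is clamped so the arm moves in small, safe increments.
    """
    error = np.asarray(target_pose) - np.asarray(current_pose)
    step = np.clip(gain * error, -max_step, max_step)
    return current_pose + step

if __name__ == "__main__":
    pose = np.zeros(6)
    target = np.array([0.05, -0.02, 0.01, 0.0, 0.1, 0.0])  # pose from the model
    for _ in range(20):
        pose = arm_step(pose, target)
    print(pose)
```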

System And Method For Navigation To A Target Anatomical Object In Medical Imaging-Based Procedures

US Patent:
2019035, Nov 21, 2019
Filed:
Jun 29, 2017
Appl. No.:
15/736129
Inventors:
- Alpharetta GA, US
Shane A. Duffy - Irvine CA, US
Kenneth C. Hsu - Tustin CA, US
International Classification:
G06T 7/73
A61B 34/20
Abstract:
The present invention is directed to a system and method for providing navigational directions to a user to locate a target anatomical object during a medical procedure via a medical imaging system. The method includes selecting an anatomical region surrounding the object; generating a plurality of real-time two-dimensional images of scenes from the anatomical region and providing the plurality of images to a controller; developing and training a deep learning network to automatically detect and identify the scenes from the anatomical region; automatically mapping each of the plurality of images from the anatomical region based on a relative spatial location and a relative temporal location of each of the identified scenes in the anatomical region via the deep learning network; and providing directions to the user to locate the object during the medical procedure based on the relative spatial and temporal locations of each of the identified scenes.
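To make the "relative spatial and temporal locations of each of the identified scenes" idea concrete, here is a toy sketch: once a classifier has named the scene in the current frame, a simple ordered map of scenes along the scan path can yield coarse directions toward the target. The scene labels and ordering below are invented for illustration.

```python
# Illustrative sketch only: directions derived from the relative order of
# identified scenes along a scan path. Scene names are hypothetical.
from collections import OrderedDict

SCENE_MAP = OrderedDict([
    ("skin_surface", 0),
    ("muscle_layer", 1),
    ("vessel_bundle", 2),
    ("target_nerve", 3),
])

def direction_to_target(current_scene, target_scene="target_nerve"):
    """Return a coarse instruction based on relative scene positions."""
    delta = SCENE_MAP[target_scene] - SCENE_MAP[current_scene]
    if delta == 0:
        return "Hold position: target scene in view."
    move = "Advance" if delta > 0 else "Withdraw"
    return f"{move} the probe {abs(delta)} scene(s) toward {target_scene}."

if __name__ == "__main__":
    # In the described system the current scene would come from the deep
    # learning classifier; here it is hard-coded for illustration.
    print(direction_to_target("muscle_layer"))
```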

System And Method For Automatic Detection, Localization, And Semantic Segmentation Of Anatomical Objects

US Patent:
2019031, Oct 10, 2019
Filed:
Jun 29, 2017
Appl. No.:
16/315237
Inventors:
- Alpharetta GA, US
Kenneth C. Hsu - Tustin CA, US
Shane A. Duffy - Irvine CA, US
Dominique J. Fantasia - Irvine CA, US
Steve S. Khalaj - Laguna Hills CA, US
Hasnain Somji - Irvine CA, US
Aimee T. Bui - Aliso Viejo CA, US
Shirzad Shahriari - Irvine CA, US
Ambar A. Avila - Irvine CA, US
Joost L. Mulders - Costa Mesa CA, US
International Classification:
G06T 7/00
G06T 7/73
G06T 7/11
G06K 9/72
G06K 9/62
Abstract:
The present invention is directed to a system and method for automatic detection, localization, and semantic segmentation of at least one anatomical object in a parameter space of an image generated by an imaging system. The method includes generating the image via the imaging system and providing the image of the anatomical object and surrounding tissue to a processor. Further, the method includes developing and training a parameter space deep learning network comprising convolutional neural networks to automatically detect the anatomical object and the surrounding tissue of the parameter space of the image. The method also includes automatically locating and segmenting, via additional convolutional neural networks, the anatomical object and surrounding tissue of the parameter space of the image. Moreover, the method includes automatically labeling the identified anatomical object and surrounding tissue on the image. Thus, the method also includes displaying the labeled image to a user in real time.
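The pipeline described above (detect, localize, segment, label, display) maps naturally onto a per-pixel classifier. The toy network below is a minimal stand-in, not the patented architecture; the class count and layer sizes are assumptions chosen only to keep the example self-contained.

```python
# Illustrative sketch only: a tiny fully convolutional network producing a
# per-pixel class map, mirroring the detect/localize/segment/label pipeline.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Per-pixel classes: background, anatomical object, surrounding tissue."""
    def __init__(self, n_classes=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, n_classes, 1),  # per-pixel class logits
        )

    def forward(self, x):
        return self.net(x)

def segment_and_label(model, image):
    """Return a per-pixel label mask for a single-channel image frame."""
    with torch.no_grad():
        logits = model(image.unsqueeze(0))        # (1, C, H, W)
        mask = logits.argmax(dim=1).squeeze(0)    # (H, W) class indices
    return mask

if __name__ == "__main__":
    frame = torch.randn(1, 128, 128)              # stand-in ultrasound frame
    mask = segment_and_label(TinySegNet(), frame)
    # In the described system the labeled mask would be drawn over the live
    # image and shown to the user in real time.
    print(mask.shape, mask.unique())
```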

Motor-Assisted Needle Guide Assembly For Ultrasound Needle Placement

US Patent:
2019015, May 30, 2019
Filed:
Aug 2, 2017
Appl. No.:
16/321154
Inventors:
- Alpharetta GA, US
Shane A. Duffy - Irvine CA, US
International Classification:
A61B 8/08
A61B 17/34
Abstract:
The present invention is directed to an ultrasound imaging system having a motor-assisted needle guide assembly for easier needle placement during an ultrasound-guided medical procedure. The ultrasound imaging system includes an ultrasound probe having a transducer housing, a transducer transmitter, a needle guide assembly communicatively coupled to the ultrasound probe, at least one actuator component configured with the needle guide assembly, and a controller. Thus, the controller is configured to determine an insertion angle and a lateral position for the needle guide assembly with respect to the ultrasound probe based on the target site and control the actuator component based on the insertion angle and the lateral position so as to locate the needle guide assembly at the target site during a medical procedure.
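As a rough illustration of the controller's job of computing "an insertion angle and a lateral position for the needle guide assembly" from a target site, the sketch below uses simple plane geometry. The parameter names and the 10 mm guide offset are hypothetical, introduced only to make the example runnable.

```python
# Illustrative sketch only: deriving needle-guide setpoints from a target
# location in the ultrasound image plane. Values and names are hypothetical.
import math

def needle_guide_setpoints(target_depth_mm, target_lateral_mm, guide_offset_mm=10.0):
    """Compute insertion angle (degrees) and guide lateral position (mm).

    target_depth_mm: depth of the target below the transducer face.
    target_lateral_mm: lateral distance of the target from the probe center.
    guide_offset_mm: distance of the needle entry point from the probe center.
    """
    # Horizontal run from the needle entry point to the point above the target.
    run = guide_offset_mm + target_lateral_mm
    angle_deg = math.degrees(math.atan2(target_depth_mm, run))
    lateral_position_mm = target_lateral_mm  # align the guide over the target
    return angle_deg, lateral_position_mm

if __name__ == "__main__":
    angle, lateral = needle_guide_setpoints(target_depth_mm=30.0, target_lateral_mm=5.0)
    # These setpoints would then drive the actuator that positions the guide.
    print(f"insertion angle: {angle:.1f} deg, lateral position: {lateral:.1f} mm")
```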
