Thread starter: WCYue

Unistellar Documents

Posted by the thread starter on 2022-7-14 17:55:06
Last edited by WCYue on 2022-7-14 18:03

Other related patent classifications

G02B23/10 https://patents.google.com/?q=G02B23%2f10

Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices involving prisms or mirrors reflecting into the field of view additional indications, e.g. from collimator

G01C21/02 https://patents.google.com/?q=G01C21%2f02

Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means

G02B27/0093 https://patents.google.com/?q=G02B27%2f0093

Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking

G02B27/026 https://patents.google.com/?q=G02B27%2f026

Viewing apparatus comprising a light source, e.g. for viewing photographic slides, X-ray transparencies, and a display device, e.g. CRT, LCD, for adding markings or signs or to enhance the contrast of the viewed object

G09B27/00 https://patents.google.com/?q=G09B27%2f00

Planetaria; Globes

G09B27/04 https://patents.google.com/?q=G09B27%2f04

Star maps

G02B2027/0138 https://patents.google.com/?q=G02B2027%2f0138

Head-up displays characterised by optical features comprising image capture systems, e.g. camera

G02B2027/014 https://patents.google.com/?q=G02B2027%2f014

Head-up displays characterised by optical features comprising information/image processing systems

G02B2027/0141 https://patents.google.com/?q=G02B2027%2f0141

Head-up displays characterised by optical features characterised by the informative content of the display
Posted by the thread starter on 2022-7-14 18:10:26
Method and system for stereoscopic vision of a celestial observation scene

Abstract translated from French

The invention relates to a method for stereoscopic vision of a celestial observation scene, comprising the steps of: acquiring an image of the observation scene by means of an optical vision device; generating two homologous images of one or more stars contained in the observation scene, respectively a first image and a second image, said first image and said second image being offset from each other; and displaying the first images and the second images so as to obtain a stereoscopic vision of the observation scene. The process is remarkable in that it comprises the following steps: identifying one or more stars in the acquired image of the observation scene; evaluating the distance between each identified star and the device; and, for one or more identified stars, horizontally shifting the first image and the second image by a distance which is a function of the evaluated distance of said star.

Unistellar FR3112401A1.pdf (1.6 MB, downloads: 23)

Worldwide applications
2020  FR

Application FR2007258A events
2020-07-09 Application filed by Unistellar
2020-07-09 Priority to FR2007258
2020-07-09 Priority to FR2007258A
2022-01-14 Publication of FR3112401A1
Status Pending
Posted by the thread starter on 2022-7-14 19:50:54
Last edited by WCYue on 2022-7-14 19:58

Description (translated from French): Method and system for stereoscopic vision of a celestial observation scene

Technical area

The objects of the invention are a method and a system for the stereoscopic vision of a celestial observation scene.

The invention relates to the technical field of optical instruments for observing stars.

State of the art.

Stereoscopy makes it possible to reproduce a perception of relief from two flat images. It is based on the fact that the human perception of relief is formed in the brain when it reconstructs a single image from the perception of the two flat and different images coming from each eye.

Methods of stereoscopic vision of a scene observed by an optical vision device are known from the prior art. The patent document US2016/0170223 (LEDERMAN) describes for example a binocular telescope allowing stereoscopic vision of a celestial observation scene, that is to say of stars located at “infinity”. For each star observed, a first image is formed in one eyepiece and a second image is formed in the other eyepiece, these two images being offset horizontally from one another. This shift is achieved by means of one or more light shifters configured to horizontally shift part of the light incident on at least one of the eyepieces, so that the images formed in the left eye and the right eye are slightly different. The shift between the images can be controlled by varying the shift angle of the light shifters. The resulting image is perceived as a three-dimensional image, or one in which the observed object appears offset towards the viewer relative to its surroundings, simulating a three-dimensional effect.

In this document US2016/0170223, the shifting of the images is arbitrary, in the sense that it is the arrangement of the light-shifting devices which induces the shifting of the images. Thus, the images of the stars located in the center of the observation scene are shifted so that those stars are perceived in the “foreground”, and the images of the stars located on the edges of the observation scene are shifted so that those stars are perceived in the “background”. However, this arbitrary highlighting does not reflect astronomical reality, because the stars located in the center of the observation scene can actually be much farther away than the stars located on the periphery of said scene.

An object of the invention is to remedy all or part of the aforementioned drawbacks.

An additional objective of the invention is to propose a method making it possible to obtain, in a simple and rapid manner, a stereoscopic vision of a celestial observation scene which reflects astronomical reality.

Another objective of the invention is to propose a method making it possible to obtain, in a simple and rapid manner, a stereoscopic vision of a celestial observation scene which is more precise than those obtained with the techniques of the prior art.

Yet another object of the invention is to provide a system for stereoscopic vision of a celestial observation scene which is simple in design, inexpensive and easy to use.

Presentation of the invention.

The solution proposed by the invention is a method for stereoscopic vision of a celestial observation scene, comprising the steps of:
- acquiring an image of the observation scene by means of an optical vision device,
- generating two homologous images of one or more stars contained in the observation scene, respectively a first image and a second image, said first image and said second image being offset from each other,
- displaying the first images and the second images so as to obtain a stereoscopic vision of the observation scene.
This process is remarkable in that it comprises the following steps:
- identifying one or more stars in the acquired image of the observation scene,
- evaluating the distance between each identified star and the device,
- for one or more identified stars, horizontally shifting the first image and the second image by a distance which is a function of the evaluated distance of said star.

The relief effect is now given by an image shift resulting from the analysis of the distances of the stars identified in the image of the observation scene. It is therefore the distance separating a star from the optical vision device that determines the relief effect and no longer the simple position of this star in the image of the observation scene. The observer can thus simply and quickly obtain a stereoscopic vision of the scene which accurately reflects astronomical reality.
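As an illustration only, the distance-driven shift described above can be sketched in a few lines of Python. The `Star` structure, the linear distance-to-shift mapping and the 8-pixel maximum shift are assumptions for the sketch, not details taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Star:
    name: str
    x: float            # horizontal pixel position in the acquired image
    y: float            # vertical pixel position
    distance_km: float  # evaluated distance between the star and the device

def stereo_offsets(stars, max_shift_px=8.0):
    """Map each star's evaluated distance to a horizontal shift:
    the nearest star gets the full shift, the farthest gets none."""
    if not stars:
        return {}
    d_min = min(s.distance_km for s in stars)
    d_max = max(s.distance_km for s in stars)
    span = (d_max - d_min) or 1.0
    return {s.name: max_shift_px * (d_max - s.distance_km) / span for s in stars}

def homologous_images(stars):
    """Build the two homologous images: each star is shifted left in the
    first image and right in the second, by half its computed offset."""
    shifts = stereo_offsets(stars)
    first = [(s.name, s.x - shifts[s.name] / 2, s.y) for s in stars]
    second = [(s.name, s.x + shifts[s.name] / 2, s.y) for s in stars]
    return first, second
```

Displayed to the left and right eyes respectively, the two point lists reproduce the relief effect: the nearest body receives the largest shift and is perceived in the foreground.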

Other advantageous characteristics of the method according to the invention are listed below. Each of these characteristics can be considered alone or in combination with the remarkable characteristics defined above. Each of these characteristics contributes, where appropriate, to the resolution of specific technical problems defined further on in the description, to which the remarkable characteristics defined above do not necessarily contribute. The latter may form the subject, where appropriate, of one or more divisional patent applications:
- According to one embodiment, the identification of a star in the image of the observation scene is carried out by means of a computer application for recognizing objects.
- According to one embodiment, the method comprises the steps of: saving star records in a database, each star record being selectable and associated with the real-time celestial coordinates of said star and with identification data; selecting a star record in the database; actuating an orientation device so that the apparatus is oriented towards the star corresponding to the selected record, according to the celestial coordinates associated with said record.
- According to one embodiment, the method comprises the steps of: actuating the orientation device so that the star corresponding to the selected record lies in a predefined area of the image acquired from the observation scene; identifying said star by correlating the identification data associated with the selected record and the position of this star in the acquired image of the observation scene.
- According to one embodiment, the orientation of the device is also carried out by correlating terrestrial location data of said device and orientation data of said device.
- According to one embodiment, the method comprises a step consisting in identifying one or more other stars in the image of the observation scene, which identification is carried out by means of a computer application for recognizing objects.
- According to one embodiment, the method comprises the steps consisting in: - recording in a database recordings of stars, each recording of a star being associated with data on the distance of said star relative to the Earth; - for each star identified, search the database for the corresponding record and extract the distance data; - evaluate the distance between each identified star and the device, according to the distance data extracted.
- According to one embodiment, the method comprises a step consisting in applying a mathematical function to determine the value of the offset distances of the images from the evaluated distances, which function varies according to the standard deviation of said evaluated distances.
- According to one embodiment, the method comprises a step consisting in evaluating in a relative manner the distance between each identified star and the device so as to classify said identified stars in ascending or descending order of distance from said device, which relative evaluation is carried out according to the nature of said identified stars.
- According to one embodiment, the method comprises a step consisting in shifting the first image and the second image of an identified star by a fixed value, which value depends on the classification of said star.
- According to one embodiment, the method comprises the steps of: - defining a reference plane; - for an identified star located behind the reference plane, shifting the first image and the second image so that said images have a positive parallax; - for an identified star located in front of the reference plane, shifting the first image and the second image so that said images have a negative parallax; - for an identified star located in the reference plane, not shifting the first image and the second image so that said images have zero parallax.
- According to one embodiment, the method comprises a step consisting in defining the reference plane at infinity.
- According to one embodiment, the method comprises a step consisting in making the reference plane coincide with an identified star.
- According to one embodiment, the method comprises the step of displaying the first images on a first screen installed in a first eyepiece of a display device integrated or connected to the device and displaying the second images on a second screen installed in a second eyepiece of said viewing device, so as to obtain a stereoscopic view of the observation scene when an observer positions his eyes in front of each of said eyepieces.
- According to one embodiment, the method comprises the steps of: - displaying on a screen the first images and the second images in the form of an anaglyph image; - place anaglyphic filters in front of the eyes of the observer so that said observer has a stereoscopic vision of the observation scene when he looks at the screen.
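The reference-plane convention in the items above can be sketched as follows. This is an illustrative reading only: the function names and the half-and-half split of the shift between the two images are assumptions, not prescribed by the patent:

```python
def parallax_sign(star_distance, reference_distance):
    """Positive parallax for a star behind the reference plane, negative
    in front of it, zero for a star lying in the plane."""
    if star_distance > reference_distance:
        return 1
    if star_distance < reference_distance:
        return -1
    return 0

def shift_pair(x, star_distance, reference_distance, magnitude_px):
    """Horizontal positions of a star in the first and second images."""
    s = parallax_sign(star_distance, reference_distance)
    return (x - s * magnitude_px / 2, x + s * magnitude_px / 2)
```

With the reference plane at infinity, every identified star lies in front of it and gets negative parallax; making the plane coincide with an identified star pins that star at zero parallax.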

Another aspect of the invention relates to a system for stereoscopic vision of a celestial observation scene, the system comprising:
- an optical vision device adapted to acquire an image of the observation scene,
- a means for generating two homologous images of one or more stars contained in the observation scene, respectively a first image and a second image, said first image and said second image being offset from each other,
- A means for displaying the first images and the second images so as to obtain a stereoscopic view of the observation scene.
This system is remarkable in that the device includes a processing unit suitable for:
- identifying one or more stars in the acquired image of the observation scene,
- evaluating the distance between each identified star and the device,
- for one or more identified stars, horizontally shifting the first image and the second image by a distance which is a function of the evaluated distance of said star.

Other advantageous characteristics of the system according to the invention are listed below. Each of these characteristics can be considered alone or in combination with the remarkable characteristics defined above. Each of these characteristics contributes, where appropriate, to the resolution of specific technical problems defined further on in the description, to which the remarkable characteristics defined above do not necessarily contribute. The latter may form the subject, where appropriate, of one or more divisional patent applications:
- According to one embodiment, the device integrates or is connected to a display device comprising two eyepieces, respectively a first eyepiece and a second eyepiece, each eyepiece being provided with a display screen; the processing unit is adapted to display the first images on the screen of the first eyepiece and to display the second images on the screen of the second eyepiece.
- According to one embodiment, the apparatus comprises a screen on which the first images and the second images are displayed in the form of an anaglyph image.
- According to one embodiment, the device is a telescope.
Posted by the thread starter on 2022-7-14 20:00:55
Description (translated from French): Method and system for stereoscopic vision of a celestial observation scene

Brief description of figures

Other advantages and characteristics of the invention will become more apparent on reading the description of a preferred embodiment which follows, with reference to the appended drawings, provided by way of indicative and non-limiting examples, and in which:
schematizes an optical vision system suitable for implementing the invention, according to a first embodiment.
illustrates an image shift according to one embodiment.
illustrates a shift of the images according to another embodiment.
illustrates the images formed in two eyepieces.
schematizes an optical vision system suitable for implementing the invention, according to a second embodiment.
illustrates the images displayed on a screen of the device of the .
schematizes an optical vision system suitable for implementing the invention, according to a third embodiment.
schematizes an optical vision system suitable for implementing the invention, according to a fourth embodiment.

Description of embodiments

The method and the system which are the subject of the invention are capable of generating manipulations of physical elements, in particular (electrical) signals and digital data, capable of being stored, transferred, combined, compared, etc., allowing a desired result to be achieved.

The invention implements one or more computer applications executed by computer equipment. For the sake of clarity, it should be understood within the meaning of the invention that "a piece of equipment does something" means "the computer application executed by a processing unit of the piece of equipment does something". Just as "the computer application does something" means "the computer application executed by the processing unit of the equipment does something".

Again for the sake of clarity, the present invention may refer to one or more “computer processes”. These correspond to the actions or results obtained by the execution of instructions from one or more computer applications. Likewise, it must be understood within the meaning of the invention that “a computer process is adapted to do something” means “the instructions of a computer application executed by a processing unit do something”.

Again for the sake of clarity, the following clarifications are made to certain terms used in the description and the claims:
- "Computer resource" can be understood in a non-limiting way as: component, hardware, software, file, connection to a computer network, amount of RAM memory, hard disk space, bandwidth, processor speed, number of CPUs, etc.
- "Computer server" can be understood in a non-limiting way as: computer device (hardware or software) comprising computer resources to perform the functions of a server and which offers services, computer, plurality of computers, virtual server on the internet, virtual server on a cloud, virtual server on a platform, virtual server on local infrastructure, server network, cluster, node, server farm, node farm, etc.
- "Processing unit" can be understood in a non-limiting manner as: processor, microprocessor, CPU (Central Processing Unit).
- "Computer application" can be understood as: software, computer program product, computer program or software, the instructions of which are notably executed by a processing unit.
- "Communication network" can be understood in a non-limiting way as: internet network, cellular network, satellite network, etc. It is a set of computer equipment linked together to exchange, securely or not, information and/or data according to a communication protocol (ISDN, Ethernet, ATM, IP, CLNP, TCP, HTTP, etc.) and/or through network technologies such as, but not limited to, GSM, EDGE, 2G, 3G, 4G, 5G, etc.
- "Database" can be understood in a non-limiting manner as a structured and organized set of data recorded on media accessible by computer equipment and in particular by computer servers, and which can be queried, read and updated. Data can be inserted, retrieved, modified and/or destroyed. Management and access to the database can be provided by a set of computer applications that constitute a database management system (DBMS).
- As used herein, unless otherwise specified, the use of the ordinal adjectives "first", "second", etc., to describe an object merely indicates that different occurrences of similar objects are mentioned and does not imply that the objects thus described must be in a given sequence, whether in time, in space, in a classification or in any other way.
- Similarly, the use of the adjectives "right/left", "in front/behind", etc., makes it possible to simply describe the position of an object in the configuration of the appended figures, but does not necessarily imply that, in practice, similar objects are in the same position.
- “X and/or Y” means: X alone or Y alone or X+Y.
- In general, it will be appreciated that the various drawings are not drawn to scale from one figure to another nor within a given figure, and in particular that the objects are arbitrarily drawn to facilitate reading of the drawings.

The device 1 that is the object of the invention is mainly used for the observation of stars such as planets, comets, nebulae and galaxies, and in general celestial - or astronomical - objects, near or distant (in particular deep-sky objects).

It is preferably a telescope, but the device can also be in the form of a camera or a video camera. For the sake of clarity, and by way of illustrative example only, the rest of the description only refers to a telescope suitable for observing a celestial observation scene, that is to say part of the celestial vault.

According to the embodiment shown, the telescope 1 notably comprises a hollow body 10, an optical system 11 and a sensor 12.

The hollow body 10 is for example in the form of a hollow tube of circular section, but could be a tube of oval, square, octagonal or other section. It is specified that the hollow body 10 is not necessarily tubular in shape, but may be conical in shape, or formed from portions of tubes or cones for example. The hollow body 10 can be made of metal, plastic material, composite material, etc. For example, its length is between 200 mm and 1000 mm, its diameter is between 50 mm and 500 mm and its thickness is between 1 mm and 10 mm.

The light rays R coming from the stars A1, A2, A3 contained in the observation scene S enter the tube 10 and are then reflected by a primary mirror 11, which is advantageously in the form of a concave parabolic mirror with pure reflection. The light rays R reflected by the mirror 11 form, in a focal plane Pf, an image of the observed stars A1, A2, A3.

The sensor 12 is centered on the optical axis and placed in the focal plane Pf so as to acquire the image of the observation scene. The sensor 12 is preferably a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) sensor comprising an arrangement of pixels (preferably generating color images). This type of sensor 12 has reduced dimensions, which allows it to be easily installed in the tube 10, while maintaining optimum luminosity. The diameter of the sensor 12 is for example between 15 mm and 30 mm.

The image data generated by the sensor 12 are transmitted to a processing unit 13. The connection between the sensor 12 and the processing unit 13 can be made by wire, or by a wireless link, for example according to a proximity communication protocol, such as, by way of non-limiting example, the Bluetooth®, Wifi®, ZigBee® protocol. The processing unit 13 is adapted to process the image data in real time, according to a computer process described later in the description, so as to allow a stereoscopic view of the observation scene S.

According to one embodiment, the telescope 1 also includes one or more of the following computing resources: one or more memories 14, a wireless communication module 15, a network interface 16.

The memory or memories 14 must be considered as a storage device also suitable for storing data and/or data files. This can be native memory or add-on memory such as a Secure Digital (SD) card.

The wireless communication module 15 is suitable for receiving and transmitting radiofrequency signals to communicate wirelessly with other equipment. These radiofrequency signals preferably use a Bluetooth® protocol, although other protocols such as ISM, Wifi®, ANT or ZigBee® can also be used.

The network interface 16 is suitable for establishing communication between the telescope 1 and a remote computer server and/or other remote electronic equipment, via a computer communication network. This network interface 16 can be directly integrated into the telescope 1 and take the form, for example, of a GSM (Global System for Mobile Communication) module, allowing it to connect to a mobile telephony network.

The telescope 1 can also advantageously incorporate a rechargeable power supply battery 17, so as to make said telescope completely autonomous.

One or more computer applications are stored in the memory or memories 14; their instructions, when executed by the processing unit 13, perform the functionalities described further in the description.
Posted by the thread starter on 2022-7-14 20:02:36
Last edited by WCYue on 2022-7-14 20:04

Description (translated from French): Method and system for stereoscopic vision of a celestial observation scene

Identification of stars - First embodiment

According to one embodiment, an object recognition computer application is directly implemented in the memory zone 14.

This object recognition computer application is based on an artificial intelligence model. This model can be based on machine learning algorithms, a neural network model, a discriminant analysis model, a search for isomorphism of graphs or sub-graphs, or a hidden Markov model; it accepts the image of a star as input and generates output data characterizing this star.

According to one embodiment, the learning of the artificial intelligence model is carried out beforehand on a remote computer server. The application based on this model is then implemented in the memory area 14, for example by downloading. At the time of its implementation in the telescope 1 and/or during the first use of said telescope, the object recognition computer application may be able to recognize a limited number of stars, for example between 50 and 1000 stars. This list of stars can then be enriched, for example by updates and/or downloads from remote databases hosted on a remote computer server.

In a complementary way, the processing unit 13 can also analyze the digital representation of the scene S, for example by performing a thresholding of said digital representation. The processing unit 13 can also apply filters to highlight details and/or detect the contours of the graphic representations of the stars A1, A2, A3.
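A minimal version of such a thresholding pass might look like the following sketch. The grayscale nested-list image representation and the 4-connected blob grouping are assumptions, not details from the patent:

```python
def detect_stars(image, threshold):
    """Keep pixels at or above the threshold, group 4-connected bright
    pixels into blobs, and return one (x, y) centroid per blob."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                seen[y][x] = True
                stack, pixels = [(y, x)], []
                while stack:  # iterative flood fill over bright pixels
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] and image[ny][nx] >= threshold:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append((sum(p[1] for p in pixels) / len(pixels),
                              sum(p[0] for p in pixels) / len(pixels)))
    return blobs
```

Each centroid is a candidate star position that the recognition application can then try to characterize.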

According to one embodiment, the stars capable of being recognized can be classified in tables or chained data structures each comprising one or more records of stars. A first structure can comprise one or more records respectively dedicated or associated with digital and/or graphic representations of planets. A second data structure can comprise one or more records respectively dedicated or associated with numerical and/or graphical representations of nebulae. A third structure may include one or more recordings respectively dedicated or associated with digital and/or graphic representations of galaxies. This is the case for each type of star. These different structures may, as a variant, constitute only one single entity.

In the example shown, the three celestial bodies included in the observation scene S are the planet Mars (A1), the star Aldebaran or Alpha Tauri (A2) and the Andromeda galaxy (A3). The processing unit 13 processes the image acquired by the sensor 12, so as to identify these three stars by means of the aforementioned object recognition computer application. This identification can thus be carried out in real time, as soon as the observer points the telescope 1 towards the scene S. One or more other stars can be included in the scene S and not be identified by the processing unit 13.
Identification of celestial bodies - Second embodiment

This identification method is based on the technique described in patent documents FR3054897 and/or US2019196173 to which those skilled in the art may refer.

The processing unit 13 is here connected to a database 18 in which records of stars (planets, comets, nebulae, galaxies, deep-sky celestial objects) are stored. This database 18 can be integrated into the telescope 1. In a variant embodiment, the database 18 is remote from the telescope 1, for example hosted on a remote server to which the processing unit 13 is connected. The connection of the processing unit 13 to the database 18 can in this case be implemented by means of the network interface 16, through a communication network.

Each star record is associated, in the database 18, with one or more characteristic elements of the corresponding star (for example its size, its pattern, its luminosity, etc.) and with real-time location data (or celestial coordinates) of said star.

The observer points the telescope 1 towards an observation scene S. The telescope 1 comprises means for assessing the location of the scene S. These means may consist of terrestrial location means 19 of the telescope 1 and means 20 for determining the orientation of said telescope. The location means 19 are preferably of the type using GPS, EGNOS, WAAS, GALILEO, etc. technology. The means 20 for determining the orientation of the telescope 1, and more particularly of its objective, may consist, by way of nonlimiting examples, of a compass and/or a magnetometer and/or an accelerometer and/or any other means cooperating with the processing unit 13, and producing data which, taken individually or combined, make it possible to assess the orientation of the telescope 1. The data collected from the location means 19 and the means 20 for determining the orientation of the telescope 1 are location data used to estimate positioning data of the scene S, that is to say its location in the celestial vault.
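The correlation of terrestrial location, orientation and time can be illustrated with the standard altitude-azimuth to equatorial transform. The patent gives no formulas; this sketch assumes the local sidereal time is supplied (for example derived from GPS time and longitude):

```python
import math

def altaz_to_radec(alt_deg, az_deg, lat_deg, lst_hours):
    """Estimate where the telescope points on the celestial vault from its
    orientation (altitude/azimuth, azimuth measured from north through
    east), its latitude, and the local sidereal time in hours.
    Returns (right ascension in hours, declination in degrees)."""
    alt, az, lat = map(math.radians, (alt_deg, az_deg, lat_deg))
    sin_dec = math.sin(alt) * math.sin(lat) + math.cos(alt) * math.cos(lat) * math.cos(az)
    dec = math.asin(sin_dec)
    cos_ha = (math.sin(alt) - math.sin(lat) * sin_dec) / (math.cos(lat) * math.cos(dec))
    ha = math.acos(max(-1.0, min(1.0, cos_ha)))  # clamp against rounding
    if math.sin(az) > 0:      # pointing east of the meridian: negative hour angle
        ha = -ha
    ra_hours = (lst_hours - math.degrees(ha) / 15.0) % 24.0
    return ra_hours, math.degrees(dec)
```

Comparing the estimated (RA, Dec) of the scene with the catalog coordinates of a record is what lets the unit decide which star the image contains.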

According to one embodiment, the detection of characteristic elements is done by thresholding as explained in the aforementioned patent documents FR3054897 and/or US2019196173. When a characteristic element is detected in the digital representation of the image of the scene S, the processing unit 13 searches in the database 18 for a recording of a star associated with a characteristic element similar or resembling the characteristic element detected. This step may consist in generating from the digital representation of the image of the scene S, a polyvector, for example a four-vector, describing an arrangement of stars. The processing unit 13 can then search in the database 18 for a structure of stars describing a polyvector similar to the polyvector identified within the digital representation of the image of the scene S. When this search step attests to the presence of such a recording, the processing unit 13 extracts from this recording the identifiers of the stars which compose it.

In the example shown, the processing unit 13 will identify a trivector formed by the stars A1, A2 and A3 and extract from the database 18 that these stars are respectively Mars, Aldebaran and Andromeda.
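A simplified stand-in for this polyvector matching uses scale- and rotation-invariant distance ratios; the descriptor and the database layout below are assumptions for illustration, not the actual technique of FR3054897:

```python
import itertools
import math

def asterism_key(points, ndigits=3):
    """Descriptor of a star arrangement: all pairwise distances normalised
    by the largest one and sorted, so the key survives rotation,
    translation and uniform scaling of the arrangement."""
    dists = sorted(math.dist(p, q) for p, q in itertools.combinations(points, 2))
    longest = dists[-1] or 1.0
    return tuple(round(d / longest, ndigits) for d in dists)

def identify(points, database):
    """Look up the detected arrangement in a catalog mapping
    descriptor -> list of star identifiers; None if absent."""
    return database.get(asterism_key(points))
```

A real matcher would tolerate detection noise (nearest-descriptor search instead of exact lookup), but the principle — describe the arrangement, then query the database of known arrangements — is the same.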
Star Identification - Third Embodiment

The database 18 here comprises records of stars identified by identification data, which records are associated with real-time location data (or celestial coordinates) of the corresponding star.

The observer selects a star record in the database 18 (for example star A2, which corresponds to Aldebaran). This selection can be made from a man-machine interface, for example from a touch screen integrated in the telescope 1 or from a mobile user terminal, such as a smartphone or a touch tablet, connected to the processing unit 13.

Telescope 1 will then itself point to the selected star. According to one embodiment, the processing unit 13 records a time datum t corresponding to the acquisition time, that is to say the moment when the observer selects the record in the database 18. The processing unit 13 then searches the database 18 for the celestial coordinates of the star selected at time t. Thanks to a correlation of the terrestrial location data of the telescope 1 (supplied for example by the aforementioned terrestrial location means 19) and the orientation data of said telescope (supplied for example by the aforementioned means 20 for determining the orientation of said telescope), the processing unit 13 actuates an on-board motorized device M making it possible to automatically orient said telescope towards the location of the selected star. The processing unit 13 can thus easily identify Aldebaran A2 in the acquired image of the scene S. The processing unit 13 can in particular actuate the on-board motorized device M so that Aldebaran A2 is in the middle of the image acquired from the scene S, or in another predefined zone of said image. By correlating the identification data of the star selected in the database 18 and the position of this star in the image acquired by the sensor 12, the processing unit 13 easily identifies Aldebaran in said image.
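The pointing loop can be sketched with two hypothetical helpers: `goto_error` is the correction the motorized device M would have to apply, and `centered` tests whether the selected star has reached the predefined zone of the image:

```python
def goto_error(target_altaz, current_altaz):
    """Pointing error to correct, as (d_alt, d_az) in degrees; both
    orientations are (altitude, azimuth) pairs, the target computed from
    the catalog coordinates at time t, the current one from the means 20.
    The azimuth difference is wrapped to (-180, 180]."""
    d_alt = target_altaz[0] - current_altaz[0]
    d_az = ((target_altaz[1] - current_altaz[1] + 180.0) % 360.0) - 180.0
    return d_alt, d_az

def centered(star_xy, image_size, zone_radius_px=20):
    """True once the star sits inside the predefined central zone
    of the acquired image."""
    cx, cy = image_size[0] / 2, image_size[1] / 2
    return (star_xy[0] - cx) ** 2 + (star_xy[1] - cy) ** 2 <= zone_radius_px ** 2
```

The unit would drive the motors until `goto_error` is small and `centered` is true, at which point the star's position in the image is known and its identification is immediate.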

According to one embodiment, the other stars A1 and A3 contained in the image of the scene S are not identified by the processing unit 13. According to a variant embodiment, the other stars A1 and A3 are identified by the processing unit 13 by applying an identification method according to the first embodiment and/or by applying an identification method according to the second embodiment.

It will therefore be understood that, where appropriate, the various aforementioned identification methods can be combined and/or complement each other.
Evaluation of the distances between the identified stars and the telescope – First embodiment

The processing unit 13 is connected to a database 18 in which star records are stored (planets, comets, nebulae, galaxies, deep-sky objects), each star record being associated with a datum giving the distance of said star from the Earth.

This database 18 can be integrated into the telescope 1 or remote, for example hosted in a remote server to which the processing unit 13 is connected. In the latter case, the connection of the processing unit 13 to the database 18 can be made by means of the network interface 17, through a communication network.

For each star identified, the processing unit 13 searches the database 18 for the corresponding record and extracts the corresponding distance datum. This search can be performed via a query request generated and sent by the processing unit 13 to the database 18, and more specifically to its management system.

To return to the aforementioned example, the processing unit 13 having identified the planet Mars (A1), the star Aldebaran (A2) and the Andromeda galaxy (A3), said unit can extract from the database 18 that the distance between the Earth and Mars is about 78×10⁶ km, that the distance between the Earth and Aldebaran is about 65×10¹³ km, and that the distance between the Earth and the Andromeda galaxy is about 2.5×10¹⁹ km.

These distance data make it possible to evaluate the distance LA1, LA2, LA3 between each identified star and the telescope 1. The evaluated distance can correspond to the extracted distance correlated with the terrestrial location data of the telescope 1, obtained for example by means of the location module 19. However, since the terrestrial location data of the telescope 1 are negligible compared to the extracted distances, the evaluated values of the distances LA1, LA2, LA3 correspond to the distance values extracted from the database 18.
Evaluation of the distances between the identified stars and the telescope – Second embodiment

In this embodiment, the processing unit 13 evaluates the distance between each identified star and the device without interrogating the database 18.

The processing unit 13 is here adapted to evaluate the distances according to the nature of the identified stars. For example, the processing unit 13 is suitable for determining that a galaxy is further away than a star, and that a star is further away than a planet of the solar system. There is therefore no precise evaluation of the distances, but a relative evaluation which leads the processing unit 13 to rank the identified stars in increasing (or decreasing) order of distance from the Earth, and hence from the telescope 1 (disregarding the terrestrial location data of said telescope). Returning to the aforementioned example, the processing unit evaluates that LA3 > LA2 > LA1. This ranking method, or relative distance evaluation, can be based on a machine-learning algorithm implemented in the processing unit 13.
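The relative evaluation by nature can be sketched as a simple category ranking, as follows. This is an illustrative sketch only, not the patent's implementation; the category names and rank values are assumptions.

```python
# Distance classes ordered from nearest to farthest, per the heuristic above:
# a planet of the solar system is nearer than a star, a star nearer than a galaxy.
CATEGORY_RANK = {
    "moon": 0,       # satellites of solar-system planets
    "planet": 1,     # solar-system planets
    "star": 2,       # stars of our galaxy
    "nebula": 3,     # galactic deep-sky objects
    "galaxy": 4,     # extragalactic objects
}

def rank_by_nature(identified):
    """Sort (name, category) pairs in increasing order of distance class."""
    return sorted(identified, key=lambda body: CATEGORY_RANK[body[1]])

# Example from the text: Mars (planet), Aldebaran (star), Andromeda (galaxy)
order = rank_by_nature([("Aldebaran", "star"),
                        ("Andromeda", "galaxy"),
                        ("Mars", "planet")])
```

A learned classifier would replace the hand-written `CATEGORY_RANK` table, but the downstream sort is the same.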
Shift of images of identified stars

The evaluated distances LA1, LA2, LA3 are used for the horizontal offset of the images of the identified stars A1, A2, A3, this offset making it possible to obtain a stereoscopic view of the scene S by parallax effect.

Referring to the figure, the processing unit 13 generates, for each identified star A1, A2, A3, two homologous images. There are therefore two images for each identified star, respectively a first image I1A1, I1A2, I1A3 and a second image I2A1, I2A2, I2A3. For each pair of images I1A1-I2A1, I1A2-I2A2 and I1A3-I2A3, the first image and the second image are offset horizontally from each other by a distance DA1, DA2, DA3. This distance is a function of the evaluated distance LA1, LA2, LA3 of each identified star: DAi = f(LAi). This offset is for example between 0 mm and 10 mm.

According to one embodiment, the distance DA1, DA2, DA3 is inversely proportional to the evaluated distance LA1, LA2, LA3. Thus, the further a star is from the telescope 1, the smaller the shift between its two images. Returning to the aforementioned example, we have: DA1 > DA2 > DA3. During stereoscopic vision of the scene S, this makes it possible to accentuate the relief effect on the stars closest to the telescope 1 and to attenuate it on the most distant stars. One thus obtains a rendering of the stars A1, A2, A3 which corresponds to the astronomical reality as seen from the reference frame of the Earth.

According to a variant embodiment, the distance DA1, DA2, DA3 is proportional to the evaluated distance LA1, LA2, LA3. In this case, the closer a star is to the telescope 1, the smaller the shift between its two images. Returning to the aforementioned example, we then have: DA3 > DA2 > DA1. The relief effect is thus accentuated on the stars furthest from the telescope 1 and attenuated on the closest ones.

In the above example, the evaluated distances LA1, LA2, LA3 have very different orders of magnitude: 78×10⁶ km for Mars, 65×10¹³ km for Aldebaran, and 2.5×10¹⁹ km for Andromeda. In this case, the processing unit 13 preferentially applies a logarithmic function to determine the value of the offsets DA1, DA2, DA3 from the evaluated distances LA1, LA2, LA3.

Consider another example where the scene S is centered on Jupiter (A2) and two of its satellites, for example Europa (A1) and Io (A3). The distance between the Earth and Jupiter (A2) is approximately 624×10⁶ km. Io (A3) is located approximately 421,800 km from Jupiter and Europa (A1) approximately 671,100 km. In this case, it can be considered that the evaluated distances LA1, LA2, LA3 have close orders of magnitude. The processing unit 13 then applies a linear function to determine the value of the offsets DA1, DA2, DA3 from the evaluated distances LA1, LA2, LA3.

More generally, the mathematical function making it possible to determine the value of the offsets DAi from the evaluated distances LAi varies according to the standard deviation of said evaluated distances.
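The scheme described above (logarithmic mapping when the distances span very different orders of magnitude, linear otherwise, with the offset decreasing for farther bodies as in the first variant) can be sketched as follows. The spread threshold and the 10 mm cap are illustrative assumptions, not values from the patent.

```python
import math

MAX_OFFSET_MM = 10.0  # example upper bound on the offset, per the text

def offsets(distances_km):
    """Map evaluated distances L_Ai to horizontal offsets D_Ai in mm.

    A logarithmic mapping is used when the distances span very different
    orders of magnitude, a linear one otherwise.  The offset is inversely
    related to distance: the nearest body gets the largest shift.
    """
    logs = [math.log10(d) for d in distances_km]
    spread = max(logs) - min(logs)
    # assumed threshold: >2 orders of magnitude -> logarithmic mapping
    scale = (lambda d: math.log10(d)) if spread > 2.0 else (lambda d: d)
    vals = [scale(d) for d in distances_km]
    lo, hi = min(vals), max(vals)
    if hi == lo:                      # all bodies equally distant
        return [MAX_OFFSET_MM / 2.0] * len(vals)
    # normalise to [0, 1] then invert: nearest body -> largest offset
    return [MAX_OFFSET_MM * (1.0 - (v - lo) / (hi - lo)) for v in vals]

# Mars / Aldebaran / Andromeda span ~13 orders of magnitude -> log mapping
d = offsets([78e6, 65e13, 2.5e19])
```

With these inputs the Mars offset is the largest and the Andromeda offset the smallest, matching the inverse relation of the first embodiment.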

When the distances LA1, LA2, LA3 are evaluated relatively, the processing unit 13 can shift the first image I1Ai and the second image I2Ai of an identified star Ai by a fixed value DAi, which value depends on the rank of said star. Thus, taking the aforementioned example, for the most distant star A3 a first offset value is applied (for example DA3 = 0.1 mm), for the intermediate star A2 a second offset value is applied (for example DA2 = 1 mm), and for the closest star A1 a third offset value is applied (for example DA1 = 3 mm). These offset values can be predetermined and/or configurable.
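For the relative-evaluation case, the fixed offsets by rank can be sketched as a simple lookup. The table structure and function name are assumptions; the millimetre values are the examples given in the text.

```python
# Predetermined offsets per distance rank (index 0 = closest body).
# In practice these values would be configurable.
OFFSETS_BY_RANK_MM = [3.0, 1.0, 0.1]

def fixed_offset(rank_from_closest):
    """Return the predetermined offset D_Ai for a body's distance rank."""
    return OFFSETS_BY_RANK_MM[rank_from_closest]
```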
OP | Posted on 2022-7-14 20:05:53
This post was last edited by WCYue on 2022-7-14 20:31

Description translated from French: Method and system for stereoscopic vision of a celestial observation scene

Image Display - First Embodiment

In the figure, the telescope 1 incorporates a viewing device formed by two eyepieces, respectively a first eyepiece 100D and a second eyepiece 100G, installed on the tube 10. The first eyepiece 100D is the right eyepiece, in front of which the observer places his right eye OD. The second eyepiece 100G is the left eyepiece, in front of which the observer places his left eye OG.

Referring to the figure, each eyepiece 100D, 100G is provided with a display screen, respectively a first screen 101D and a second screen 101G. The processing unit 13 is adapted to display the first images I1A1, I1A2, I1A3 on the first screen 101D and the second images I2A1, I2A2, I2A3 on the second screen 101G. The first images I1A1, I1A2, I1A3 being shifted relative to the second images I2A1, I2A2, I2A3, the observer obtains a stereoscopic vision of the scene S when he positions his eyes OD, OG in front of the eyepieces 100D, 100G.

According to one embodiment, the screens 101D, 101G are flat screens, for example polychrome LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode) screens. The active face of each screen 101D, 101G emerges at the level of an opening or window made in the corresponding eyepiece 100D, 100G so as to be accessible to the eye of the user. "Active face" means the face on which the images are displayed.

According to another embodiment, the two eyepieces 100D, 100G, each equipped with a screen, are integrated into a display device which is remote from the telescope 1, for example a head-mounted display (HMD, also called an immersive headset) of the type described in patent document EP3400474 (GOOGLE). The connection between the head-mounted display and the telescope 1, and more particularly with the processing unit 13, can be made via a wired link (for example by means of a USB cable) or via a wireless link, for example according to a proximity communication protocol such as, by way of non-limiting example, the Bluetooth®, Wifi® or ZigBee® protocol.

Image Display - Second Embodiment

In this embodiment, the first images I1A1, I1A2, I1A3 and the second images I2A1, I2A2, I2A3 are displayed in the form of an anaglyph image. An anaglyph image comprises two colored images that are filtered differently for each of the viewer's eyes. The observer can view the image thus formed through anaglyphic filters placed in front of each eye. Anaglyph filters can be of different colors (for example, chromatically opposite colors). For example, the anaglyph image of the scene S may include a first image filtered in red and a second image filtered in cyan. A red color filter can then be placed in front of the observer's right eye to allow him to view the red-filtered image, and a cyan color filter in front of the left eye to view the cyan-filtered image. Filters of other colors can be used.

Referring to Figures 5 and 6, the processing unit 13 displays on a screen 100 the anaglyph image I of the scene S in the form of two images superimposed and offset from each other. The first image is formed by the first images I1A1, I1A2, I1A3 of the stars A1, A2, A3 and the second image is formed by the second images I2A1, I2A2, I2A3 of said stars. The first images I1A1, I1A2, I1A3 and the second images I2A1, I2A2, I2A3 are shifted as previously described. The first image (containing the first images I1A1, I1A2, I1A3) is generated by the processing unit 13 by applying a first filter to it (for example a cyan filter), and the second image (containing the second images I2A1, I2A2, I2A3) by applying a filter of chromatically opposite color to it (for example a red filter).
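The channel composition behind such an anaglyph can be sketched as follows, using the convention described later in the text (first images filtered in cyan for the right eye, second images in red for the left eye). This is a hedged illustration assuming float RGB arrays in [0, 1]; the function and variable names are not from the patent.

```python
import numpy as np

def make_anaglyph(left_rgb, right_rgb):
    """Combine two horizontally offset views into a red/cyan anaglyph.

    left_rgb:  view intended for the left eye (seen through the red filter),
               contributing the red channel.
    right_rgb: view intended for the right eye (seen through the cyan filter),
               contributing the green and blue channels.
    Both are float arrays of shape (H, W, 3).
    """
    anaglyph = np.zeros_like(left_rgb)
    anaglyph[..., 0] = left_rgb[..., 0]      # red channel from the left view
    anaglyph[..., 1:] = right_rgb[..., 1:]   # green + blue (cyan) from the right view
    return anaglyph
```

Each eye's filter then transmits only the channels of its own view, and the brain fuses the two into a relief image.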

For observation, anaglyphic filters FD, FG are placed in front of the observer's eyes OD, OG. These filters FD, FG can, for example, be carried by glasses. The right filter FD placed in front of the right eye OD is for example a cyan color filter, to allow viewing of the first image filtered in cyan. And the left filter FG placed in front of the left eye OG is for example a red color filter, to allow viewing of the second image filtered in red. The image seen by the right eye OD is thus separated from that seen by the left eye OG. The observer's brain then recombines the two images to obtain a relief view of the scene S.

According to one embodiment, the two images are displayed simultaneously on the screen 100. According to a variant embodiment, the two images are displayed alternately at high frequency on the screen 100. The observer is in this case equipped with glasses (provided with the filters FD, FG) which, at the frequency of image alternation, alternately mask the view of the right eye OD and of the left eye OG according to the image displayed.

According to another embodiment, the anaglyph image I is displayed on a screen of a head-mounted display of the type described in the aforementioned patent document EP3400474. According to another embodiment, each colored image of the anaglyph image I is displayed on a screen of an eyepiece of a head-mounted display with two eyepieces of the type described in the aforementioned patent document EP3400474.

According to another embodiment, illustrated in the figure, the anaglyph image I is displayed on a screen 100 of a mobile terminal T, for example the screen of a smartphone or of a touch tablet. The connection between the processing unit 13 and the screen 100 can be made via a wired link (for example by means of a USB cable) or via a wireless link, for example according to a proximity communication protocol such as, by way of non-limiting example, the Bluetooth®, Wifi® or ZigBee® protocol.
Reference plane

To improve the perception of relief of the stars A1, A2, A3, it is advantageous to define a reference plane. It is thus possible to produce an emergence (pop-out) effect for the identified stars located in front of this reference plane: the observer will perceive these stars as floating in front of, or coming out of, the screen on which the images are displayed. Conversely, the identified stars located behind this reference plane will be perceived as set back, with a depth effect.

Thus, for an identified star Ai located behind the reference plane, the processing unit 13 shifts the first image I1Ai and the second image I2Ai so that said images have a positive parallax. For an identified star Ai located in front of the reference plane, the processing unit 13 shifts the first image I1Ai and the second image I2Ai so that said images have a negative parallax. And for an identified star located in the reference plane, the processing unit 13 does not shift the first image I1Ai relative to the second image I2Ai, so that said images have zero parallax.

According to one embodiment, the reference plane Pref is defined at infinity. Therefore, for each identified star Ai, the processing unit 13 shifts the first image I1Ai and the second image I2Ai so that said images have a negative parallax. The observer then perceives the stars A1, A2, A3 as emerging from the observed scene.

According to another embodiment, illustrated in the figure, the reference plane Pref coincides with an identified star, here the star A3. For this star A3, the processing unit 13 does not shift the first image I1A3 relative to the second image I2A3, so that said images have zero parallax. The observer will perceive this star as located in the plane of the display screen or of the observed scene. For the star(s) A2 located behind this plane Pref, i.e. whose evaluated distance LA2 is greater than the evaluated distance LA3 of the reference star A3, the processing unit 13 shifts the first image I1A2 and the second image I2A2 so that said images have a positive parallax. The observer perceives this or these stars A2 as set back, with a depth effect. And for the star(s) A1 located in front of this plane Pref, i.e. whose evaluated distance LA1 is less than the evaluated distance LA3 of the reference star A3, the processing unit 13 shifts the first image I1A1 and the second image I2A1 so that said images have a negative parallax. The observer perceives this or these stars A1 as emerging from the observed scene.

In this embodiment, the reference plane Pref, and therefore the reference star A3, can be chosen by the observer, for example by means of a dedicated user interface, such as a touch screen integrated in the telescope 1 or a mobile user terminal (a smartphone or a touch tablet) connected to the processing unit 13. According to another embodiment, in the case where the observer selects a record of a star in the database 18, this selected star can be considered as the reference star by the processing unit 13. According to yet another embodiment, the reference star is automatically selected by the processing unit 13, taking as selection criterion the evaluated distances LA1, LA2, LA3. For example, the processing unit 13 can select as reference star the one whose evaluated distance is closest to the average of all the evaluated distances. The processing unit 13 can also select as reference star the one whose evaluated distance is the greatest or, on the contrary, the shortest.

The figure illustrates the images displayed in each of the eyepieces 100D, 100G of the telescope when the reference plane Pref coincides with the star A3. The center of each identified star A1, A2, A3, as they appear on the image of the scene S acquired by the sensor 12, is referenced respectively CA1, CA2, CA3.

For the star A2 located behind the reference plane Pref, the positive parallax can be obtained by shifting the first image I1A2 to the right in the right image (i.e. the image displayed on the screen 101D of the right eyepiece) and shifting the second image I2A2 to the left in the left image (i.e. the image displayed on the screen 101G of the left eyepiece). The binocular focal plane is then behind the display. The shift of the first image I1A2 and of the second image I2A2 with respect to the center CA2 can be DA2/2, so that the total shift between the two images is DA2. It is also possible to obtain positive parallax by shifting only the first image I1A2 to the right by a distance DA2 in the right image, without shifting the second image I2A2 in the left image. Similarly, the positive parallax can be obtained by shifting only the second image I2A2 to the left by a distance DA2 in the left image, without shifting the first image I1A2 in the right image.

For the star A1 located in front of the reference plane Pref, the negative parallax can be obtained by shifting the first image I1A1 to the left in the right image and by shifting the second image I2A1 to the right in the left image, so that the binocular focal plane is then in front of the display. As explained previously, the shift of the first image I1A1 and of the second image I2A1 with respect to the center CA1 can be DA1/2, so that the total shift between the two images is DA1. It is also possible to obtain the negative parallax by shifting only the first image I1A1 to the left by a distance DA1 in the right image, without shifting the second image I2A1 in the left image. Similarly, the negative parallax can be obtained by shifting only the second image I2A1 to the right by a distance DA1 in the left image, without shifting the first image I1A1 in the right image.

For the star A3 located in the reference plane, the first image I1A3 and the second image I2A3 are not shifted from the center CA3, so as to obtain zero parallax. The binocular focal plane is on the display.
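The per-eye shift rules of the last few paragraphs can be condensed into a small sketch: for a body at evaluated distance L_i, with a reference body at distance L_ref and total offset D_i, each homologous image moves by D_i/2 in opposite directions, and the direction encodes the sign of the parallax. The function name and the (right, left) return convention are illustrative assumptions.

```python
def eye_shifts(L_i, L_ref, D_i):
    """Return (shift of right-eye image, shift of left-eye image) in mm.

    Positive values shift to the right, negative to the left.
    Behind the reference plane (L_i > L_ref): positive parallax,
    right image moves right and left image moves left.
    In front of it (L_i < L_ref): negative parallax, the reverse.
    On the plane: zero parallax, no shift.
    """
    if L_i > L_ref:            # behind the reference plane
        return (+D_i / 2.0, -D_i / 2.0)
    if L_i < L_ref:            # in front of the reference plane
        return (-D_i / 2.0, +D_i / 2.0)
    return (0.0, 0.0)          # in the reference plane
```

The single-sided variants described above (shifting only one of the two images by the full distance D_i) produce the same total offset and the same parallax sign.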

The figure illustrates the case where the first images I1A1, I1A2, I1A3 and the second images I2A1, I2A2, I2A3 are displayed on the screen 100 in the form of an anaglyph image I. To illustrate the position of the different images, the reference plane Pref again coincides with the star A3. As previously described, the center of each identified star A1, A2, A3, as they appear on the image of the scene S acquired by the sensor 12, is referenced respectively CA1, CA2, CA3.

For explanatory purposes only, it is assumed that the first images I1A1, I1A2, I1A3 are generated by applying a cyan filter to the image acquired by the sensor 12, and the second images I2A1, I2A2, I2A3 by applying a red filter to said acquired image. The right filter FD placed in front of the right eye is a cyan color filter, to enable the first images I1A1, I1A2, I1A3 to be viewed. And the left filter FG placed in front of the left eye is a red filter, to enable the second images I2A1, I2A2, I2A3 to be viewed. Those skilled in the art will understand that other combinations of chromatically opposite colors can be envisaged.

For the star A2 located behind the reference plane Pref, the positive parallax can be obtained by shifting the first image I1A2 to the right and the second image I2A2 to the left. When the observer views the image I through the filters FD, FG, the binocular focal plane is then behind the screen 100. The shift of the first image I1A2 and of the second image I2A2 with respect to the center CA2 can be DA2/2, so that the total offset between the two images is DA2. It is also possible to obtain the positive parallax by shifting only the first image I1A2 to the right by a distance DA2, without shifting the second image I2A2. Likewise, the positive parallax can be obtained by shifting only the second image I2A2 to the left by a distance DA2, without shifting the first image I1A2.

For the star A1 located in front of the reference plane Pref, the negative parallax can be obtained by shifting the first image I1A1 to the left and the second image I2A1 to the right, so that when the observer views the image I through the filters FD, FG, the binocular focal plane is then in front of the screen 100. As explained previously, the shift of the first image I1A1 and of the second image I2A1 with respect to the center CA1 can be DA1/2, so that the total offset between the two images is DA1. It is also possible to obtain the negative parallax by shifting only the first image I1A1 to the left by a distance DA1, without shifting the second image I2A1. Similarly, the negative parallax can be obtained by shifting only the second image I2A1 to the right by a distance DA1, without shifting the first image I1A1.

For the star A3 located in the reference plane, the first image I1A3 and the second image I2A3 are not shifted from the center CA3 and are merged, so as to obtain zero parallax. The binocular focal plane is on the screen 100.
Other embodiments of the optical system of the device

In FIGS. 1 and 7, the optical system is formed by a mirror 11 placed inside the tube 10 and centered on the optical axis.

In the figure, the optical system is formed by:
- a primary mirror 110 positioned in the tube 10, to reflect the light rays entering said tube,
- a secondary mirror 111 positioned in the tube 10 to reflect the light rays reflected by the primary mirror 110.

The primary 110 and secondary 111 mirrors are positioned on the same X-X optical axis. They are arranged so that the light rays reflected by said mirrors form, in a focal plane Pf, an image of the scene S, which focal plane is perpendicular to the optical axis X-X. The mirrors 110, 111 are arranged so that the focal plane Pf is located in the tube 10, between the two said mirrors. The sensor 12 is then arranged on the X-X optical axis, in the focal plane Pf, for the acquisition of the image of the observation scene.

The primary mirror 110 is preferably a concave parabolic mirror having a low focal ratio (preferably less than 5). This type of mirror eliminates spherical aberrations. The diameter of the primary mirror 110 corresponds substantially to the diameter of the tube 10.

Compared to the device of FIGS. 1 and 7, with equivalent diameter and focal length of the primary mirror, bringing the focal plane Pf between the two mirrors 110, 111 makes it possible to reduce the focal length of the optical system and the length of the telescope 1. The magnification of the observed stars is lower, but with the benefit of a wider field of view and increased image brightness. Faint objects such as nebulae or galaxies can thus be observed with better image quality. The image quality of luminous bodies such as planets or stars remains very good.

The display of images to obtain stereoscopic vision is carried out according to one of the display modes described previously (display in two eyepieces, on a single screen, etc.).

The arrangement of the various elements and/or means and/or steps of the invention, in the embodiments described above, should not be understood as requiring such an arrangement in all implementations. Other variants may be provided, in particular:
- The optical system of the device does not necessarily consist of one or more mirrors, but may include one or more lenses in addition to or in substitution for said mirrors.
- Other types of sensors 12 can be envisaged, for example a sensor of the CCD, CMOS or Foveon type, color or black-and-white.

Further, one or more features disclosed only in one embodiment may be combined with one or more other features disclosed only in another embodiment. Similarly, one or more features disclosed only in one embodiment may be generalized to other embodiments, even if such feature or features are described only in combination with other features.
OP | Posted on 2022-7-14 20:07:37
Description translated from French: Method and system for stereoscopic vision of a celestial observation scene

Claims

Method for stereoscopic vision of a celestial observation scene (S), the method comprising the steps consisting in:
- acquiring an image of the observation scene (S) by means of an optical vision device (1),
- generate two homologous images of one or more stars (A1, A2, A3) contained in the observation scene (S), respectively a first image (I 1A1 , I 1A2 , I 1A3 ) and a second image (I 2A1 , I 2A2 , I 2A3 ), said first image and said second image being offset from each other,
- showing the first images (I 1A1 , I 1A2 , I 1A3 ) and the second images (I 2A1 , I 2A2 , I 2A3 ) so as to obtain a stereoscopic view of the observation scene (S),
this characterized by the fact that the method comprises the following steps:
- identify one or more stars (A1, A2, A3) in the acquired image of the observation scene (S),
- evaluate the distance (L A1 , L A2 , L A3 ) between each identified star (A1, A2, A3) and the device (1),
- for one or more identified stars (A1, A2, A3), horizontally shift the first image (I 1A1 , I 1A2 , I 1A3 ) and the second image (I 2A1 , I 2A2 , I 2A3 ) by a distance (D A1 , D A2 , D A3 ) which is a function of the evaluated distance (L A1 , L A2 , L A3 ) of said star. Method according to claim 1, in which the identification of a star in the image of the observation scene (S) is carried out by means of a computer application for recognition of objects. A method according to claim 1, comprising the steps of:
- recording in a database (18) star records, each star record being selectable and associated with celestial coordinates of said star in real time and with identification data,
- select a star record in the database (18),
- actuating an orientation device (M) of the apparatus (1) so that said apparatus is oriented towards said star corresponding to the selected recording, according to the celestial coordinates associated with said recording. A method according to claim 3, comprising the steps of:
- activating the orientation device (M) so that the body corresponding to the selected recording is in a predefined zone of the acquired image of the observation scene (S),
- identifying said star by correlating the identification data associated with the selected recording and the position of this star in the acquired image of the observation scene (S). Method according to one of Claims 3 or 4, in which the orientation of the device (1) is also carried out by correlating terrestrial location data of said device (1) and orientation data of said device. Method according to one of Claims 3 to 5, comprising a step consisting in identifying one or more other stars in the image of the observation scene (S), which identification is carried out by means of a computer application for recognition of 'objects. Method according to one of Claims 1 to 6, comprising the steps consisting in:
- recording in a database (18) star recordings, each star recording being associated with distance data of said star relative to the Earth,
- for each identified star (A1, A2, A3), search the database (18) for the corresponding record and extract the distance data,
- evaluating the distance (L A1 , L A2 , L A3 ) between each identified star (A1, A2, A3) and the device (1), according to the distance datum extracted. Method according to one of Claims 1 to 7, comprising a step consisting in applying a mathematical function to determine the value of the offset distances of the images (D A1 , D A2 , D A3 ) from the evaluated distances (L A1 , L A2 , L A3 ), which function varies according to the standard deviation of said evaluated distances. Method according to one of Claims 1 to 6, comprising a step consisting in evaluating in a relative manner the distance (L A1 , L A2 , L A3 ) between each identified star (A1, A2, A3) and the device (1) so as to classify said identified stars in increasing or decreasing order of distance from said device (1), which relative evaluation is carried out according to the nature of said identified stars. Method according to claim 9, consisting in shifting the first image (I 1A1 , I 1A2 , I 1A3 ) and the second image (I 2A1 , I 2A2 , I 2A3 ) of an identified star by a fixed value (D A1 , D A2 , D A3 ), which value depends on the classification of said star. Method according to one of the preceding claims, comprising the steps consisting in:
- define a reference plane (Pref),
- for an identified body (A2) located behind the reference plane (Pref), shift the first image (I 1A2 ) and the second image (I 2A2 ) so that said images have a positive parallax,
- for an identified star (A1) located in front of the reference plane, shift the first image (I 1A1 ) and the second image (I 2A1 ) so that said images have a negative parallax,
- for an identified star (A3) located in the reference plane (Pref), not shifting the first image (I 1A3 ) and the second image (I 2A3 ) so that said images have zero parallax. Method according to claim 11, consisting in defining the reference plane (Pref) at infinity. Method according to claim 11, consisting in making the reference plane (Pref) coincide with an identified star (A2). Method according to one of Claims 1 to 13, comprising the step consisting in displaying the first images (I 1A1 , I 1A2 , I 1A3 ) on a first screen (101D) installed in a first eyepiece (100D) of a device viewing device integrated or connected to the device (1) and displaying the second images (I 2A1 , I 2A2 , I 2A3 ) on a second screen (101G) installed in a second eyepiece (100G) of said viewing device, so as to obtaining a stereoscopic view of the observation scene when an observer positions his eyes (OD, OG) in front of each of said eyepieces. Process according to one of Claims 1 to 13, comprising the steps consisting in:
- displaying on a screen (100) the first images (I 1A1 , I 1A2 , I 1A3 ) and the second images (I 2A1 , I 2A2 , I 2A3 ) in the form of an anaglyph image (I),
- placing anaglyphic filters (FD, FG) in front of the eyes (OD, OG) of the observer so that said observer has a stereoscopic vision of the observation scene when he looks at the screen (100). System for stereoscopic vision of a celestial observation scene (S), the system comprising:
- an optical vision device (1) adapted to acquire an image of the observation scene (S),
- a means (13) for generating two homologous images of one or more celestial bodies (A1, A2, A3) contained in the observation scene (S), respectively a first image (I1A1, I1A2, I1A3) and a second image (I2A1, I2A2, I2A3), said first image and said second image being offset from each other,
- a means for displaying the first images (I1A1, I1A2, I1A3) and the second images (I2A1, I2A2, I2A3) so as to obtain a stereoscopic view of the observation scene (S),
characterized in that the apparatus (1) comprises a processing unit (13) suitable for:
- identifying one or more stars (A1, A2, A3) in the acquired image of the observation scene (S),
- evaluating the distance (LA1, LA2, LA3) between each identified star (A1, A2, A3) and the apparatus (1),
- for one or more identified stars (A1, A2, A3), horizontally shifting the first image (I1A1, I1A2, I1A3) and the second image (I2A1, I2A2, I2A3) by a distance (DA1, DA2, DA3) which is a function of the evaluated distance (LA1, LA2, LA3) of said star.

System according to claim 16, wherein:
- the apparatus (1) incorporates or is connected to a viewing device comprising two eyepieces, respectively a first eyepiece (100D) and a second eyepiece (100G), each eyepiece being provided with a viewing screen (101D, 101G),
- the processing unit (13) is adapted to display the first images (I1A1, I1A2, I1A3) on the screen (101D) of the first eyepiece (100D) and to display the second images (I2A1, I2A2, I2A3) on the screen (101G) of the second eyepiece (100G).

System according to Claim 16, in which the apparatus (1) comprises a screen (100) on which the first images (I1A1, I1A2, I1A3) and the second images (I2A1, I2A2, I2A3) are displayed as an anaglyph image (I).

System according to one of Claims 16 to 18, in which the apparatus (1) is a telescope.
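The shifting rule in these claims (negative parallax for a star in front of the reference plane, zero parallax for a star in the plane, positive parallax for a star behind it) can be sketched as a simple disparity function. This is an illustrative model only: the function name, the inverse-distance law, and the scale factor are assumptions for the sketch, not taken from the patent.

```python
def horizontal_shift(star_distance: float, ref_distance: float,
                     scale: float = 10.0) -> float:
    """Hypothetical horizontal offset between the two homologous
    images of a star, as a function of its evaluated distance.

    Negative result -> negative parallax (star appears in front of
    the reference plane); zero -> star lies in the reference plane;
    positive -> positive parallax (star appears behind it).
    """
    if ref_distance == float("inf"):
        # Reference plane at infinity (claim 12): every star at a
        # finite distance is in front of it, so all shifts are negative.
        return -scale / star_distance
    # Disparity proportional to the difference of inverse distances,
    # which vanishes when the star lies exactly in the reference plane.
    return scale * (1.0 / ref_distance - 1.0 / star_distance)

# A1 in front of the reference plane, A3 in it, A2 behind it
# (arbitrary distances, in the same units as the reference plane).
ref = 100.0
print(horizontal_shift(50.0, ref))   # negative parallax
print(horizontal_shift(100.0, ref))  # zero parallax
print(horizontal_shift(200.0, ref))  # positive parallax
```

The sign convention matches the claims: making the reference plane coincide with an identified star (claim 13) simply zeroes that star's shift, while every nearer star pops out of the screen plane and every farther star recedes behind it.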

OP | Posted on 2022-10-9 12:01:36 | Show all posts
Utilizing a Global Network of Telescopes to Update the Ephemeris for the Highly Eccentric Planet HD 80606 b and to Ensure the Efficient Scheduling of JWST

Kyle A. Pearson et al. Published in The Astronomical Journal, 164:178 (15pp), 2022 November

https://doi.org/10.3847/1538-3881/ac8dee
OP | Posted on 2022-10-10 01:47:14 | Show all posts
This post was last edited by WCYue on 2022-10-10 02:11

Method for producing a digital image, associated computer program product and optical system

https://patentimages.storage.goo ... US20190196173A1.pdf

Abstract

The invention relates to an optical system for restoring a natural image combined with a digital image, in order to characterise and highlight the objects represented on the natural image. The optical system includes an objective lens, an eyepiece, a semi-reflective plate, a processing unit, capturing means and restoring means. The invention also relates to a method for producing such a digital image.


Worldwide applications
2016: FR · 2017: WO, EP, US · 2021: US, US

Application US16/322,226 events
- 2016-08-05: Priority to FR1657596
- 2016-08-05: Priority to FR1657596A
- 2017-08-04: Application filed by Unistellar
- 2017-08-04: Priority to PCT/FR2017/052203
- 2019-01-31: Assigned to UNISTELLAR
- 2019-06-27: Publication of US20190196173A1
- 2021-11-23: Application granted
- 2021-11-23: Publication of US11181729B2
- Status: Active
- 2038-02-02: Adjusted expiration