Cool Rivian Patent/TradeMark Info

kanundrum

Well-Known Member
Joined
May 2, 2020
Threads
217
Messages
3,976
Reaction score
12,107
Location
Washington, DC
Vehicles
Giulia QV, R1S (S00N)
Occupation
IT
Clubs
 
All public information. I also have change alerts set up for their website in case any major changes occur ahaha. I may be going too deep!

[Attached images: screenshots of Rivian patent/trademark filings]

 

skyote

Well-Known Member
Joined
Mar 12, 2019
Threads
55
Messages
2,725
Reaction score
5,647
Location
Austin, TX
Vehicles
Jeeps, 2500HD Duramax, R1S Preorder (Dec 2018)
I need more info on Rivian Elevation!
 
OP
kanundrum

Well-Known Member
Joined
May 2, 2020
Threads
217
Messages
3,976
Reaction score
12,107
Location
Washington, DC
Vehicles
Giulia QV, R1S (S00N)
Occupation
IT
Clubs
 
skyote said: "I need more info on Rivian Elevation!"
It goes super deep and gets me excited. It's not related to sound, but it covers infotainment anyway. It's long, so if you want to see it, click below.



DETAILED DESCRIPTION
In accordance with various embodiments, mechanisms (which can include methods, systems, and media) for controlling access to vehicle features are provided.

In some embodiments, the mechanisms described herein can determine whether an autonomous or semi-autonomous feature of a vehicle can be activated by a driver of the vehicle. In some embodiments, an autonomous or semi-autonomous feature can be any suitable feature that automates steering of a vehicle, acceleration/deceleration of a vehicle, and/or any other suitable function of a vehicle. For example, in some embodiments, an autonomous or semi-autonomous feature of a vehicle can control steering of the vehicle if the vehicle begins to drift out of a lane, can cause the vehicle to brake in response to detecting an object in front of the vehicle, can adjust a speed of the vehicle while the vehicle is utilizing cruise control, can park the vehicle, can control steering and/or acceleration/deceleration of the vehicle while driving in traffic, and/or can perform any other suitable autonomous or semi-autonomous function.

In some embodiments, in response to receiving an indication from a driver of a vehicle that the driver wants to activate a particular autonomous or semi-autonomous feature, the mechanisms can determine whether the driver is qualified to use the indicated feature. In some embodiments, in response to determining that the driver is not qualified to activate the indicated feature, the mechanisms can cause the feature to be inhibited or to remain inactivated. Conversely, in some embodiments, in response to determining that the driver is qualified to activate the indicated feature, the mechanisms can cause the feature to be activated.

In some embodiments, the mechanisms can determine whether a driver of a vehicle is qualified to activate a particular autonomous or semi-autonomous feature using any suitable technique or combination of techniques. For example, in some embodiments, the mechanisms can determine whether the driver is included in a group of drivers who are qualified to activate the feature. As a more particular example, in some embodiments, the mechanisms can determine whether a driver is included in the group of drivers who are qualified to activate the feature based on any suitable information or techniques, such as by determining whether an identifier associated with the driver is included in a group of identifiers corresponding to drivers qualified to activate the feature. As a specific example, in some embodiments, the mechanisms can determine whether an image capturing a face of a driver is included in a group of images of faces of qualified drivers. As another specific example, in some embodiments, the mechanisms can determine identifying information corresponding to the driver based on a key fob used to access the vehicle, and can determine whether a driver associated with the identifying information is qualified to activate the feature. As another example, in some embodiments, the mechanisms can administer a test related to the autonomous or semi-autonomous feature to a driver. As a more particular example, in some embodiments, the mechanisms can present one or more user interfaces that include questions relevant to the feature (e.g., what road conditions the feature can be used on while driving, what weather conditions are required for sensors of the vehicle to provide accurate information while using the feature, and/or any other suitable questions), and can determine that the driver is qualified to activate the feature if the driver answers more than a predetermined percentage (e.g., more than 70%, and/or any other suitable percentage) of the questions correctly. As yet another example, in some embodiments, the mechanisms can determine whether a driver has been presented with particular information related to the feature.
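
Taken together, the qualification check above boils down to a set-membership lookup plus an optional quiz gate. Below is a minimal Python sketch of that logic; it is only one reading of the patent text, and every name in it (`QUALIFIED_IDS`, `PASS_THRESHOLD`, `is_qualified`) is hypothetical rather than anything Rivian specifies.

```python
# Hypothetical sketch of the qualification check described above.
# QUALIFIED_IDS maps a feature to the identifiers of drivers cleared to
# activate it; PASS_THRESHOLD mirrors "more than a predetermined
# percentage (e.g., more than 70%)".
from typing import Optional

QUALIFIED_IDS = {
    "adaptive_cruise": {"driver_42", "driver_17"},
}
PASS_THRESHOLD = 0.70

def is_qualified(driver_id: str, feature: str,
                 quiz_score: Optional[float] = None) -> bool:
    """Return True if the driver may activate the feature."""
    # Path 1: the driver is already in the group of qualified drivers.
    if driver_id in QUALIFIED_IDS.get(feature, set()):
        return True
    # Path 2: the driver just took the in-vehicle test and beat the threshold.
    if quiz_score is not None and quiz_score > PASS_THRESHOLD:
        QUALIFIED_IDS.setdefault(feature, set()).add(driver_id)  # remember it
        return True
    return False
```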

In some embodiments, the mechanisms can additionally present information related to an autonomous or semi-autonomous feature of a vehicle. In some embodiments, the information can include information to be presented while a vehicle is stationary and/or information that is to be presented while a vehicle is in motion and while a feature is activated (that is, while the feature is being used in a restricted mode), as described in more detail in connection with FIG. 2. In some embodiments, information presented while a vehicle is stationary can include information such as information related to settings associated with a feature, information indicating techniques to activate a feature, information indicating warnings associated with a feature, and/or any other suitable information. In some embodiments, information presented while a vehicle is in motion and while a feature is activated can include information such as indications of objects detected by sensors of the vehicle in connection with use of the feature, demonstrations of consequences of changing settings associated with the feature, and/or any other suitable information. In some embodiments, the mechanisms can cause an indication that a driver has been presented with all information related to a particular feature to be stored. In some such embodiments, the mechanisms can then allow the feature to be used in an operational mode, as described above.

Turning to FIG. 1, an example 100 of a process for controlling access to vehicle features is shown in accordance with some embodiments of the disclosed subject matter. In some embodiments, blocks of process 100 can be implemented on a computer associated with a vehicle, as shown in and described below in connection with FIG. 3.

Process 100 can begin by identifying a driver of a vehicle at 102. In some embodiments, an identity of the driver can be determined in any suitable manner and based on any suitable information. For example, in some embodiments, the driver can log into a user account corresponding to the driver, wherein the user account is associated with a manufacturer of the vehicle, an entity providing the vehicle (e.g., a rental car company, etc.), and/or any other suitable entity. As another example, in some embodiments an identity of the driver can be determined based on information associated with a key fob used to unlock and/or start the vehicle. As a more particular example, in some embodiments, an identity of the driver can be determined based on an identifier used by the key fob to unlock and/or start the vehicle (e.g., a personal identification number, or PIN, and/or any other suitable identifier). As yet another example, in some embodiments, an identity of the driver can be determined by capturing an image of a face of the driver and using any suitable image recognition techniques or facial recognition techniques to identify the driver. As still another example, in some embodiments, an identity of the driver can be determined using any other suitable biometric information (e.g., a fingerprint, and/or any other suitable biometric information). As still another example, in some embodiments, an identity of the driver can be determined using information from the driver's phone or other mobile device (e.g., via a BLUETOOTH connection between the phone and the vehicle, and/or in any other suitable manner).
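
Block 102 is essentially an identity-resolution step that falls back across several signals. Here is a small sketch of one possible priority order, with made-up lookup tables standing in for the account service, key-fob registry, face recognizer, and paired-phone list; none of these names come from the patent.

```python
from typing import Optional

# Hypothetical lookup tables standing in for the key-fob registry and the
# list of phones paired over BLUETOOTH; a real system would back these
# with the account service and a face-recognition model.
FOB_TO_DRIVER = {"fob_0xA1": "driver_42"}
PHONE_TO_DRIVER = {"aa:bb:cc:dd:ee:ff": "driver_17"}

def identify_driver(fob_id: Optional[str] = None,
                    face_match: Optional[str] = None,
                    bt_address: Optional[str] = None) -> Optional[str]:
    """Resolve a driver identity from whichever signals are available
    (an explicit account login would take priority over all of these)."""
    if fob_id and fob_id in FOB_TO_DRIVER:
        return FOB_TO_DRIVER[fob_id]
    if face_match:                        # output of a facial-recognition lookup
        return face_match
    if bt_address and bt_address in PHONE_TO_DRIVER:
        return PHONE_TO_DRIVER[bt_address]
    return None                           # unknown driver
```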

At 104, process 100 can receive an indication that the driver of the vehicle wants to activate an autonomous or semi-autonomous feature of the vehicle. For example, in some embodiments, an autonomous or semi-autonomous feature of the vehicle can include any suitable feature where the vehicle automatically adjusts speed and/or steering of the vehicle. As a more particular example, in some embodiments, an autonomous or semi-autonomous feature of the vehicle can include adaptive cruise control, lane keeping assistance, automatic steering and speed control in particular conditions (e.g., in traffic driving below a predetermined speed limit, while driving on a highway, and/or any other suitable conditions), and/or one or more other suitable automated or semi-automated features. In some embodiments, process 100 can receive the indication that the driver of the vehicle wants to activate a particular feature in any suitable manner. For example, in some embodiments, process 100 can receive the indication by determining that a particular button (e.g., located on a steering wheel of the vehicle, located on a dashboard of the vehicle, and/or any other suitable button) has been pressed. As another example, in some embodiments, process 100 can determine that a selectable input to activate the feature has been selected, for example, on a user interface presented on a display of the vehicle (e.g., a dashboard display, and/or any other suitable display).

At 106, process 100 can determine whether the driver of the vehicle is qualified to activate the feature indicated at block 104. Additionally or alternatively, in some embodiments, process 100 can determine whether the feature has been updated and/or changed since a driver was last indicated as being qualified to activate the feature. In some embodiments, process 100 can determine whether the driver of the vehicle is qualified to access the feature indicated at block 104 using any suitable technique or combination of techniques. For example, in some embodiments, process 100 can determine whether identifying information corresponding to the driver is included in a database or list of drivers qualified to activate a particular feature. As a more particular example, in some embodiments, process 100 can use identifying information such as an image of a face of the driver, an identifier associated with a key fob used to access the vehicle, identifying information related to a BLUETOOTH connection between a mobile device of the driver and the vehicle, and/or any other suitable identifying information as well as an indication of the feature as inputs to a database that indicates whether the driver corresponding to the identifying information is qualified to activate the indicated feature.

As another example, in some embodiments, process 100 can determine whether the driver of the vehicle is qualified to access the feature indicated at block 104 by administering a test to the driver prior to allowing the driver to activate the feature. As a more particular example, in some embodiments, process 100 can present one or more questions (e.g., via user interfaces presented on a display of the vehicle) that are related to the feature. As a specific example, in some embodiments, process 100 can present a question that asks about road conditions (e.g., whether the feature is to be used only on a highway, whether the feature is to be used only while driving below a predetermined speed limit, and/or any other suitable road conditions) the feature is to be used with. As another specific example, in some embodiments, process 100 can present a question that asks about weather conditions (e.g., whether the feature can be used in light rain, whether the feature can be used in heavy fog, whether the feature can only be used on a clear day, and/or any other suitable weather conditions) the feature is to be used during. As yet another specific example, in some embodiments, process 100 can present a question that asks about a level of attention required of the driver while using the feature (e.g., whether the driver can safely take their hands off the steering wheel, whether the driver must be prepared to respond to traffic lights, and/or any other suitable question related to a level of attention). In some embodiments, answers to the questions can be received in any suitable manner, such as via a selection of one answer from a group of potential answers, via a spoken answer received by a microphone associated with the vehicle, and/or in any other suitable manner. In some embodiments, process 100 can determine that the driver of the vehicle is qualified to activate the feature if the driver responds correctly to more than a predetermined number or percentage of questions. Conversely, in some embodiments, process 100 can determine that the driver of the vehicle is not qualified to activate the feature if the driver responds correctly to fewer than a predetermined number or percentage of questions. In some such embodiments, process 100 can present a message indicating available information related to the feature, as described in more detail below in connection with FIG. 2.
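
The test at block 106 is just a scored questionnaire with a pass threshold. A toy version follows, with invented questions and the 70% cutoff borrowed from the example percentage above:

```python
# Hypothetical quiz gate for block 106: present questions, score the
# answers, and pass the driver only above the predetermined percentage.
# Questions and the 70% cutoff are illustrative.
QUESTIONS = [
    ("Can this feature be used in heavy fog? (yes/no)", "no"),
    ("Must you stay ready to take over steering? (yes/no)", "yes"),
    ("Does the vehicle respond to traffic lights for you? (yes/no)", "no"),
]
PASS_FRACTION = 0.70

def administer_test(answers: list[str]) -> bool:
    """answers[i] is the driver's response to QUESTIONS[i]."""
    correct = sum(1 for (_, expected), given in zip(QUESTIONS, answers)
                  if given.strip().lower() == expected)
    return correct / len(QUESTIONS) > PASS_FRACTION

# administer_test(["no", "yes", "no"]) -> True (3 of 3 correct)
```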

As yet another example, in some embodiments, process 100 can determine whether the driver of the vehicle is qualified to access the feature indicated at block 104 by determining whether the driver has previously been presented with information related to the feature. In some such embodiments, process 100 can determine that the driver is qualified to access the feature if the driver has been presented with all of the available information (or more than a particular percentage of the available information) related to the feature. Conversely, in some embodiments, process 100 can determine that the driver is not qualified to access the feature if the driver has not been presented with all of the information related to the feature (or has been presented with less than a particular percentage of the available information). Additionally or alternatively, in some embodiments, process 100 can determine that the driver is not qualified to activate the feature if the feature has been changed and/or updated in any suitable manner since the driver was previously presented with the information. In some embodiments, process 100 can determine whether the driver has been presented with information related to the feature using any suitable technique or combination of techniques. For example, in some embodiments, process 100 can query a database using the identifying information corresponding to the driver (e.g., as described above in connection with block 102) and the indication of the feature selected at block 104. In some such embodiments, the database can store indications of drivers who have previously been presented with information related to different autonomous or semi-autonomous features.
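
This presented-with-information check can be pictured as a keyed record that also tracks a feature version, so that an update to the feature invalidates old training. A hypothetical sketch; the record layout is an assumption, not the patent's schema:

```python
# Hypothetical record of what each driver has been shown, keyed by
# (driver_id, feature). 'version' captures the "feature was updated since
# the driver last saw the material" case; the layout is an assumption.
PRESENTED = {
    ("driver_42", "adaptive_cruise"): {"fraction_seen": 1.0, "version": 2},
}
FEATURE_VERSION = {"adaptive_cruise": 2}
REQUIRED_FRACTION = 1.0  # or any lower "particular percentage"

def info_requirement_met(driver_id: str, feature: str) -> bool:
    record = PRESENTED.get((driver_id, feature))
    if record is None:
        return False   # driver has never been shown material for this feature
    if record["version"] < FEATURE_VERSION.get(feature, 0):
        return False   # feature changed since the driver was last qualified
    return record["fraction_seen"] >= REQUIRED_FRACTION
```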

If, at 106, process 100 determines that the driver is not qualified to access the feature (“no” at 106), process 100 can proceed to block 108 and can inhibit activation of the feature. For example, in an instance where the feature indicated at block 104 uses automated steering of the vehicle, process 100 can inhibit automation of steering, thereby requiring that the driver of the vehicle maintain control of steering of the vehicle. As another example, in an instance where the feature indicated at block 104 uses automated acceleration/deceleration of the vehicle, process 100 can inhibit automation of acceleration/deceleration, thereby requiring that the driver of the vehicle maintain control of acceleration/deceleration of the vehicle.

Process 100 can proceed to block 110 and can present a message indicating that the feature has been blocked. In some embodiments, the message can indicate any suitable information. For example, in some embodiments, the message can indicate that the driver has not been recognized as a driver qualified to use the feature based on the identifying information. As another example, in some embodiments, the message can indicate that the driver did not pass a test related to the feature (e.g., as described above in connection with block 106). As yet another example, in some embodiments, the message can indicate that the feature has been blocked because the driver has not yet been presented with all available information (or has not yet been presented with more than a particular percentage of available information) related to the feature. As still another example, in some embodiments, the message can indicate that aspects of the feature have changed and/or have been updated in any suitable manner, and that the feature has been blocked because the driver has not been indicated as qualified to use the feature since the feature was updated. As a more particular example, in some embodiments the message can indicate a particular aspect of the feature that has been changed (e.g., that particular settings have changed, that particular warning tones or indications have changed, and/or any other suitable changes). In some embodiments, the message can additionally include information related to use of the feature. For example, in some embodiments, the message can include a selectable input that, when selected, causes an information presentation related to the feature to be initiated. More detailed information and techniques related to presenting information related to a feature of a vehicle are shown in and described below in connection with FIG. 2.
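
Block 110 then maps the reason for the block to a driver-facing message, optionally with a way into the FIG. 2 walkthrough. One possible shape for that mapping (all strings invented for illustration):

```python
# Hypothetical mapping from the reason a feature was inhibited (block 108)
# to the driver-facing message of block 110. All strings are invented.
BLOCK_MESSAGES = {
    "not_recognized": "You are not recognized as a qualified driver for this feature.",
    "failed_test": "You did not pass the qualification test for this feature.",
    "info_incomplete": "You have not yet viewed all information about this feature.",
    "feature_updated": "This feature has changed since you last qualified.",
}

def blocked_message(reason: str) -> str:
    base = BLOCK_MESSAGES.get(reason, "This feature is currently unavailable.")
    # The description says the message can also offer a way to start the
    # information presentation of FIG. 2.
    return base + " Select 'Learn more' to review the feature walkthrough."
```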

Referring back to block 106, if, at 106, process 100 determines that the driver is qualified to activate the feature indicated at block 104 (“yes” at 106), process 100 can proceed to block 112 and can activate the feature. For example, in an instance where the feature indicated at block 104 uses automated steering of the vehicle, process 100 can cause a vehicle computer associated with the vehicle to begin controlling steering of the vehicle. As another example, in an instance where the feature indicated at block 104 uses automated acceleration/deceleration of the vehicle, process 100 can cause a vehicle computer associated with the vehicle to begin controlling acceleration/deceleration of the vehicle.

Note that, in some embodiments, process 100 can determine whether the feature indicated at block 104 is a safety feature that is to always be activated. For example, in some embodiments, process 100 can determine whether the feature relates to collision detection, automatic emergency braking, and/or any other suitable safety features. In some such embodiments, process 100 can determine that the feature is to be activated regardless of whether the driver has been determined to be qualified to use the feature or not.
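
Putting blocks 104 through 112 together with that safety-feature carve-out gives a gate roughly like the following, reusing the hypothetical `is_qualified`, `info_requirement_met`, and `Optional` pieces from the sketches above:

```python
# Hypothetical end-to-end gate for process 100, reusing is_qualified()
# and info_requirement_met() from the sketches above. SAFETY_FEATURES
# reflects the note that e.g. automatic emergency braking is always on.
SAFETY_FEATURES = {"collision_detection", "automatic_emergency_braking"}

def handle_activation_request(driver_id: Optional[str], feature: str) -> str:
    if feature in SAFETY_FEATURES:
        return "activated"    # safety features bypass the qualification check
    if (driver_id is not None
            and is_qualified(driver_id, feature)
            and info_requirement_met(driver_id, feature)):
        return "activated"    # block 112: hand control to the vehicle computer
    return "inhibited"        # blocks 108/110: block the feature and explain why
```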

Turning to FIG. 2, an example 200 of a process for presenting information to a driver of a vehicle related to an autonomous and/or semi-autonomous feature of the vehicle is shown in accordance with some embodiments of the disclosed subject matter. In some embodiments, blocks of process 200 can be executed on a vehicle computer associated with the vehicle.

Process 200 can begin by identifying a driver of a vehicle at 202. As described above in connection with block 102 of FIG. 1, process 200 can identify the driver of the vehicle using any suitable technique or combination of techniques. For example, in some embodiments, the driver can log into a user account corresponding to the driver, wherein the user account is associated with a manufacturer of the vehicle, an entity providing the vehicle (e.g., a rental car company, etc.), and/or any other suitable entity. As another example, in some embodiments an identity of the driver can be determined based on information associated with a key fob used to unlock and/or start the vehicle. As a more particular example, in some embodiments, an identity of the driver can be determined based on an identifier used by the key fob to unlock and/or start the vehicle (e.g., a personal identification number, or PIN, and/or any other suitable identifier).

At 204, process 200 can identify an autonomous and/or semi-autonomous feature of the vehicle. For example, as described above, in some embodiments, the feature can be any suitable feature that uses automated control of steering of the vehicle and/or acceleration/deceleration of the vehicle. As a more particular example, in some embodiments, the feature can be a feature that relates to driving the vehicle in particular conditions (e.g., in traffic below a predetermined speed limit, on a highway, and/or in any other suitable condition), lane keeping assistance, adaptive cruise control, and/or any other suitable feature. In some embodiments, process 200 can identify a feature for which the driver of the vehicle has not yet been presented with information and/or has not yet been presented with all available information (or more than a particular percentage of available information). Additionally or alternatively, in some embodiments, process 200 can identify a feature that has been changed and/or updated since the driver previously was presented with information related to the feature.

In some embodiments, process 200 can identify the feature using any suitable technique or combination of techniques. For example, in some embodiments, process 200 can transmit a query to a server (e.g., server 302 as shown in and described below in connection with FIG. 3) that includes identifying information corresponding to the driver, and can receive a response to the query from the server that indicates a feature for which the driver has not been presented with information, has not yet been presented with all available information (or more than a particular percentage of available information), and/or has been updated since the driver was last presented with information.
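
That query could be as simple as a POST to the server of FIG. 3 carrying the driver's identifier. The endpoint URL and the response shape below are pure assumptions for illustration, not the patent's API:

```python
import json
import urllib.request

# Hypothetical client-side query for block 204: ask the server which
# features this driver still needs to be walked through. The URL and the
# {"features": [...]} response shape are assumptions.
SERVER_URL = "https://vehicle-backend.example/api/pending-features"

def features_needing_info(driver_id: str) -> list[str]:
    payload = json.dumps({"driver_id": driver_id}).encode()
    req = urllib.request.Request(SERVER_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("features", [])
```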

Process 200 can present at least one user interface that presents information related to the identified feature at 206. In some embodiments, process 200 can present the at least one user interface while the vehicle is stationary. In some embodiments, the user interface can include any suitable information, such as explanations of settings related to the feature, how to change settings related to the feature, an explanation of how to activate the feature, an explanation of how to deactivate the feature, an indication of any suitable warnings about the feature, information indicating objects the vehicle will not detect and/or respond to while the feature is in use (e.g., traffic lights, vehicles merging into a lane, and/or any other suitable objects), an illustration of visuals that indicate where information will appear on a display of the vehicle while the feature is in use, and/or any other suitable information. FIGS. 5A and 5B (described below) show examples of user interfaces that can be presented at block 206. Note that FIGS. 5A and 5B are shown and described below merely as examples, and, in some embodiments, any suitable user interfaces that present information related to the identified feature can be presented at block 206.

Turning to FIG. 5A, an example 500 of a user interface for presenting information about settings related to the feature is shown in accordance with some embodiments of the disclosed subject matter. As illustrated, in some embodiments, user interface 500 can provide information about settings related to the feature. In some embodiments, user interface 500 can include a menu 502, a feature section 504, and a group of settings 506. In some embodiments, menu 502 can be a menu for accessing any suitable settings, such as settings for one or more autonomous or semi-autonomous features associated with the vehicle, and/or any other suitable settings associated with the vehicle. In some embodiments, feature section 504 can be a menu item corresponding to a particular autonomous or semi-autonomous feature associated with the vehicle. In some embodiments, group of settings 506 can include any suitable settings relevant to the feature indicated in feature section 504. For example, as shown in FIG. 5A, group of settings 506 can include settings for changing a following distance (e.g., a distance between the vehicle and a second vehicle in front of the vehicle), settings for modifying sounds or indicators associated with warnings presented by the vehicle while the feature is in use, and/or any other suitable settings. Although not shown in FIG. 5A, in some embodiments, user interface 500 can include any suitable information or explanations associated with settings. For example, in some embodiments, a following distance setting included in group of settings 506 can include information (e.g., a text bubble, and/or information presented in any other suitable manner) that indicates an explanation of the setting, consequences of changing the setting, and/or any other suitable information.

Turning to FIG. 5B, an example 550 of a user interface for presenting safety information related to the autonomous or semi-autonomous feature is shown in accordance with some embodiments of the disclosed subject matter. As illustrated, in some embodiments, user interface 550 can include information 552, which can indicate any suitable safety information. For example, in some embodiments, information 552 can indicate a level of awareness required from a driver of the vehicle (e.g., the driver must keep their hands on the steering wheel, the driver must always be looking at the road, the driver must be ready to take control of the vehicle when indicated, the driver must respond to traffic lights, and/or any other suitable information). As another example, in some embodiments, information 552 can indicate weather conditions or road conditions in which the autonomous or semi-autonomous feature is not to be used (e.g., in rain, in traffic driving below a predetermined speed limit, on non-highway roads, on highways, and/or in any other suitable conditions).

Referring back to FIG. 2, at block 208, process 200 can determine that the vehicle is in motion and that the autonomous or semi-autonomous feature identified at block 204 is available for use. In some embodiments, process 200 can determine that the autonomous or semi-autonomous feature is available for use based on any suitable information or criteria. For example, in an instance where the feature can only be used during particular weather conditions (e.g., no fog, no rain, and/or any other suitable weather conditions), process 200 can determine that the weather criteria are satisfied. As another example, in an instance where the feature can only be used when the vehicle is being driven above a predetermined speed or below a predetermined speed, process 200 can determine that the speed criteria are satisfied. As yet another example, in an instance where the feature can only be used when the vehicle is on a particular type of road (e.g., on a highway, on a non-highway, and/or any other suitable type of road), process 200 can determine that the vehicle is currently on the particular type of road. As still another example, in an instance where the feature can only be used when particular visual criteria are satisfied (e.g., lane lines are clearly visible, road signs are clearly visible, and/or any other suitable visual criteria), process 200 can determine that the visual criteria are currently satisfied using camera images from any suitable cameras associated with the vehicle. In some embodiments, process 200 can determine whether the criteria have been satisfied using any suitable information, such as information from a lidar system associated with the vehicle, information from a radar system associated with the vehicle, information from one or more cameras associated with the vehicle, information from a speedometer associated with the vehicle, and/or any other suitable systems or sensors associated with the vehicle. For example, in some embodiments, process 200 can perform a system diagnostic check to determine that all sensors used by the autonomous or semi-autonomous feature are functioning properly for safe use of the feature.
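
Block 208 amounts to AND-ing a handful of context checks. A compact sketch follows; the specific criteria, field names, and limits are chosen arbitrarily, with each field standing in for a sensor or data source named above:

```python
from dataclasses import dataclass

# Hypothetical availability check for block 208. Field names and limits
# are invented; each one stands in for a sensor or data source above.
@dataclass
class VehicleState:
    weather_clear: bool   # cameras / weather data: no fog, no rain
    speed_mph: float      # speedometer
    road_type: str        # map or camera classification, e.g. "highway"
    lanes_visible: bool   # camera system sees lane lines clearly
    sensors_ok: bool      # system diagnostic check passed

def feature_available(state: VehicleState,
                      min_speed: float = 0.0,
                      max_speed: float = 85.0,
                      required_road: str = "highway") -> bool:
    # Every criterion must hold before the feature is offered at block 210.
    return (state.weather_clear
            and min_speed <= state.speed_mph <= max_speed
            and state.road_type == required_road
            and state.lanes_visible
            and state.sensors_ok)
```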

At 210, process 200 can present an indication that the autonomous or semi-autonomous feature identified at block 204 is available and information indicating how to activate the feature. For example, in some embodiments, process 200 can present a message on a display associated with the vehicle (e.g., a heads-up display, and/or any other suitable display) indicating that the feature is available and how to activate the feature (e.g., by pressing a particular button, and/or in any other suitable manner). As another example, in some embodiments, process 200 can present the message as a spoken message using speakers associated with the vehicle.

At 212, process 200 can determine that a driver of the vehicle has activated the feature, and, in response to determining that the feature has been activated, can present at least one user interface presenting additional information relating to the feature. In some embodiments, process 200 can determine that the feature has been activated using any suitable information and/or technique(s). For example, in some embodiments, process 200 can determine that a selectable input or button (e.g., on a steering wheel of the vehicle, on a dashboard of the vehicle, a selectable input presented on a user interface of a display of the vehicle, and/or any other suitable selectable input or button) associated with the feature has been selected. In some embodiments, the information presented in response to determining that the feature has been activated can be presented while the vehicle is in motion. Additionally, in some embodiments, the information presented can include information relevant to a current context of the vehicle, as shown in and described below in connection with FIGS. 6A and 6B.

Turning to FIG. 6A, an example 600 of a user interface for presenting information related to changing a setting associated with an autonomous or semi-autonomous feature of a vehicle is shown in accordance with some embodiments of the disclosed subject matter. For example, as illustrated in FIG. 6A, in some embodiments, user interface 600 can include information related to changing a particular setting associated with the feature, such as a following distance between the vehicle and a second vehicle in front of the vehicle. Continuing with this example, user interface 600 can include a first portion 610 that indicates a current setting of the feature, and a second portion 620 that can indicate consequences of changing the setting. As a more particular example, first portion 610 can indicate a current position 602 of the vehicle and a current position 604 of a second vehicle in front of the vehicle at a current following distance (e.g., 150 feet, 200 feet, and/or any other suitable following distance). Continuing with this example, second portion 620 can indicate a current position 602 of the vehicle and a predicted position 624 of the second vehicle in front of the vehicle if the following distance setting were changed to a different value (e.g., 140 feet, 190 feet, and/or any other suitable distance). Note that following distance is used merely as an example, and, in some embodiments, user interface 600 can present information related to changing any other suitable setting associated with an autonomous or semi-autonomous feature of the vehicle.

Turning to FIG. 6B, an example 650 of a user interface for presenting information indicating objects currently detected by sensors of the vehicle during use of the autonomous or semi-autonomous feature is shown in accordance with some embodiments of the disclosed subject matter. As illustrated, user interface 650 can include current position 602 of the vehicle, as well as other objects that have been detected, such as a second vehicle 652, a third vehicle 654, and/or any other suitable objects. Additionally or alternatively, in some embodiments, user interface 650 can include indications 656 and/or 658 that indicate sensors or other systems of the vehicle used to detect second vehicle 652 and/or third vehicle 654, respectively. For example, in some embodiments, indications 656 and/or 658 can indicate that objects were detected using a radar system, a camera of the vehicle (e.g., a front camera, a rear camera, a side camera, and/or any other suitable camera), a lidar system, and/or any other suitable sensors or systems.

Note that the information presented in user interfaces 600 and 650 of FIGS. 6A and 6B is shown merely as an example, and, in some embodiments, process 200 can present any other suitable information related to an activated autonomous or semi-autonomous feature of the vehicle at block 212. For example, in some embodiments, process 200 can present information indicating why a particular feature is available (e.g., that current weather conditions are clear, that lane lines are clearly visible by a camera system of the vehicle, and/or any other suitable information). As another example, in some embodiments, process 200 can present information indicating objects to pay attention to, such as vehicles in a blind spot, a vehicle changing lanes, and/or any other suitable objects or other information.

Note that, in some embodiments, a driver may be required to acknowledge any of the user interfaces described above in connection with blocks 206 and/or 212. For example, in some embodiments, a user interface may include a selectable input that must be selected for process 200 to determine that the driver has viewed the information included in the user interface related to the feature.

Referring back to FIG. 2, at 214, process 200 can determine that a driver of the vehicle has been presented with all available appropriate information (or more than an adequate amount of information) related to the autonomous or semi-autonomous feature. In some embodiments, process 200 can determine that all available appropriate information related to the feature (or more than an adequate amount of available information) has been presented based on any suitable information. For example, in some embodiments, process 200 can determine that information related to the feature to be presented while the vehicle is stationary (e.g., as described above in connection with block 206) and/or information related to the feature to be presented while the vehicle is in motion (e.g., as described above in connection with block 212) has been presented. As a more particular example, in some embodiments, process 200 can determine that all available information to be presented while the vehicle is stationary has been presented when a driver has been presented with and/or has acknowledged in any suitable manner all user interfaces presenting information related to the feature that are to be presented while the vehicle is stationary (or more than a predetermined number of user interfaces that are to be presented while the vehicle is stationary have been presented). As another more particular example, in some embodiments, process 200 can determine that all available information to be presented while the vehicle is in motion has been presented when a vehicle has been driven in a restricted mode while the feature has been activated for more than a predetermined time (e.g., more than ten minutes, more than an hour, and/or any other suitable time) and/or for more than a predetermined distance (e.g., more than ten miles, more than fifty miles, and/or any other suitable distance). Note that, in some such embodiments, the predetermined time or distance can be determined based on the feature. For example, in some embodiments, a predetermined distance can be relatively higher for a feature to be used during highway driving relative to a feature to be used in heavy traffic. As yet another more particular example, in some embodiments, process 200 can determine that all information to be presented while the vehicle is in motion has been presented when all user interfaces associated with a feature have been presented and/or when all information associated with the feature has been highlighted (or when more than a particular percentage of user interfaces have been presented and/or more than a particular percentage of information associated with the feature has been highlighted). As a specific example, in an instance where a particular feature is associated with two user interfaces to be presented while the vehicle is in motion and while the feature is being used, process 200 can determine that all available information related to the feature has been presented in response to determining that the two user interfaces have been presented.
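
One way to read block 214: the restricted mode is "complete" once the driver has both seen every in-motion user interface and logged enough time or distance with the feature active, with thresholds that vary per feature. A hypothetical encoding of that reading, with all numbers invented:

```python
# Hypothetical completion check for block 214. THRESHOLDS encodes the
# per-feature time/distance idea (highway features need more miles than
# stop-and-go traffic features); all numbers are invented examples.
THRESHOLDS = {  # feature -> (minutes required, miles required)
    "highway_assist": (60, 50),
    "traffic_jam_assist": (10, 5),
}

def restricted_mode_complete(feature: str, minutes_used: float,
                             miles_driven: float,
                             uis_seen: int, uis_total: int) -> bool:
    min_minutes, min_miles = THRESHOLDS.get(feature, (30, 20))
    enough_driving = minutes_used > min_minutes or miles_driven > min_miles
    return enough_driving and uis_seen >= uis_total
```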

In some embodiments, in response to determining that information related to the feature has been presented, process 200 can present an indication that information related to the feature has been presented. For example, in some embodiments, process 200 can present a message on a display associated with the vehicle (e.g., a dashboard display, and/or any other suitable display) indicating that all information related to the feature has been presented. As another example, in some embodiments, process 200 can present a spoken message indicating that all information related to the feature has been presented using speakers associated with the vehicle. Additionally, in some embodiments, process 200 can present information indicating how to activate a restricted mode associated with the feature again (e.g., to be presented with user interfaces and/or information while the vehicle is in motion and while the feature is in use, as described above in connection with block 212).

At 216, process 200 can store an indication that the driver has been presented with all available information (or more than a particular percentage of information) related to the autonomous or semi-autonomous feature of the vehicle. For example, in some embodiments, process 200 can transmit a message to a server (e.g., server 302 as shown in and described below in connection with FIG. 3) that includes an indication of an identity of the driver and an indication of the feature, and the server can store the indication that the driver has been presented with information related to the feature in a database stored on the server. In some embodiments, the indication that the driver has been presented with information related to the feature can be used to allow the driver to use the feature while not in a restricted mode (that is, while in an operational mode), as described above in more detail in connection with FIG. 1.
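
Block 216 is then just a write of the completion record, shown here against the hypothetical `PRESENTED` table from the earlier sketch; in the patent's architecture the write would go to server 302 instead, so the record follows the driver across vehicles:

```python
# Hypothetical counterpart to block 216: mark the walkthrough complete so
# the qualification check of FIG. 1 can later pass for this driver.
# In a real deployment this record would be transmitted to server 302.
def record_completion(driver_id: str, feature: str, feature_version: int) -> None:
    PRESENTED[(driver_id, feature)] = {
        "fraction_seen": 1.0,
        "version": feature_version,
    }
```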

Turning to FIG. 3, an example 300 of hardware for controlling access to vehicle features that can be used in accordance with some embodiments of the disclosed subject matter is shown. As illustrated, hardware 300 can include a server 302, a communication network 304, and/or one or more vehicle computers 306, such as vehicle computers 308 and 310.

Server(s) 302 can be any suitable server(s) for storing any suitable data, programs, and/or any other suitable information. For example, in some embodiments, server(s) 302 can store indications of drivers who are qualified to activate particular autonomous or semi-autonomous features of vehicles. As a more particular example, in some embodiments, server(s) 302 can store a database that indicates users that have previously driven a particular model or type of a vehicle. As another more particular example, in some embodiments, server(s) 302 can store a database that indicates users that have previously been trained on particular features of a particular model or type of a vehicle. As another example, in some embodiments, server(s) 302 can store information used to present information related to a particular autonomous or semi-autonomous feature. As a more particular example, in some embodiments, server(s) 302 can store user interfaces used for presenting information related to a particular feature, and can transmit instructions to present the user interfaces to one or more vehicle computers.

Communication network 304 can be any suitable combination of one or more wired and/or wireless networks in some embodiments. For example, communication network 304 can include any one or more of the Internet, an intranet, a wide-area network (WAN), a local-area network (LAN), a wireless network, a digital subscriber line (DSL) network, a frame relay network, an asynchronous transfer mode (ATM) network, a virtual private network (VPN), and/or any other suitable communication network. Vehicle computers 306 can be connected by one or more communications links (e.g., communications links 312) to communication network 304 that can be linked via one or more communications links (e.g., communications links 314) to server(s) 302. The communications links can be any communications links suitable for communicating data among vehicle computers 306 and server(s) 302 such as network links, dial-up links, wireless links, hard-wired links, any other suitable communications links, or any suitable combination of such links.

Vehicle computers 306 can include any one or more computing devices operating on a vehicle, such as a car, truck, etc. In some embodiments, vehicle computers 306 can perform any suitable functions, such as the functions described above in connection with FIGS. 1 and 2. For example, in some embodiments, vehicle computers 306 can determine identifying information of a driver of the vehicle, inhibit particular features of the vehicle based on the identifying information of the driver, present user interfaces related to a particular feature of the vehicle, and/or perform any other suitable functions.

Although server(s) 302 is illustrated as one device, the functions performed by server(s) 302 can be performed using any suitable number of devices in some embodiments. For example, in some embodiments, multiple devices can be used to implement the functions performed by server(s) 302.

Although two vehicle computers 308 and 310 are shown in FIG. 3 to avoid over-complicating the figure, any suitable number of vehicle computers, and/or any suitable types of vehicle computers, can be used in some embodiments.

Server(s) 302 and vehicle computers 306 can be implemented using any suitable hardware in some embodiments. For example, in some embodiments, devices 302 and 306 can be implemented using any suitable general purpose computer or special purpose computer. For example, a vehicle computer may be implemented using a special purpose computer. Any such general purpose computer or special purpose computer can include any suitable hardware. For example, as illustrated in example hardware 400 of FIG. 4, such hardware can include hardware processor 402, memory and/or storage 404, an input device controller 406, an input device 408, display/audio drivers 410, display and audio output circuitry 412, communication interface(s) 414, an antenna 416, and a bus 418.

Hardware processor 402 can include any suitable hardware processor, such as a microprocessor, a micro-controller, digital signal processor(s), dedicated logic, and/or any other suitable circuitry for controlling the functioning of a general purpose computer or a special purpose computer in some embodiments. In some embodiments, hardware processor 402 can be controlled by a server program stored in memory and/or storage of a server, such as server(s) 302. For example, in some embodiments, the server program can cause hardware processor 402 to determine whether a particular driver is qualified to activate a particular autonomous or semi-autonomous feature, transmit instructions to inhibit a feature in response to determining that a driver is not qualified to activate the feature, and/or perform any other suitable functions. In some embodiments, hardware processor 402 can be controlled by a computer program stored in memory and/or storage 404 of vehicle computer 306. For example, the computer program can cause hardware processor 402 to inhibit a particular feature of a vehicle if a driver is determined to be not qualified to activate the feature, present user interfaces related to a feature, and/or perform any other suitable functions.

Memory and/or storage 404 can be any suitable memory and/or storage for storing programs, data, and/or any other suitable information in some embodiments. For example, memory and/or storage 404 can include random access memory, read-only memory, flash memory, hard disk storage, optical media, and/or any other suitable memory.

Input device controller 406 can be any suitable circuitry for controlling and receiving input from one or more input devices 408 in some embodiments. For example, input device controller 406 can be circuitry for receiving input from a touchscreen, from a keyboard, from one or more buttons, from a voice recognition circuit, from a microphone, from a camera, from an optical sensor, from an accelerometer, from a temperature sensor, from a near field sensor, from a pressure sensor, from an encoder, and/or any other type of input device.

Display/audio drivers 410 can be any suitable circuitry for controlling and driving output to one or more display/audio output devices 412 in some embodiments. For example, display/audio drivers 410 can be circuitry for driving a touchscreen, a flat-panel display, a cathode ray tube display, a projector, a speaker or speakers, and/or any other suitable display and/or presentation devices.

Communication interface(s) 414 can be any suitable circuitry for interfacing with one or more communication networks (e.g., computer network 304). For example, interface(s) 414 can include network interface card circuitry, wireless communication circuitry, and/or any other suitable type of communication network circuitry.

Antenna 416 can be any suitable one or more antennas for wirelessly communicating with a communication network (e.g., communication network 304) in some embodiments. In some embodiments, antenna 416 can be omitted.

Bus 418 can be any suitable mechanism for communicating between two or more components 402, 404, 406, 410, and 414 in some embodiments.

Any other suitable components can be included in hardware 400 in accordance with some embodiments.

In some embodiments, at least some of the above described blocks of the processes of FIGS. 1 and 2 can be executed or performed in any order or sequence not limited to the order and sequence shown in and described in connection with the figures. Also, some of the above blocks of FIGS. 1 and 2 can be executed or performed substantially simultaneously where appropriate or in parallel to reduce latency and processing times. Additionally or alternatively, some of the above described blocks of the processes of FIGS. 1 and 2 can be omitted.

In some embodiments, any suitable computer readable media can be used for storing instructions for performing the functions and/or processes herein. For example, in some embodiments, computer readable media can be transitory or non-transitory. For example, non-transitory computer readable media can include media such as non-transitory forms of magnetic media (such as hard disks, floppy disks, and/or any other suitable magnetic media), non-transitory forms of optical media (such as compact discs, digital video discs, Blu-ray discs, and/or any other suitable optical media), non-transitory forms of semiconductor media (such as flash memory, electrically programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and/or any other suitable semiconductor media), any suitable media that is not fleeting or devoid of any semblance of permanence during transmission, and/or any suitable tangible media. As another example, transitory computer readable media can include signals on networks, in wires, conductors, optical fibers, circuits, any suitable media that is fleeting and devoid of any semblance of permanence during transmission, and/or any suitable intangible media.
 

skyote

Well-Known Member
Joined
Mar 12, 2019
Threads
55
Messages
2,725
Reaction score
5,647
Location
Austin, TX
Vehicles
Jeeps, 2500HD Duramax, R1S Preorder (Dec 2018)
It goes super deep and gets me excited, not related to sound but infotainment anyways. Its Long so if you want to see it click below.



DETAILED DESCRIPTION
In accordance with various embodiments, mechanisms (which can include methods, systems, and media) for controlling access to vehicle features are provided.

In some embodiments, the mechanisms described herein can determine whether an autonomous or semi-autonomous feature of a vehicle can be activated by a driver of the vehicle. In some embodiments, an autonomous or semi-autonomous feature can be any suitable feature that automates steering of a vehicle, acceleration/deceleration of a vehicle, and/or any other suitable function of a vehicle. For example, in some embodiments, an autonomous or semi-autonomous feature of a vehicle can control steering of the vehicle if the vehicle begins to drift out of a lane, can cause the vehicle to brake in response to detecting an object in front of the vehicle, can adjust a speed of the vehicle while the vehicle is utilizing cruise control, can park the vehicle, can control steering and/or acceleration/deceleration of the vehicle while driving in traffic, and/or can perform any other suitable autonomous or semi-autonomous function.

In some embodiments, in response to receiving an indication from a driver of a vehicle that the driver wants to activate a particular autonomous or semi-autonomous feature, the mechanisms can determine whether the driver is qualified to use the indicated feature. In some embodiments, in response to determining that the driver is not qualified to activate the indicated feature, the mechanisms can cause the feature to be inhibited or to remain inactivated. Conversely, in some embodiments, in response to determining that the driver is qualified to activate the indicated feature, the mechanisms can cause the feature to be activated.

In some embodiments, the mechanisms can determine whether a driver of a vehicle is qualified to activate a particular autonomous or semi-autonomous feature using any suitable technique or combination of techniques. For example, in some embodiments, the mechanisms can determine whether the driver is included in a group of drivers who are qualified to activate the feature. As a more particular example, in some embodiments, the mechanisms can determine whether a driver is included in the group of drivers who are qualified to activate the feature based on any suitable information or techniques, such as by determining whether an identifier associated with the driver is included in a group of identifiers corresponding to drivers qualified to activate the feature. As a specific example, in some embodiments, the mechanisms can determine whether an image capturing a face of a driver is included in a group of images of faces of qualified drivers. As another specific example, in some embodiments, the mechanisms can determine identifying information corresponding to the driver based on a key fob used to access the vehicle, and can determine whether a driver associated with the identifying information is qualified to activate the feature. As another example, in some embodiments, the mechanisms can administer a test related to the autonomous or semi-autonomous feature to a driver. As a more particular example, in some embodiments, the mechanisms can present one or more user interfaces that include questions relevant to the feature (e.g., what road conditions the feature can be used while driving on, what weather conditions are required for sensors of the vehicle to provide accurate information while using the feature, and/or any other suitable questions), and can determine that the driver is qualified to activate the feature if the driver answers more than a predetermined percentage (e.g., more than 70%, and/or any other suitable percentage) of the questions correctly). As yet another example, in some embodiments, the mechanisms can determine whether a driver has been presented with particular information related to the feature.

In some embodiments, the mechanisms can additionally present information related to an autonomous or semi-autonomous feature of a vehicle. In some embodiments, the information can include information to be presented while a vehicle is stationary and/or information that is to be presented while a vehicle is in motion and while a feature is activated (that is, while the feature is being used in a restricted mode), as described in more detail in connection with FIG. 2. In some embodiments, information presented while a vehicle is stationary can include information such as information related to settings associated with a feature, information indicating techniques to activate a feature, information indicating warnings associated with a feature, and/or any other suitable information. In some embodiments, information presented while a vehicle is in motion and while a feature is activated can include information such as indications of objects detected by sensors of the vehicle in connection with use of the feature, demonstrations of consequences of changing settings associated with the feature, and/or any other suitable information. In some embodiments, the mechanisms can cause an indication that a driver has been presented with all information related to a particular feature to be stored. In some such embodiments, the mechanisms can then allow the feature to be used in an operational mode, as described above.

Turning to FIG. 1, an example 100 of a process for controlling access to vehicle features is shown in accordance with some embodiments of the disclosed subject matter. In some embodiments, blocks of process 100 can be implemented on a computer associated with a vehicle, as shown in and described below in connection with FIG. 3.

Process 100 can begin by identifying a driver of a vehicle at 102. In some embodiments, an identity of the driver can be determined in any suitable manner and based on any suitable information. For example, in some embodiments, the driver can log into a user account corresponding to the driver, wherein the user account is associated with a manufacturer of the vehicle, an entity providing the vehicle (e.g., a rental car company, etc.), and/or any other suitable entity. As another example, in some embodiments an identity of the driver can be determined based on information associated with a key fob used to unlock and/or start the vehicle. As a more particular example, in some embodiments, an identity of the driver can be determined based on an identifier used by the key fob to unlock and/or start the vehicle (e.g., a personal identification number, or PIN, and/or any other suitable identifier). As yet another example, in some embodiments, an identity of the driver can be determined by capturing an image of a face of the driver and using any suitable image recognition techniques or facial recognition techniques to identify the driver. As still another example, in some embodiments, an identity of the driver can be determined using any other suitable biometric information (e.g., a fingerprint, and/or any other suitable biometric information). As still another example, in some embodiments, an identity of the driver can be determined using information from the driver's phone or other mobile device (e.g., via a BLUETOOTH connection between the phone and the vehicle, and/or in any other suitable manner).

At 104, process 100 can receive an indication that the driver of the vehicle wants to activate an autonomous or semi-autonomous feature of the vehicle. For example, in some embodiments, an autonomous or semi-autonomous feature of the vehicle can include any suitable feature where the vehicle automatically adjusts speed and/or steering of the vehicle. As a more particular example, in some embodiments, an autonomous or semi-autonomous feature of the vehicle can include adaptive cruise control, lane keeping assistance, automatic steering and speed control in particular conditions (e.g., in traffic driving below a predetermined speed limit, while driving on a highway, and/or any other suitable conditions), and/or one or more other suitable automated or semi-automated features. In some embodiments, process 100 can receive the indication that the driver of the vehicle wants to activate a particular feature in any suitable manner. For example, in some embodiments, process 100 can receive the indication by determining that a particular button (e.g., located on a steering wheel of the vehicle, located on a dashboard of the vehicle, and/or any other suitable button) has been pressed. As another example, in some embodiments, process 100 can determine that a selectable input to activate the feature has been selected, for example, on a user interface presented on a display of the vehicle (e.g., a dashboard display, and/or any other suitable display).

At 106, process 100 can determine whether the driver of the vehicle is qualified to activate the feature indicated at block 104. Additionally or alternatively, in some embodiments, process 100 can determine whether the feature has been updated and/or changed since a driver was last indicated as being qualified to activate the feature. In some embodiments, process 100 can determine whether the driver of the vehicle is qualified to access the feature indicated at block 104 using any suitable technique or combination of techniques. For example, in some embodiments, process 100 can determine whether identifying information corresponding to the driver is included in a database or list of drivers qualified to activate a particular feature. As a more particular example, in some embodiments, process 100 can use identifying information such as an image of a face of the driver, an identifier associated with a key fob used to access the vehicle, identifying information related to a BLUETOOTH connection between a mobile device of the driver and the vehicle, and/or any other suitable identifying information as well as an indication of the feature as inputs to a database that indicates whether the driver corresponding to the identifying information is qualified to activate the indicated feature.
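
A minimal sketch of this block-106 lookup, assuming an in-memory mapping in place of whatever database the patent contemplates; the keys, feature names, and version-tracking scheme are hypothetical:

```python
# (driver ID, feature) pairs marked as qualified; illustrative data only.
QUALIFIED = {("driver-alice", "adaptive_cruise_control")}

# Track feature versions so a qualification can lapse after an update.
CURRENT_FEATURE_VERSION = {"adaptive_cruise_control": 3}
VERSION_WHEN_QUALIFIED = {("driver-alice", "adaptive_cruise_control"): 3}

def is_qualified(driver_id: str, feature: str) -> bool:
    """Qualified only if listed and not re-gated by a feature update."""
    key = (driver_id, feature)
    if key not in QUALIFIED:
        return False
    return (VERSION_WHEN_QUALIFIED.get(key, -1)
            >= CURRENT_FEATURE_VERSION.get(feature, 0))

print(is_qualified("driver-alice", "adaptive_cruise_control"))  # -> True
```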

As another example, in some embodiments, process 100 can determine whether the driver of the vehicle is qualified to access the feature indicated at block 104 by administering a test to the driver prior to allowing the driver to activate the feature. As a more particular example, in some embodiments, process 100 can present one or more questions (e.g., via user interfaces presented on a display of the vehicle) that are related to the feature. As a specific example, in some embodiments, process 100 can present a question that asks about road conditions (e.g., whether the feature is to be used only on a highway, whether the feature is to be used only while driving below a predetermined speed limit, and/or any other suitable road conditions) the feature is to be used with. As another specific example, in some embodiments, process 100 can present a question that asks about weather conditions (e.g., whether the feature can be used in light rain, whether the feature can be used in heavy fog, whether the feature can only be used on a clear day, and/or any other suitable weather conditions) the feature is to be used during. As yet another specific example, in some embodiments, process 100 can present a question that asks about a level of attention required of the driver while using the feature (e.g., whether the driver can safely take their hands off the steering wheel, whether the driver must be prepared to respond to traffic lights, and/or any other suitable question related to a level of attention). In some embodiments, answers to the questions can be received in any suitable manner, such as via a selection of one answer from a group of potential answers, via a spoken answer received by a microphone associated with the vehicle, and/or in any other suitable manner. In some embodiments, process 100 can determine that the driver of the vehicle is qualified to activate the feature if the driver responds correctly to more than a predetermined number or percentage of questions. Conversely, in some embodiments, process 100 can determine that the driver of the vehicle is not qualified to activate the feature if the driver responds correctly to fewer than a predetermined number or percentage of questions. In some such embodiments, process 100 can present a message indicating available information related to the feature, as described in more detail below in connection with FIG. 2.
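
As a sketch of this test path, with hypothetical questions and the 70% figure from the text's example used as the pass threshold:

```python
QUESTIONS = [
    {"prompt": "Can this feature be used in heavy fog?", "answer": "no"},
    {"prompt": "May you take your hands off the wheel?", "answer": "no"},
    {"prompt": "Is this feature limited to highway driving?", "answer": "yes"},
]

def passes_test(responses: list[str], pass_fraction: float = 0.7) -> bool:
    """True if the driver answers more than pass_fraction correctly."""
    correct = sum(1 for question, response in zip(QUESTIONS, responses)
                  if response.strip().lower() == question["answer"])
    return correct / len(QUESTIONS) > pass_fraction

print(passes_test(["no", "no", "yes"]))   # -> True (3/3 correct)
print(passes_test(["yes", "no", "yes"]))  # -> False (2/3 is not > 70%)
```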

As yet another example, in some embodiments, process 100 can determine whether the driver of the vehicle is qualified to access the feature indicated at block 104 by determining whether the driver has previously been presented with information related to the feature. In some such embodiments, process 100 can determine that the driver is qualified to access the feature if the driver has been presented with all of the available information (or more than a particular percentage of the available information) related to the feature. Conversely, in some embodiments, process 100 can determine that the driver is not qualified to access the feature if the driver has not been presented with all of the information related to the feature (or has been presented with less than a particular percentage of the available information). Additionally or alternatively, in some embodiments, process 100 can determine that the driver is not qualified to activate the feature if the feature has been changed and/or updated in any suitable manner since the driver was previously presented with the information. In some embodiments, process 100 can determine whether the driver has been presented with information related to the feature using any suitable technique or combination of techniques. For example, in some embodiments, process 100 can query a database using the identifying information corresponding to the driver (e.g., as described above in connection with block 102) and the indication of the feature selected at block 104. In some such embodiments, the database can store indications of drivers who have previously been presented with information related to different autonomous or semi-autonomous features.
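
A sketch of this presentation check, again assuming a simple record store keyed by (driver, feature); the field names are invented for illustration:

```python
# Fraction of available information each driver has been shown per feature,
# plus the feature version current at the time; illustrative data only.
PRESENTED = {
    ("driver-alice", "lane_keeping"): {"seen_fraction": 1.0, "feature_version": 2},
}
CURRENT_VERSION = {"lane_keeping": 2}

def has_seen_required_info(driver_id: str, feature: str,
                           required_fraction: float = 1.0) -> bool:
    record = PRESENTED.get((driver_id, feature))
    if record is None:
        return False  # never presented with any information
    if record["feature_version"] < CURRENT_VERSION.get(feature, 0):
        return False  # the feature changed since the information was shown
    return record["seen_fraction"] >= required_fraction

print(has_seen_required_info("driver-alice", "lane_keeping"))  # -> True
```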

If, at 106, process 100 determines that the driver is not qualified to access the feature (“no” at 106), process 100 can proceed to block 108 and can inhibit activation of the feature. For example, in an instance where the feature indicated at block 104 uses automated steering of the vehicle, process 100 can inhibit automation of steering, thereby requiring that the driver of the vehicle maintain control of steering of the vehicle. As another example, in an instance where the feature indicated at block 104 uses automated acceleration/deceleration of the vehicle, process 100 can inhibit automation of acceleration/deceleration, thereby requiring that the driver of the vehicle maintain control of acceleration/deceleration of the vehicle.

Process 100 can proceed to block 110 and can present a message indicating that the feature has been blocked. In some embodiments, the message can indicate any suitable information. For example, in some embodiments, the message can indicate that the driver has not been recognized as a driver qualified to use the feature based on the identifying information. As another example, in some embodiments, the message can indicate that the driver did not pass a test related to the feature (e.g., as described above in connection with block 106). As yet another example, in some embodiments, the message can indicate that the feature has been blocked because the driver has not yet been presented with all available information (or has not yet been presented with more than a particular percentage of available information) related to the feature. As still another example, in some embodiments, the message can indicate that aspects of the feature have changed and/or have been updated in any suitable manner, and that the feature has been blocked because the driver has not been indicated as qualified to use the feature since the feature was updated. As a more particular example, in some embodiments, the message can indicate a particular aspect of the feature that has been changed (e.g., that particular settings have changed, that particular warning tones or indications have changed, and/or any other suitable changes). In some embodiments, the message can additionally include information related to use of the feature. For example, in some embodiments, the message can include a selectable input that, when selected, causes an information presentation related to the feature to be initiated. More detailed information and techniques related to presenting information related to a feature of a vehicle are shown in and described below in connection with FIG. 2.

Referring back to block 106, if, at 106, process 100 determines that the driver is qualified to activate the feature indicated at block 104 (“yes” at 106), process 100 can proceed to block 112 and can activate the feature. For example, in an instance where the feature indicated at block 104 uses automated steering of the vehicle, process 100 can cause a vehicle computer associated with the vehicle to begin controlling steering of the vehicle. As another example, in an instance where the feature indicated at block 104 uses automated acceleration/deceleration of the vehicle, process 100 can cause a vehicle computer associated with the vehicle to begin controlling acceleration/deceleration of the vehicle.

Note that, in some embodiments, process 100 can determine whether the feature indicated at block 104 is a safety feature that is to always be activated. For example, in some embodiments, process 100 can determine whether the feature relates to collision detection, automatic emergency braking, and/or any other suitable safety features. In some such embodiments, process 100 can determine that the feature is to be activated regardless of whether the driver has been determined to be qualified to use the feature or not.
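
Pulling blocks 106 through 112 together with this safety-feature carve-out, a rough sketch might look like the following; the feature names and the stubbed qualification check are placeholders, not anything specified by the patent:

```python
ALWAYS_ON_SAFETY = {"collision_detection", "automatic_emergency_braking"}

def is_qualified(driver_id: str, feature: str) -> bool:
    """Stub standing in for the block-106 checks sketched above."""
    return False

def handle_activation_request(driver_id: str, feature: str) -> str:
    if feature in ALWAYS_ON_SAFETY:
        return "activated"   # safety features are never inhibited
    if is_qualified(driver_id, feature):
        return "activated"   # block 112: hand control to the vehicle computer
    return "inhibited"       # blocks 108/110: block the feature and notify

print(handle_activation_request("driver-bob", "automatic_emergency_braking"))
print(handle_activation_request("driver-bob", "lane_keeping"))  # -> inhibited
```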

Turning to FIG. 2, an example 200 of a process for presenting information to a driver of a vehicle related to an autonomous and/or semi-autonomous feature of the vehicle is shown in accordance with some embodiments of the disclosed subject matter. In some embodiments, blocks of process 200 can be executed on a vehicle computer associated with the vehicle.

Process 200 can begin by identifying a driver of a vehicle at 202. As described above in connection with block 102 of FIG. 1, process 200 can identify the driver of the vehicle using any suitable technique or combination of techniques. For example, in some embodiments, the driver can log into a user account corresponding to the driver, wherein the user account is associated with a manufacturer of the vehicle, an entity providing the vehicle (e.g., a rental car company, etc.), and/or any other suitable entity. As another example, in some embodiments, an identity of the driver can be determined based on information associated with a key fob used to unlock and/or start the vehicle. As a more particular example, in some embodiments, an identity of the driver can be determined based on an identifier used by the key fob to unlock and/or start the vehicle (e.g., a personal identification number, or PIN, and/or any other suitable identifier).

At 204, process 200 can identify an autonomous and/or semi-autonomous feature of the vehicle. For example, as described above, in some embodiments, the feature can be any suitable feature that uses automated control of steering of the vehicle and/or acceleration/deceleration of the vehicle. As a more particular example, in some embodiments, the feature can be a feature that relates to driving the vehicle in particular conditions (e.g., in traffic below a predetermined speed limit, on a highway, and/or in any other suitable condition), lane keeping assistance, adaptive cruise control, and/or any other suitable feature. In some embodiments, process 200 can identify a feature for which the driver of the vehicle has not yet been presented with information and/or has not yet been presented with all available information (or more than a particular percentage of available information). Additionally or alternatively, in some embodiments, process 200 can identify a feature that has been changed and/or updated since the driver previously was presented with information related to the feature.

In some embodiments, process 200 can identify the feature using any suitable technique or combination of techniques. For example, in some embodiments, process 200 can transmit a query to a server (e.g., server 302 as shown in and described below in connection with FIG. 3) that includes identifying information corresponding to the driver, and can receive a response to the query from the server that indicates a feature for which the driver has not been presented with information, has not yet been presented with all available information (or more than a particular percentage of available information), and/or has been updated since the driver was last presented with information.

At 206, process 200 can present at least one user interface that presents information related to the identified feature. In some embodiments, process 200 can present the at least one user interface while the vehicle is stationary. In some embodiments, the user interface can include any suitable information, such as explanations of settings related to the feature, how to change settings related to the feature, an explanation of how to activate the feature, an explanation of how to deactivate the feature, an indication of any suitable warnings about the feature, information indicating objects the vehicle will not detect and/or respond to while the feature is in use (e.g., traffic lights, vehicles merging into a lane, and/or any other suitable objects), an illustration of visuals that indicate where information will appear on a display of the vehicle while the feature is in use, and/or any other suitable information. FIGS. 5A and 5B (described below) show examples of user interfaces that can be presented at block 206. Note that FIGS. 5A and 5B are shown and described below merely as examples, and, in some embodiments, any suitable user interfaces that present information related to the identified feature can be presented at block 206.

Turning to FIG. 5A, an example 500 of a user interface for presenting information about settings related to the feature is shown in accordance with some embodiments of the disclosed subject matter. As illustrated, in some embodiments, user interface 500 can provide information about settings related to the feature. In some embodiments, user interface 500 can include a menu 502, a feature section 504, and a group of settings 506. In some embodiments, menu 502 can be a menu for accessing any suitable settings, such as settings for one or more autonomous or semi-autonomous features associated with the vehicle, and/or any other suitable settings associated with the vehicle. In some embodiments, feature section 504 can be a menu item corresponding to a particular autonomous or semi-autonomous feature associated with the vehicle. In some embodiments, group of settings 506 can include any suitable settings relevant to the feature indicated in feature section 504. For example, as shown in FIG. 5A, group of settings 506 can include settings for changing a following distance (e.g., a distance between the vehicle and a second vehicle in front of the vehicle), settings for modifying sounds or indicators associated with warnings presented by the vehicle while the feature is in use, and/or any other suitable settings. Although not shown in FIG. 5A, in some embodiments, user interface 500 can include any suitable information or explanations associated with settings. For example, in some embodiments, a following distance setting included in group of settings 506 can include information (e.g., a text bubble, and/or information presented in any other suitable manner) that indicates an explanation of the setting, consequences of changing the setting, and/or any other suitable information.

Turning to FIG. 5B, an example 550 of a user interface for presenting safety information related to the autonomous or semi-autonomous feature is shown in accordance with some embodiments of the disclosed subject matter. As illustrated, in some embodiments, user interface 550 can include information 552, which can indicate any suitable safety information. For example, in some embodiments, information 552 can indicate a level of awareness required from a driver of the vehicle (e.g., the driver must keep their hands on the steering wheel, the driver must always be looking at the road, the driver must be ready to take control of the vehicle when indicated, the driver must respond to traffic lights, and/or any other suitable information). As another example, in some embodiments, information 552 can indicate weather conditions or road conditions in which the autonomous or semi-autonomous feature is not to be used (e.g., in rain, in traffic driving below a predetermined speed limit, on non-highway roads, on highways, and/or in any other suitable conditions).

Referring back to FIG. 2, at block 208, process 200 can determine that the vehicle is in motion and that the autonomous or semi-autonomous feature identified at block 204 is available for use. In some embodiments, process 200 can determine that the autonomous or semi-autonomous feature is available for use based on any suitable information or criteria. For example, in an instance where the feature can only be used during particular weather conditions (e.g., no fog, no rain, and/or any other suitable weather conditions), process 200 can determine that the weather criteria are satisfied. As another example, in an instance where the feature can only be used when the vehicle is being driven above a predetermined speed or below a predetermined speed, process 200 can determine that the speed criteria are satisfied. As yet another example, in an instance where the feature can only be used when the vehicle is on a particular type of road (e.g., on a highway, on a non-highway, and/or any other suitable type of road), process 200 can determine that the vehicle is currently on the particular type of road. As still another example, in an instance where the feature can only be used when particular visual criteria are satisfied (e.g., lane lines are clearly visible, road signs are clearly visible, and/or any other suitable visual criteria), process 200 can determine that the visual criteria are currently satisfied using camera images from any suitable cameras associated with the vehicle. In some embodiments, process 200 can determine whether the criteria have been satisfied using any suitable information, such as information from a lidar system associated with the vehicle, information from a radar system associated with the vehicle, information from one or more cameras associated with the vehicle, information from a speedometer associated with the vehicle, and/or any other suitable systems or sensors associated with the vehicle. For example, in some embodiments, process 200 can perform a system diagnostic check to determine that all sensors used by the autonomous or semi-autonomous feature are functioning properly for safe use of the feature.
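
One way block 208 might combine these criteria, sketched with invented sensor fields and thresholds; the text names weather, speed, road type, visibility, and a diagnostic check only as examples:

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    raining: bool
    foggy: bool
    speed_mph: float
    road_type: str           # e.g., "highway" or "surface"
    lane_lines_visible: bool
    sensors_ok: bool         # result of the system diagnostic check

def feature_available(state: VehicleState,
                      min_speed_mph: float = 45.0,
                      max_speed_mph: float = 85.0,
                      required_road: str = "highway") -> bool:
    """All example criteria from the text must hold at once."""
    return (not state.raining and not state.foggy
            and min_speed_mph <= state.speed_mph <= max_speed_mph
            and state.road_type == required_road
            and state.lane_lines_visible
            and state.sensors_ok)

state = VehicleState(False, False, 65.0, "highway", True, True)
print(feature_available(state))  # -> True
```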

At 210, process 200 can present an indication that the autonomous or semi-autonomous feature identified at block 204 is available and information indicating how to activate the feature. For example, in some embodiments, process 200 can present a message on a display associated with the vehicle (e.g., a heads-up display, and/or any other suitable display) indicating that the feature is available and how to activate the feature (e.g., by pressing a particular button, and/or in any other suitable manner). As another example, in some embodiments, process 200 can present the message as a spoken message using speakers associated with the vehicle.

At 212, process 200 can determine that a driver of the vehicle has activated the feature, and, in response to determining that the feature has been activated, can present at least one user interface presenting additional information relating to the feature. In some embodiments, process 200 can determine that the feature has been activated using any suitable information and/or technique(s). For example, in some embodiments, process 200 can determine that a selectable input or button (e.g., on a steering wheel of the vehicle, on a dashboard of the vehicle, a selectable input presented on a user interface of a display of the vehicle, and/or any other suitable selectable input or button) associated with the feature has been selected. In some embodiments, the information presented in response to determining that the feature has been activated can be presented while the vehicle is in motion. Additionally, in some embodiments, the information presented can include information relevant to a current context of the vehicle, as shown in and described below in connection with FIGS. 6A and 6B.

Turning to FIG. 6A, an example 600 of a user interface for presenting information related to changing a setting associated with an autonomous or semi-autonomous feature of a vehicle is shown in accordance with some embodiments of the disclosed subject matter. For example, as illustrated in FIG. 6A, in some embodiments, user interface 600 can include information related to changing a particular setting associated with the feature, such as a following distance between the vehicle and a second vehicle in front of the vehicle. Continuing with this example, user interface 600 can include a first portion 610 that indicates a current setting of the feature, and a second portion 620 that can indicate consequences of changing the setting. As a more particular example, first portion 610 can indicate a current position 602 of the vehicle and a current position 604 of a second vehicle in front of the vehicle at a current following distance (e.g., 150 feet, 200 feet, and/or any other suitable following distance). Continuing with this example, second portion 620 can indicate a current position 602 of the vehicle and a predicted position 624 of the second vehicle in front of the vehicle if the following distance setting were changed to a different value (e.g., 140 feet, 190 feet, and/or any other suitable distance). Note that following distance is used merely as an example, and, in some embodiments, user interface 600 can present information related to changing any other suitable setting associated with an autonomous or semi-autonomous feature of the vehicle.

Turning to FIG. 6B, an example 650 of a user interface for presenting information indicating objects currently detected by sensors of the vehicle during use of the autonomous or semi-autonomous feature is shown in accordance with some embodiments of the disclosed subject matter. As illustrated, user interface 650 can include current position 602 of the vehicle, as well as other objects that have been detected, such as a second vehicle 652, a third vehicle 654, and/or any other suitable objects. Additionally or alternatively, in some embodiments, user interface 650 can include indications 656 and/or 658 that indicate sensors or other systems of the vehicle used to detect second vehicle 652 and/or third vehicle 654, respectively. For example, in some embodiments, indications 656 and/or 658 can indicate that objects were detected using a radar system, a camera of the vehicle (e.g., a front camera, a rear camera, a side camera, and/or any other suitable camera), a lidar system, and/or any other suitable sensors or systems.

Note that the information presented in user interfaces 600 and 650 of FIGS. 6A and 6B is shown merely as an example, and, in some embodiments, process 200 can present any other suitable information related to an activated autonomous or semi-autonomous feature of the vehicle at block 212. For example, in some embodiments, process 200 can present information indicating why a particular feature is available (e.g., that current weather conditions are clear, that lane lines are clearly visible by a camera system of the vehicle, and/or any other suitable information). As another example, in some embodiments, process 200 can present information indicating objects to pay attention to, such as vehicles in a blind spot, a vehicle changing lanes, and/or any other suitable objects or other information.

Note that, in some embodiments, a driver may be required to acknowledge any of the user interfaces described above in connection with blocks 206 and/or 212. For example, in some embodiments, a user interface may include a selectable input that must be selected for process 200 to determine that the driver has viewed the information included in the user interface related to the feature.

Referring back to FIG. 2, at 214, process 200 can determine that a driver of the vehicle has been presented with all available appropriate information (or more than an adequate amount of information) related to the autonomous or semi-autonomous feature. In some embodiments, process 200 can determine that all available appropriate information related to the feature (or more than an adequate amount of available information) has been presented based on any suitable information. For example, in some embodiments, process 200 can determine that information related to the feature to be presented while the vehicle is stationary (e.g., as described above in connection with block 206) and/or information related to the feature to be presented while the vehicle is in motion (e.g., as described above in connection with block 212) has been presented. As a more particular example, in some embodiments, process 200 can determine that all available information to be presented while the vehicle is stationary has been presented when a driver has been presented with and/or has acknowledged in any suitable manner all user interfaces presenting information related to the feature that are to be presented while the vehicle is stationary (or more than a predetermined number of user interfaces that are to be presented while the vehicle is stationary have been presented). As another more particular example, in some embodiments, process 200 can determine that all available information to be presented while the vehicle is in motion has been presented when a vehicle has been driven in a restricted mode while the feature has been activated for more than a predetermined time (e.g., more than ten minutes, more than an hour, and/or any other suitable time) and/or for more than a predetermined distance (e.g., more than ten miles, more than fifty miles, and/or any other suitable distance). Note that, in some such embodiments, the predetermined time or distance can be determined based on the feature. For example, in some embodiments, a predetermined distance can be higher for a feature to be used during highway driving than for a feature to be used in heavy traffic. As yet another more particular example, in some embodiments, process 200 can determine that all information to be presented while the vehicle is in motion has been presented when all user interfaces associated with a feature have been presented and/or when all information associated with the feature has been highlighted (or when more than a particular percentage of user interfaces have been presented and/or more than a particular percentage of information associated with the feature has been highlighted). As a specific example, in an instance where a particular feature is associated with two user interfaces to be presented while the vehicle is in motion and while the feature is being used, process 200 can determine that all available information related to the feature has been presented in response to determining that the two user interfaces have been presented.
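
A compact sketch of this block-214 completion logic, with per-feature thresholds invented to mirror the highway-versus-traffic example in the text:

```python
# Hypothetical per-feature thresholds; the text only says they can differ.
THRESHOLDS = {
    "highway_assist": {"minutes": 60, "miles": 50, "interfaces": 2},
    "traffic_assist": {"minutes": 10, "miles": 10, "interfaces": 2},
}

def all_motion_info_presented(feature: str, minutes_in_restricted_mode: float,
                              miles_in_restricted_mode: float,
                              interfaces_shown: int) -> bool:
    """True once the restricted-mode time, distance, and UI counts are met."""
    t = THRESHOLDS[feature]
    return (minutes_in_restricted_mode > t["minutes"]
            and miles_in_restricted_mode > t["miles"]
            and interfaces_shown >= t["interfaces"])

print(all_motion_info_presented("traffic_assist", 15, 12, 2))  # -> True
```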

In some embodiments, in response to determining that information related to the feature has been presented, process 200 can present an indication that information related to the feature has been presented. For example, in some embodiments, process 200 can present a message on a display associated with the vehicle (e.g., a dashboard display, and/or any other suitable display) indicating that all information related to the feature has been presented. As another example, in some embodiments, process 200 can present a spoken message, using speakers associated with the vehicle, indicating that all information related to the feature has been presented. Additionally, in some embodiments, process 200 can present information indicating how to activate a restricted mode associated with the feature again (e.g., to be presented with user interfaces and/or information while the vehicle is in motion and while the feature is in use, as described above in connection with block 212).

At 216, process 200 can store an indication that the driver has been presented with all available information (or more than a particular percentage of information) related to the autonomous or semi-autonomous feature of the vehicle. For example, in some embodiments, process 200 can transmit a message to a server (e.g., server 302 as shown in and described below in connection with FIG. 3) that includes an indication of an identity of the driver and an indication of the feature, and the server can store the indication that the driver has been presented with information related to the feature in a database stored on the server. In some embodiments, the indication that the driver has been presented with information related to the feature can be used to allow the driver to use the feature while not in a restricted mode (that is, while in an operational mode), as described above in more detail in connection with FIG. 1.
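
A sketch of block 216's round trip to server 302, assuming a plain JSON-over-HTTPS endpoint; the URL and payload schema are invented, since the patent specifies neither a transport nor a format:

```python
import json
import urllib.request

def record_training_complete(driver_id: str, feature: str,
                             url: str = "https://example.invalid/training") -> None:
    """POST an indication that the driver has seen all feature information."""
    payload = json.dumps({"driver_id": driver_id, "feature": feature}).encode()
    request = urllib.request.Request(
        url, data=payload,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(request) as response:
        response.read()  # the server stores the (driver, feature) record
```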

Turning to FIG. 3, an example 300 of hardware for controlling access to vehicle features that can be used in accordance with some embodiments of the disclosed subject matter is shown. As illustrated, hardware 300 can include a server 302, a communication network 304, and/or one or more vehicle computers 306, such as vehicle computers 308 and 310.

Server(s) 302 can be any suitable server(s) for storing any suitable data, programs, and/or any other suitable information. For example, in some embodiments, server(s) 302 can store indications of drivers who are qualified to activate particular autonomous or semi-autonomous features of vehicles. As a more particular example, in some embodiments, server(s) 302 can store a database that indicates users that have previously driven a particular model or type of a vehicle. As another more particular example, in some embodiments, server(s) 302 can store a database that indicates users that have previously been trained on particular features of a particular model or type of a vehicle. As another example, in some embodiments, server(s) 302 can store information used to present information related to a particular autonomous or semi-autonomous feature. As a more particular example, in some embodiments, server(s) 302 can store user interfaces used for presenting information related to a particular feature, and can transmit instructions to present the user interfaces to one or more vehicle computers.

Communication network 304 can be any suitable combination of one or more wired and/or wireless networks in some embodiments. For example, communication network 304 can include any one or more of the Internet, an intranet, a wide-area network (WAN), a local-area network (LAN), a wireless network, a digital subscriber line (DSL) network, a frame relay network, an asynchronous transfer mode (ATM) network, a virtual private network (VPN), and/or any other suitable communication network. Vehicle computers 306 can be connected by one or more communications links (e.g., communications links 312) to communication network 304 that can be linked via one or more communications links (e.g., communications links 314) to server(s) 302. The communications links can be any communications links suitable for communicating data among vehicle computers 306 and server(s) 302 such as network links, dial-up links, wireless links, hard-wired links, any other suitable communications links, or any suitable combination of such links.

Vehicle computers 306 can include any one or more computing devices operating on a vehicle, such as a car, truck, etc. In some embodiments, vehicle computers 306 can perform any suitable functions, such as the functions described above in connection with FIGS. 1 and 2. For example, in some embodiments, vehicle computers 306 can determine identifying information of a driver of the vehicle, inhibit particular features of the vehicle based on the identifying information of the driver, present user interfaces related to a particular feature of the vehicle, and/or perform any other suitable functions.

Although server(s) 302 is illustrated as one device, the functions performed by server(s) 302 can be performed using any suitable number of devices in some embodiments. For example, in some embodiments, multiple devices can be used to implement the functions performed by server(s) 302.

Although two vehicle computers 308 and 310 are shown in FIG. 3 to avoid over-complicating the figure, any suitable number of vehicle computers, and/or any suitable types of vehicle computers, can be used in some embodiments.

Server(s) 302 and vehicle computers 306 can be implemented using any suitable hardware in some embodiments. For example, in some embodiments, devices 302 and 306 can be implemented using any suitable general purpose computer or special purpose computer. For example, a vehicle computer may be implemented using a special purpose computer. Any such general purpose computer or special purpose computer can include any suitable hardware. For example, as illustrated in example hardware 400 of FIG. 4, such hardware can include hardware processor 402, memory and/or storage 404, an input device controller 406, an input device 408, display/audio drivers 410, display and audio output circuitry 412, communication interface(s) 414, an antenna 416, and a bus 418.

Hardware processor 402 can include any suitable hardware processor, such as a microprocessor, a micro-controller, digital signal processor(s), dedicated logic, and/or any other suitable circuitry for controlling the functioning of a general purpose computer or a special purpose computer in some embodiments. In some embodiments, hardware processor 402 can be controlled by a server program stored in memory and/or storage of a server, such as server(s) 302. For example, in some embodiments, the server program can cause hardware processor 402 to determine whether a particular driver is qualified to activate a particular autonomous or semi-autonomous feature, transmit instructions to inhibit a feature in response to determining that a driver is not qualified to activate the feature, and/or perform any other suitable functions. In some embodiments, hardware processor 402 can be controlled by a computer program stored in memory and/or storage 404 of vehicle computer 306. For example, the computer program can cause hardware processor 402 to inhibit a particular feature of a vehicle if a driver is determined to be not qualified to activate the feature, present user interfaces related to a feature, and/or perform any other suitable functions.

Memory and/or storage 404 can be any suitable memory and/or storage for storing programs, data, and/or any other suitable information in some embodiments. For example, memory and/or storage 404 can include random access memory, read-only memory, flash memory, hard disk storage, optical media, and/or any other suitable memory.

Input device controller 406 can be any suitable circuitry for controlling and receiving input from one or more input devices 408 in some embodiments. For example, input device controller 406 can be circuitry for receiving input from a touchscreen, from a keyboard, from one or more buttons, from a voice recognition circuit, from a microphone, from a camera, from an optical sensor, from an accelerometer, from a temperature sensor, from a near field sensor, from a pressure sensor, from an encoder, and/or any other type of input device.

Display/audio drivers 410 can be any suitable circuitry for controlling and driving output to one or more display/audio output devices 412 in some embodiments. For example, display/audio drivers 410 can be circuitry for driving a touchscreen, a flat-panel display, a cathode ray tube display, a projector, a speaker or speakers, and/or any other suitable display and/or presentation devices.

Communication interface(s) 414 can be any suitable circuitry for interfacing with one or more communication networks (e.g., communication network 304). For example, interface(s) 414 can include network interface card circuitry, wireless communication circuitry, and/or any other suitable type of communication network circuitry.

Antenna 416 can be any suitable one or more antennas for wirelessly communicating with a communication network (e.g., communication network 304) in some embodiments. In some embodiments, antenna 416 can be omitted.

Bus 418 can be any suitable mechanism for communicating between two or more components 402, 404, 406, 410, and 414 in some embodiments.

Any other suitable components can be included in hardware 400 in accordance with some embodiments.

In some embodiments, at least some of the above described blocks of the processes of FIGS. 1 and 2 can be executed or performed in any order or sequence not limited to the order and sequence shown in and described in connection with the figures. Also, some of the above blocks of FIGS. 1 and 2 can be executed or performed substantially simultaneously where appropriate or in parallel to reduce latency and processing times. Additionally or alternatively, some of the above described blocks of the processes of FIGS. 1 and 2 can be omitted.

In some embodiments, any suitable computer readable media can be used for storing instructions for performing the functions and/or processes herein. For example, in some embodiments, computer readable media can be transitory or non-transitory. For example, non-transitory computer readable media can include media such as non-transitory forms of magnetic media (such as hard disks, floppy disks, and/or any other suitable magnetic media), non-transitory forms of optical media (such as compact discs, digital video discs, Blu-ray discs, and/or any other suitable optical media), non-transitory forms of semiconductor media (such as flash memory, electrically programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and/or any other suitable semiconductor media), any suitable media that is not fleeting or devoid of any semblance of permanence during transmission, and/or any suitable tangible media. As another example, transitory computer readable media can include signals on networks, in wires, conductors, optical fibers, circuits, any suitable media that is fleeting and devoid of any semblance of permanence during transmission, and/or any suitable intangible media.
Yeah, that info pertains to a patent filed a while back.

I've been eagerly awaiting audio system details, and it looks like Rivian Elevation might just be their branding for an audio system (my guess is the premium option). I wasn't able to find any additional details...
 

EyeOnRivian

Well-Known Member
Joined
Feb 14, 2019
Threads
40
Messages
491
Reaction score
435
Location
Chicagoland
Vehicles
Mach-E 4X, Mitsubishi Endeavor. Pre-order: R1S LE
Serial Number: 88801980 - "Providing purchase advisory and consultancy services to consumers for the purchase of land vehicles; retail store services ..."

This looks vaguely intriguing, to the point where I wouldn't mind reading more about it, but I'm not having any luck with my internet searches.
 


skyote

Well-Known Member
Joined
Mar 12, 2019
Threads
55
Messages
2,725
Reaction score
5,647
Location
Austin, TX
Vehicles
Jeeps, 2500HD Duramax, R1S Preorder (Dec 2018)
All are interesting.

The ones that caught my attention as a driver were front dig mode & ventilation. I'm very curious how well the ventilation will work with the unique design...I'm the guy that likes the AC blowing directly on me here in TX during the heat of the summer.
 


OP
OP
kanundrum

kanundrum

Well-Known Member
Joined
May 2, 2020
Threads
217
Messages
3,976
Reaction score
12,107
Location
Washington, DC
Vehicles
Giulia QV, R1S (S00N)
Occupation
IT
Clubs
 
BATTERY MODULE GAS SENSOR FOR BATTERY CELL MONITORING

Battery monitoring systems and methods are provided. The battery monitoring system may include a battery module and battery management circuitry. The battery module comprises battery cells and a gas sensor configured to detect the presence of gas within the battery module. The battery management circuitry is configured to receive a sensor signal from the gas sensor, determine whether the sensor signal indicates the presence of gas within the battery module, and in response to determining that the sensor signal indicates the presence of gas, take an action. The action may include increasing cooling to the battery cells, limiting a maximum load that can be applied to the battery module, disconnecting the battery module, or providing a warning. The battery module may also include a component that was doped with a chemical that begins to off-gas above an activation temperature. The gas sensor may be configured to detect the chemical.
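
As a rough sketch of the response logic this abstract describes, with the action hooks stubbed out; which actions fire, and in what order, is an assumption on my part:

```python
class BatteryModule:
    """Stubbed actions standing in for real battery-management hardware."""
    def increase_cooling(self):  print("coolant flow to cells increased")
    def limit_max_load(self):    print("maximum load on module capped")
    def disconnect(self):        print("battery module disconnected")
    def warn(self):              print("warning raised to the driver")

def on_gas_sensor_signal(gas_detected: bool, severe: bool,
                         module: BatteryModule) -> None:
    """Take the abstract's example actions when gas is detected."""
    if not gas_detected:
        return
    module.increase_cooling()
    module.limit_max_load()
    module.warn()
    if severe:
        module.disconnect()  # last resort per the abstract's action list

on_gas_sensor_signal(gas_detected=True, severe=False, module=BatteryModule())
```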


Rivian R1T R1S Cool Rivian Patent/TradeMark Info 1599678646568
 
OP
OP
kanundrum

kanundrum

Well-Known Member
Joined
May 2, 2020
Threads
217
Messages
3,976
Reaction score
12,107
Location
Washington, DC
Vehicles
Giulia QV, R1S (S00N)
Occupation
IT
Clubs
 
https://patentscope.wipo.int/search...20&tab=NATIONALBIBLIO&_cid=P21-KHW65G-65981-1

Rivian R1T R1S Cool Rivian Patent/TradeMark Info 1606233969904


In some embodiments, the present disclosure is directed to a kitchen module for a vehicle. The kitchen module includes a frame system and a plurality of kitchen components. The frame system provides structural support. In some embodiments, a rail system is affixed to the frame system and the vehicle, and is configured to allow the frame system to move relative to the vehicle. The plurality of kitchen components are mounted to the frame system. For example, the plurality of kitchen components may be any of a sink, a potable water tank, a rangetop, at least one drawer, a countertop, any other suitable components, or any combination thereof. For example, in some embodiments, the sink is coupled by a plumbing system to a potable water tank. In a further example, in some embodiments, the rangetop is an induction rangetop. In a further example, in some embodiments, the at least one drawer includes an end drawer arranged between rails of the rail system. In a further example, in some embodiments, the countertop includes at least one section that is removable. In a further example, in some embodiments, the countertop includes at least two parts coupled by a hinge that can rotate relative to one another. In a further example, in some embodiments, the at least one drawer includes a recess to accommodate another component. In a further example, in some embodiments, the rail system includes rail members arranged to slide relative to each other, thus allowing axial motion of the frame system relative to the vehicle. In some embodiments, the kitchen module has a tapered or slanted cross-section, when collapsed or stowed, to fit between a rear seat and cargo compartment or bed.

In some embodiments, the present disclosure is directed to a vehicle having a storage compartment and a kitchen module. The kitchen module includes a frame system for providing structural support, a rail system affixed to the frame system and the vehicle, and a plurality of kitchen components mounted to the frame system. The rail system is configured to allow the frame system to move relative to the vehicle. The rail system is affixed to a surface of the storage compartment, and the kitchen module is arranged to be extended from and retracted into the storage compartment.

In some embodiments, the vehicle includes an electrical extension connecting an electric power source of the vehicle to the kitchen module. In some embodiments, the vehicle is an electric vehicle and the electric power source of the vehicle includes a battery module that also provides power to an electric drivetrain of the vehicle. In some embodiments, the electrical extension is coupled to at least one of an actuator of the rail system and a rangetop.

In some embodiments, the vehicle includes an air compression extension connecting an air compressor system of the vehicle to the kitchen module.

In some embodiments, the vehicle includes an outer panel that is arranged to be part of the vehicle exterior when the kitchen module is retracted into the storage compartment.

In some embodiments, the vehicle includes an occupant compartment and a cargo bed. In some such embodiments, the storage compartment is arranged between the occupant compartment and the cargo bed.

In some embodiments, the kitchen module includes at least one of a sink, a potable water tank, a rangetop, at least one drawer, or a countertop. In some embodiments, the kitchen module includes a sink and a potable water tank, and the sink is coupled by a plumbing system to the potable water tank. In some embodiments, the kitchen module includes a countertop that folds out to form a horizontal surface. In some embodiments, the kitchen module includes an exterior body surface comprising an opening, wherein the storage compartment is within the opening.

In some embodiments, the present disclosure is directed to a modular kitchen system for a vehicle. The modular kitchen system may include two or more submodules that are usable together to form the modular kitchen system. For example, the submodules may include one or more of a sink submodule, a rangetop submodule, and a cooler or refrigerator submodule. In some embodiments, the modular kitchen system comprises a shuttle system, on which the submodules can be mounted and secured.

In some embodiments, the present disclosure is directed to a shuttle system that can be extended from either side of a vehicle. In some embodiments, the shuttle system can be extended from a lateral storage compartment that includes covers on both sides of the vehicle. In some embodiments, the shuttle system may include a release mechanism and handle on each side such that it may be pulled and retracted from either side. In some embodiments, the shuttle system comprises a two-way rail system that enables the shuttle system to extend out of both sides of the vehicle.
 
OP
OP
kanundrum

kanundrum

Well-Known Member
Joined
May 2, 2020
Threads
217
Messages
3,976
Reaction score
12,107
Location
Washington, DC
Vehicles
Giulia QV, R1S (S00N)
Occupation
IT
Clubs
 
Rivian R1T R1S Cool Rivian Patent/TradeMark Info 1606234100310


A configurable battery system may be arranged in such a way that two battery modules are connected in parallel to achieve a target maximum voltage for a load, or in series to achieve a high voltage of about double the target maximum voltage. Fast charging, at high voltage, may allow both battery modules to be charged at a charging current near a desired maximum current at the battery charger. A battery management module determines a switch configuration, coupling the battery modules in series or parallel. The battery management module applies the switch configuration to one or more switches to manage charging of the battery modules. The battery management module may receive charger capability information, local charging information, and fault information to aid in determining a switch configuration.
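
A sketch of the series/parallel decision the abstract describes, assuming a 400 V target module voltage so the "about double" fast-charge mode lands near 800 V; all numbers are illustrative, not from the filing:

```python
def choose_switch_configuration(charger_max_voltage: float,
                                target_voltage: float = 400.0,
                                fault_detected: bool = False) -> str:
    """Pick series (double voltage) for fast charging when the charger and
    fault state allow it; otherwise fall back to parallel."""
    if fault_detected:
        return "parallel"  # fault information forces the safer arrangement
    if charger_max_voltage >= 2 * target_voltage:
        return "series"    # charge both modules near the charger's max current
    return "parallel"      # matches the load's target maximum voltage

print(choose_switch_configuration(800.0))  # -> series
print(choose_switch_configuration(500.0))  # -> parallel
```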

https://patentscope.wipo.int/search/en/detail.jsf?docId=CN300661778&_cid=P21-KHW65G-65981-1