The AHA Model  Revision: 12809
Reference implementation 04 (HEDG02_04)
the_neurobio Module Reference

Definition of the decision-making and behavioural architecture. More...

Data Types

type  percept_food
 This type defines how the agent perceives food items. The food perception object the_neurobio::percept_food is basically an array of food objects within the visual range of the agent plus the distances to the agent. This is the "objective" perception container, reflecting the "real world". A perception error is introduced when the perception object is analysed by the agent's neurobiological system. More...
 
type  spatial_percept_component
 This type defines a single spatial perception component, i.e. some single elementary spatial object that can be perceived by the agent from a big array of objects of the same type which are available in the agent's environment. Different kinds of perception objects (e.g. conspecifics, predators etc.) can be produced by extending this basic type. More...
 
type  conspec_percept_comp
 This type defines a single conspecific perception component. It is required for the the_neurobio::percept_conspecifics type that defines the whole conspecifics perception object (array of conspecifics). More...
 
type  percept_conspecifics
 This type defines how the agent perceives conspecifics. More...
 
type  spatialobj_percept_comp
 This type defines a single arbitrary spatial object perception component. For example, a predator perception object is then an array of such spatial object perception components. More...
 
type  percept_predator
 This type defines how the agent perceives a predator. More...
 
type  percept_stomach
 This type defines how the agent perceives its own stomach capacity. More...
 
type  percept_body_mass
 This type defines how the agent perceives its own body mass; this can be important for state-dependency. More...
 
type  percept_energy
 This type defines how the agent perceives its own energy reserves; this can be important for state-dependency. More...
 
type  percept_age
 This type defines how the agent perceives its own age in terms of the model discrete time step. More...
 
type  percept_reprfact
 Perception of the reproductive factor; the reproductive factor depends on the sex hormones differently in males and females. More...
 
type  percept_light
 Perception of the ambient illumination. This is a very simple perception component, singular and static. More...
 
type  percept_depth
 Perception of the current depth horizon. More...
 
type  memory_perceptual
 Individual perception memory (history) stack, a memory component that saves perception values at previous time steps of the model. For simplicity, whole perception objects are not saved, only the most important parameters of integer and real types, so that commondata::add_to_history() can be used in unmodified form. Decision making can make use of this memory stack. More...
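The fixed-depth history behaviour described above can be sketched as follows. This is an illustrative analogue only; the names and the memory depth are assumptions, not taken from the model code.

```python
from collections import deque

HISTORY_SIZE = 10  # assumed memory depth; the model defines its own parameter

def add_to_history(history, value):
    """Append the newest value; once full, the oldest entry is discarded
    (an analogue of commondata::add_to_history())."""
    history.append(value)

# One scalar memory stack per saved parameter, e.g. food items seen:
memory_food_n = deque(maxlen=HISTORY_SIZE)
for n_food_seen in [3, 5, 2, 4]:
    add_to_history(memory_food_n, n_food_seen)
```

Because only scalar parameters are stored, each perception component contributes one such stack rather than a copy of the whole perception object.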
 
type  perception
 The perception architecture of the agent. See "The perception mechanism" for a general overview. At this level, lower order perception objects are combined into the the_neurobio::perception class hierarchy level of the agent. The object-bound functions see_ and feel_ obtain (set) the specific perception objects from the external or internal environments of the agent and put them into the the_neurobio::perception data structure. Also, the memory component is updated with the perception data. Perception objects can then be used as input into the individual decision-making procedures. More...
 
type  percept_components_motiv
 Perceptual components of motivational states. Plugged into all STATE_, attention etc. These components are linked to specific inner or outer perception objects (stimuli). Their sum results in the overall value of the motivation component. More...
 
type  state_motivation_base
 These types describe the neurobiological states of the agent. (1) Each state may have several components that are related to specific inner or outer perception objects (stimuli). (2) There is also a motivation component that describes the global motivation value for this state. More...
 
interface  motivation_init_root
 Abstract interface for the deferred init function clean_init that has to be overridden by each object that extends the basic motivational state type. More...
 
type  state_hunger
 The motivational state of hunger. Evokes food seeking, eating, higher activity, emigrating and habitat switching. More...
 
type  state_fear_defence
 The state of fear. Evokes active escape, fleeing, emigration and habitat switch. More...
 
type  state_reproduce
 The state of reproduction. Evokes seeking conspecifics and mating during the reproductive phase. More...
 
type  motivation
 Motivation is a collection of all internal motivational states of the agent. This type is also used in defining Expectancies of motivations. More...
 
type  memory_emotional
 Individual motivation/emotion memory stack, a memory component that saves the values of the final motivations at previous time steps of the model. For simplicity, whole state (STATE_) objects are not saved. add_to_history is used in unmodified form. Decision making can make use of this emotional memory stack. More...
 
type  appraisal
 The appraisal level. At this level, perception objects are fed into the commondata::gamma2gene() sigmoid function and the neuronal responses are obtained at the output. Neuronal responses for different perception objects are then summed up and the primary motivation values are obtained. Following this, modulation alters some of the primary motivation values, resulting in the final motivation values. See "From perception to GOS" for an overview. More...
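The perception-to-motivation pipeline described above can be sketched as below. This is a minimal illustration only: the actual commondata::gamma2gene() sigmoid is genetically parameterised, so the functional form and the coefficients here are assumptions.

```python
def neuronal_response(perception, half_max=0.5, steepness=2.0):
    """Generic sigmoid mapping a perception signal to a neuronal response."""
    return perception**steepness / (perception**steepness + half_max**steepness)

# Each motivational state sums the neuronal responses to its perception
# components, giving the primary motivation value ...
perceptions = {"stomach": 0.8, "food_mem": 0.3, "bodymass": 0.6}
primary = sum(neuronal_response(p) for p in perceptions.values())

# ... which modulation then transforms into the final motivation value;
# with modulation absent, the two coincide.
final = primary
```

The sigmoid shape means weak stimuli contribute little while strong stimuli saturate, which keeps any single perception component from dominating the sum unboundedly.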
 
type  gos_global
 Global organismic state (GOS) level. GOS is defined by the dominant motivational state component (STATE_), namely, by the logical flag %dominant_state. If this logical flag is TRUE for a particular motivational state component, this state is the GOS. Thus, there is no separate data component (e.g. "value") for GOS. The values the_neurobio::gos_global::gos_main and the_neurobio::gos_global::gos_arousal can be inferred from the motivations and are duplicated here mainly for convenience. See "From perception to GOS" for an overview. More...
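The dominant-state rule can be illustrated with a short sketch; the state names follow the types in this module, but the motivation values are made up.

```python
final_motivations = {"hunger": 0.42, "fear_defence": 0.77, "reproduce": 0.15}

# The state with the highest final motivation becomes the GOS; only its
# %dominant_state flag is set, and gos_main / gos_arousal follow from it.
gos_main = max(final_motivations, key=final_motivations.get)
gos_arousal = final_motivations[gos_main]
dominant_state = {state: state == gos_main for state in final_motivations}
```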
 

Functions/Subroutines

elemental subroutine percept_food_create_init (this, maximum_number_food_items)
 Initiate an empty food perception object with known number of components. More...
 
subroutine percept_food_number_seen (this, number_set)
 Set the total number of food items perceived (seen) in the food perception object. Do not reallocate the perception object components with respect to this new number yet. More...
 
subroutine percept_food_make_fill_arrays (this, items, dist)
 Make the food perception object, fill it with the actual data arrays. More...
 
elemental integer function percept_food_get_count_found (this)
 Get the number (count) of food items seen. Trivial. More...
 
elemental real(srp) function percept_food_get_meansize_found (this)
 Get the average size of food items seen. Trivial. More...
 
elemental real(srp) function percept_food_get_meanmass_found (this)
 Get the average mass of food items seen. Trivial. More...
 
elemental real(srp) function percept_food_get_meandist_found (this)
 Get the average distance to the food items seen. Trivial. More...
 
elemental subroutine percept_food_destroy_deallocate (this)
 Deallocate and delete a food perception object. More...
 
subroutine food_perception_get_visrange_objects (this, food_resource_available, time_step_model)
 Get available food items within the visual range of the agent, which the agent can perceive and therefore respond to. Food perception is packaged into the food perception object this%perceive_food for output. More...
 
elemental logical function food_perception_is_seeing_food (this)
 Check if the agent sees any food items within its visual range. More...
 
real(srp) function food_perception_probability_capture_memory_object (this, last, time_step_model)
 Calculate the probability of capture of a subjective representation of a food item based on the data from the perceptual memory stack. More...
 
elemental subroutine percept_stomach_create_init (this)
 Initiate an empty stomach capacity perception object. More...
 
elemental real(srp) function percept_stomach_get_avail_capacity (this)
 Get the current value of the available stomach volume. More...
 
subroutine percept_stomach_update_avail_capacity (this, current_volume)
 Set and update the current value of the available stomach volume. More...
 
elemental subroutine percept_stomach_destroy_deallocate (this)
 Destroy the stomach perception object and deallocate it. More...
 
elemental subroutine percept_bodymass_create_init (this)
 Initiate an empty body mass perception object. More...
 
elemental real(srp) function percept_bodymass_get_current (this)
 Get the current value of the body mass perception. More...
 
subroutine percept_bodymass_update_current (this, current)
 Set and update the current body mass perception value. More...
 
elemental subroutine percept_bodymass_destroy_deallocate (this)
 Destroy the body mass perception object and deallocate. More...
 
elemental subroutine percept_energy_create_init (this)
 Initiate an empty energy perception object. More...
 
elemental real(srp) function percept_energy_get_current (this)
 Get the current value of the energy reserves. More...
 
subroutine percept_energy_update_current (this, current)
 Set and update the current energy perception value. More...
 
elemental subroutine percept_energy_destroy_deallocate (this)
 Destroy the energy perception object and deallocate. More...
 
elemental subroutine percept_age_create_init (this)
 Initiate an empty age perception object. More...
 
elemental integer function percept_age_get_current (this)
 Get the current value of the age perception. More...
 
subroutine percept_age_update_current (this, current)
 Set and update the current age perception value. More...
 
elemental subroutine percept_age_destroy_deallocate (this)
 Destroy the age perception object and deallocate it. More...
 
subroutine spatial_percept_set_cid (this, id)
 Set unique id for the conspecific perception component. More...
 
elemental integer function spatial_percept_get_cid (this)
 Get the unique id of the spatial perception component. More...
 
elemental subroutine consp_percept_comp_create (this)
 Create a single conspecific perception component at an undefined position with default properties. More...
 
subroutine consp_percept_make (this, location, size, mass, dist, cid, is_male)
 Make a single conspecific perception component. This is a single conspecific located within the visual range of the agent. More...
 
elemental real(srp) function consp_percept_get_size (this)
 Get the conspecific perception component body size. More...
 
elemental real(srp) function consp_percept_get_mass (this)
 Get the conspecific perception component body mass. More...
 
elemental real(srp) function consp_percept_get_dist (this)
 Get the conspecific perception component distance. More...
 
elemental logical function consp_percept_sex_is_male_get (this)
 Get the conspecific perception component sex flag (male). More...
 
elemental logical function consp_percept_sex_is_female_get (this)
 Get the conspecific perception component sex flag (female). More...
 
elemental subroutine percept_consp_create_init (this, maximum_number_conspecifics)
 Create the conspecifics perception object; it is an array of conspecific perception components. More...
 
elemental subroutine percept_consp_number_seen (this, number_set)
 Set the total number of conspecifics perceived (seen) in the conspecific perception object, but do not reallocate the conspecific perception components yet. More...
 
pure subroutine percept_consp_make_fill_arrays (this, consps)
 Make the conspecifics perception object, fill it with the actual arrays. More...
 
elemental integer function percept_consp_get_count_seen (this)
 Get the number (count) of conspecifics seen. Trivial. More...
 
elemental subroutine percept_consp_destroy_deallocate (this)
 Deallocate and delete a conspecific perception object. More...
 
subroutine consp_perception_get_visrange_objects (this, consp_agents, time_step_model)
 Get available conspecific perception objects within the visual range of the agent, which the agent can perceive and therefore respond to. More...
 
elemental logical function consp_perception_is_seeing_conspecifics (this)
 Check if the agent sees any conspecifics within the visual range. More...
 
elemental subroutine spatialobj_percept_comp_create (this)
 Create a single arbitrary spatial object perception component at an undefined position with default properties. More...
 
subroutine spatialobj_percept_make (this, location, size, dist, cid)
 Make a single arbitrary spatial object perception component. More...
 
elemental real(srp) function spatialobj_percept_get_size (this)
 Get an arbitrary spatial object perception component size. More...
 
elemental real(srp) function spatialobj_percept_get_dist (this)
 Get the distance to an arbitrary spatial object perception component. More...
 
real(srp) function spatialobj_percept_visibility_visual_range (this, object_area, contrast, time_step_model)
 Calculate the visibility range of this spatial object. Wrapper to the visual_range function. This function calculates the distance from which this object can be seen by a visual object (e.g. predator or prey). More...
 
elemental subroutine percept_predator_create_init (this, maximum_number_predators)
 Create the predator perception object; it is an array of spatial perception components. More...
 
elemental subroutine percept_predator_number_seen (this, number_set)
 Set the total number of predators perceived (seen) in the predator perception object, but do not reallocate the predator perception components yet. More...
 
pure subroutine percept_predator_make_fill_arrays (this, preds, attack_rate)
 Make the predator perception object, fill it with the actual arrays. More...
 
pure subroutine percept_predator_set_attack_rate_vector (this, attack_rate)
 Set an array of the attack rates for the predator perception object. More...
 
pure subroutine percept_predator_set_attack_rate_scalar (this, attack_rate)
 Set the attack rates for the predator perception object from a scalar value. More...
 
elemental integer function percept_predator_get_count_seen (this)
 Get the number (count) of predators seen. Trivial. More...
 
elemental subroutine percept_predator_destroy_deallocate (this)
 Deallocate and delete a predator perception object. More...
 
subroutine predator_perception_get_visrange_objects (this, spatl_agents, time_step_model)
 Get available predators perception objects within the visual range of the agent, which the agent can perceive and therefore respond to. More...
 
elemental logical function predator_perception_is_seeing_predators (this)
 Check if the agent sees any predators within the visual range. More...
 
elemental subroutine percept_light_create_init (this)
 Make an empty light perception component. Really necessary only when perception objects are all allocatable. More...
 
elemental real(srp) function percept_light_get_current (this)
 Get the current perception of the illumination. More...
 
subroutine percept_light_set_current (this, timestep, depth)
 Set the current light level into the perception component. More...
 
elemental subroutine percept_light_destroy_deallocate (this)
 Destroy / deallocate light perception component. Really necessary only when perception objects are all allocatable. More...
 
subroutine light_perception_get_object (this, time_step_model)
 Get light perception objects into the individual PERCEPTION object layer. More...
 
elemental subroutine percept_depth_create_init (this)
 Make an empty depth perception component. Really necessary only when perception objects are all allocatable. More...
 
elemental real(srp) function percept_depth_get_current (this)
 Get the current perception of the depth. More...
 
subroutine percept_depth_set_current (this, cdepth)
 Set the current depth level into the perception component. More...
 
elemental subroutine percept_depth_destroy_deallocate (this)
 Destroy / deallocate depth perception component. Really necessary only when perception objects are all allocatable. More...
 
elemental subroutine percept_reprfac_create_init (this)
 Make an empty reproductive factor perception component. Really necessary only when perception objects are all allocatable. More...
 
elemental real(srp) function percept_reprfac_get_current (this)
 Get the current perception of the reproductive factor. More...
 
subroutine percept_reprfac_set_current (this, reprfac)
 Set the current reproductive factor level into the perception component. More...
 
elemental subroutine percept_reprfac_destroy_deallocate (this)
 Destroy / deallocate reproductive factor perception component. Really necessary only when perception objects are all allocatable. More...
 
subroutine depth_perception_get_object (this)
 Get depth perception objects into the individual PERCEPTION object layer. More...
 
subroutine stomach_perception_get_object (this)
 Get the stomach capacity perception objects into the individual PERCEPTION object layer. More...
 
subroutine bodymass_perception_get_object (this)
 Get the body mass perception objects into the individual PERCEPTION object layer. More...
 
subroutine energy_perception_get_object (this)
 Get the energy reserves perception objects into the individual PERCEPTION object layer. More...
 
subroutine age_perception_get_object (this)
 Get the age perception objects into the individual PERCEPTION object layer. More...
 
subroutine repfac_perception_get_object (this)
 Get the reproductive factor perception objects into the individual PERCEPTION object layer. More...
 
elemental subroutine percept_memory_add_to_stack (this, light, depth, food, foodsize, fooddist, consp, pred, stom, bdmass, energ, reprfac)
 Add perception components into the memory stack. More...
 
elemental subroutine percept_memory_cleanup_stack (this)
 Cleanup and destroy the perceptual memory stack. More...
 
elemental integer function percept_memory_food_get_total (this)
 Get the total number of food items within the whole perceptual memory stack. More...
 
elemental real(srp) function percept_memory_food_get_mean_n (this, last)
 Get the average number of food items per single time step within the whole perceptual memory stack. More...
 
elemental subroutine percept_memory_food_mean_n_split (this, window, split_val, older, newer)
 Get the average number of food items per single time step within the perceptual memory stack, split into the first (older) and second (newer) parts. The whole memory stack ('sample') is split by the split_val parameter and two means are calculated: before split_val and after it. More...
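The split-mean computation used by this family of subroutines is easy to sketch; the parameter names follow the dummy arguments above, while the data values are invented.

```python
def mean_split(sample, split_val):
    """Mean of the older part (before split_val) and of the newer part."""
    older, newer = sample[:split_val], sample[split_val:]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(older), mean(newer)

food_counts = [2, 4, 3, 7, 9]  # food items seen at successive time steps
older_mean, newer_mean = mean_split(food_counts, split_val=3)
# newer_mean exceeding older_mean would indicate an improving food
# environment within the memory window.
```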
 
elemental real(srp) function percept_memory_food_get_mean_size (this, last)
 Get the average size of food item per single time step within the whole perceptual memory stack. More...
 
elemental subroutine percept_memory_food_mean_size_split (this, window, split_val, older, newer)
 Get the average size of food items per single time step within the perceptual memory stack, split into the first (older) and second (newer) parts. The whole memory stack ('sample') is split by the split_val parameter and two means are calculated: before split_val and after it. More...
 
elemental real(srp) function percept_memory_food_get_mean_dist (this, last, undef_ret_null)
 Get the average distance to food item per single time step within the whole perceptual memory stack. More...
 
elemental subroutine percept_memory_food_mean_dist_split (this, window, split_val, older, newer)
 Get the average distance to food items per single time step within the perceptual memory stack, split into the first (older) and second (newer) parts. The whole memory stack ('sample') is split by the split_val parameter and two means are calculated: before split_val and after it. More...
 
elemental real(srp) function percept_memory_consp_get_mean_n (this, last)
 Get the average number of conspecifics per single time step within the whole perceptual memory stack. More...
 
elemental integer function percept_memory_predators_get_total (this)
 Get the total number of predators within the whole perceptual memory stack. More...
 
elemental real(srp) function percept_memory_predators_get_mean (this, last)
 Get the average number of predators per single time step within the whole perceptual memory stack. More...
 
elemental subroutine percept_memory_predators_mean_split (this, window, split_val, older, newer)
 Get the average number of predators per single time step within the perceptual memory stack, split into the first (older) and second (newer) parts. The whole memory stack ('sample') is split by the split_val parameter and two means are calculated: before split_val and after it. More...
 
elemental subroutine perception_objects_add_memory_stack (this)
 Add the various perception objects to the memory stack object. This procedure is called after all the perceptual components (light, depth, food, conspecifics, predators, etc.) are collected (using set object-bound subroutines) into the perception bundle, so all the values are known and ready to be used. More...
 
subroutine perception_objects_get_all_environmental (this)
 A single umbrella subroutine to get all environmental perceptions: light, depth. This procedure invokes these calls: More...
 
subroutine perception_objects_get_all_inner (this)
 A single umbrella subroutine wrapper to get all inner perceptions: stomach, body mass, energy, age. Invokes all these procedures: More...
 
elemental subroutine, private perception_objects_init_agent (this)
 Initialise all the perception objects for the current agent. Do not fill perception objects with the real data yet. More...
 
elemental subroutine perception_objects_destroy (this, clean_memory)
 Destroy and deallocate all perception objects. More...
 
elemental real(srp) function perception_predation_risk_objective (this)
 Calculate the risk of predation as being perceived / assessed by this agent. More...
 
elemental real(srp) function predation_risk_backend (pred_count, pred_memory_mean, weight_direct)
 Simple computational backend for the risk of predation that is used in objective risk function the_neurobio::perception_predation_risk_objective() and the subjective risk function. More...
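A minimal sketch of such a backend, assuming (as the weight_direct argument suggests) a linear blend of the directly perceived predator count and the memory-based mean; the actual formula is defined in the model code.

```python
def predation_risk(pred_count, pred_memory_mean, weight_direct):
    """Weighted combination of direct perception and perceptual memory."""
    return weight_direct * pred_count + (1.0 - weight_direct) * pred_memory_mean

# Direct perception dominates when weight_direct is high:
risk = predation_risk(pred_count=2, pred_memory_mean=0.5, weight_direct=0.8)
```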
 
elemental subroutine perception_components_attention_weights_init (this, all_vals_fix, all_one, weight_light, weight_depth, weight_food_dir, weight_food_mem, weight_conspec, weight_pred_dir, weight_predator, weight_stomach, weight_bodymass, weight_energy, weight_age, weight_reprfac)
 Initialise the attention components of the emotional state to their default parameter values. Attention sets weights to individual perceptual components when the overall weighted sum is calculated. The default weights are parameters defined in COMMONDATA. More...
 
subroutine perception_components_neuronal_response_init_set (this, this_agent, param_gp_matrix_light, param_gp_matrix_depth, param_gp_matrix_food_dir, param_gp_matrix_food_mem, param_gp_matrix_conspec, param_gp_matrix_pred_dir, param_gp_matrix_predator, param_gp_matrix_stomach, param_gp_matrix_bodymass, param_gp_matrix_energy, param_gp_matrix_age, param_gp_matrix_reprfac, param_gerror_cv_light, param_gerror_cv_depth, param_gerror_cv_food_dir, param_gerror_cv_food_mem, param_gerror_cv_conspec, param_gerror_cv_pred_dir, param_gerror_cv_predator, param_gerror_cv_stomach, param_gerror_cv_bodymass, param_gerror_cv_energy, param_gerror_cv_age, param_gerror_cv_reprfac, param_gene_label_light, param_gene_label_depth, param_gene_label_food_dir, param_gene_label_food_mem, param_gene_label_conspec, param_gene_label_pred_dir, param_gene_label_predator, param_gene_label_stomach, param_gene_label_bodymass, param_gene_label_energy, param_gene_label_age, param_gene_label_reprfac)
 Set and calculate individual perceptual components for this motivational state using the neuronal response function, for this_agent. More...
 
subroutine perception_components_neuronal_response_calculate (this, this_agent, param_gp_matrix_light, param_gp_matrix_depth, param_gp_matrix_food_dir, param_gp_matrix_food_mem, param_gp_matrix_conspec, param_gp_matrix_pred_dir, param_gp_matrix_predator, param_gp_matrix_stomach, param_gp_matrix_bodymass, param_gp_matrix_energy, param_gp_matrix_age, param_gp_matrix_reprfac, param_gerror_cv_light, param_gerror_cv_depth, param_gerror_cv_food_dir, param_gerror_cv_food_mem, param_gerror_cv_conspec, param_gerror_cv_pred_dir, param_gerror_cv_predator, param_gerror_cv_stomach, param_gerror_cv_bodymass, param_gerror_cv_energy, param_gerror_cv_age, param_gerror_cv_reprfac, perception_override_light, perception_override_depth, perception_override_food_dir, perception_override_food_mem, perception_override_conspec, perception_override_pred_dir, perception_override_predator, perception_override_stomach, perception_override_bodymass, perception_override_energy, perception_override_age, perception_override_reprfac)
 Calculate individual perceptual components for this motivational state using the neuronal response function, for this_agent. More...
 
elemental real(srp) function state_motivation_light_get (this)
 Standard "get" function for the state neuronal light effect component. More...
 
elemental real(srp) function state_motivation_depth_get (this)
 Standard "get" function for the state neuronal depth effect component. More...
 
elemental real(srp) function state_motivation_food_dir_get (this)
 Standard "get" function for the state neuronal directly seen food effect component. More...
 
elemental real(srp) function state_motivation_food_mem_get (this)
 Standard "get" function for the state neuronal food items from past memory effect component. More...
 
elemental real(srp) function state_motivation_conspec_get (this)
 Standard "get" function for the state neuronal conspecifics effect component. More...
 
elemental real(srp) function state_motivation_pred_dir_get (this)
 Standard "get" function for the state neuronal direct predation effect component. More...
 
elemental real(srp) function state_motivation_predator_get (this)
 Standard "get" function for the state neuronal predators effect component. More...
 
elemental real(srp) function state_motivation_stomach_get (this)
 Standard "get" function for the state neuronal stomach effect component. More...
 
elemental real(srp) function state_motivation_bodymass_get (this)
 Standard "get" function for the state neuronal body mass effect component. More...
 
elemental real(srp) function state_motivation_energy_get (this)
 Standard "get" function for the state neuronal energy reserves effect component. More...
 
elemental real(srp) function state_motivation_age_get (this)
 Standard "get" function for the state neuronal age effect component. More...
 
elemental real(srp) function state_motivation_reprfac_get (this)
 Standard "get" function for the state neuronal reproductive factor effect component. More...
 
elemental real(srp) function state_motivation_motivation_prim_get (this)
 Standard "get" function for the root state, get the overall primary motivation value (before modulation). More...
 
elemental real(srp) function state_motivation_motivation_get (this)
 Standard "get" function for the root state, get the overall final motivation value (after modulation). More...
 
elemental logical function state_motivation_is_dominant_get (this)
 Check if the root state is the dominant state in GOS. More...
 
elemental character(len=label_length) function state_motivation_fixed_label_get (this)
 Get the fixed label for this motivational state. Note that the label is fixed and cannot be changed. More...
 
pure subroutine state_motivation_attention_weights_transfer (this, copy_from)
 Transfer attention weights between two motivation state components. The main use of this subroutine would be to transfer attention from the actor agent's main motivation's attention component to the behaviour's GOS expectancy object. More...
 
elemental real(srp) function perception_component_maxval (this)
 Calculate the maximum value over all the perceptual components. More...
 
elemental real(srp) function state_motivation_percept_maxval (this)
 Calculate the maximum value over all the perceptual components of this motivational state component. More...
 
elemental real(srp) function state_motivation_calculate_prim (this, maxvalue)
 Calculate the level of primary motivation for this specific emotional state component. More...
 
elemental subroutine perception_component_motivation_init_zero (this)
 Initialise perception components for a motivation state object. More...
 
elemental subroutine state_hunger_zero (this)
 Init and cleanup hunger motivation object. The only difference from the base root STATE_MOTIVATION_BASE is that it sets unique label. More...
 
elemental subroutine state_fear_defence_zero (this)
 Init and cleanup fear state motivation object. The only difference from the base root STATE_MOTIVATION_BASE is that it sets unique label. More...
 
elemental subroutine state_reproduce_zero (this)
 Init and cleanup reproductive motivation object. The only difference from the base root STATE_MOTIVATION_BASE is that it sets unique label. More...
 
elemental subroutine motivation_init_all_zero (this)
 Init the expectancy components to a zero state. More...
 
elemental subroutine motivation_reset_gos_indicators (this)
 Reset all GOS indicators for this motivation object. More...
 
elemental real(srp) function motivation_max_perception_calc (this)
 Calculate maximum value of the perception components across all motivations. More...
 
pure real(srp) function, dimension(:), allocatable motivation_return_final_as_vector (this)
 Return the vector of final motivation values for all motivational state components. More...
 
elemental real(srp) function motivation_maximum_value_motivation_finl (this)
 Calculate the maximum value of the final motivations across all motivational state components. More...
 
elemental logical function motivation_val_is_maximum_value_motivation_finl (this, test_value)
 Checks if the test value is the maximum final motivation value across all motivational state components. More...
 
elemental logical function motivation_val_is_maximum_value_motivation_finl_o (this, test_motivation)
 Checks if the test value is the maximum final motivation value across all motivational state components. More...
 
elemental subroutine motivation_primary_sum_components (this, max_val)
 Calculate the primary motivations from motivation-specific perception appraisal components. The primary motivations are motivation values before the modulation takes place. More...
 
elemental subroutine motivation_modulation_absent (this)
 Produce modulation of the primary motivations that results in the final motivation values (_finl). In this subroutine, modulation is absent, so the final motivation values are equal to the primary motivations. More...
 
elemental subroutine, private appraisal_init_zero_cleanup_all (this)
 Initialise and cleanup all appraisal object components and sub-objects. More...
 
elemental subroutine appraisal_agent_set_dead (this)
 Set the individual to be dead. Note that this function does not deallocate the individual agent object, this may be a separate destructor function. More...
 
subroutine appraisal_perceptual_comps_motiv_neur_response_calculate (this)
 Get the perceptual components of all motivational states by passing perceptions via the neuronal response function. More...
 
elemental subroutine appraisal_primary_motivations_calculate (this, rescale_max_motivation)
 Calculate primary motivations from perceptual components of each motivation state. More...
 
subroutine appraisal_motivation_modulation_non_genetic (this, no_modulation)
 Produce modulation of the primary motivations that results in the final motivation values (_finl). Modulation here is non-genetic and involves a fixed transformation of the primary motivation values. More...
 
subroutine appraisal_motivation_modulation_genetic (this, no_genetic_modulation)
 Produce modulation of the primary motivations that results in the final motivation values (_finl). Modulation involves effects of such characteristics of the agent as body mass and age on the primary motivations (hunger, active and passive avoidance and reproduction), mediated by the genome effects. Here the genome determines the coefficients that set the degree of the influence of the agent's characteristics on the motivations. More...
 
elemental subroutine appraisal_add_final_motivations_memory (this)
 Add individual final emotional state components into the emotional memory stack. This is a wrapper to the the_neurobio::memory_emotional::add_to_memory method. More...
 
real(srp) function reproduce_do_probability_reproduction_calc (this, weight_baseline, allow_immature)
 Calculate the instantaneous probability of successful reproduction. More...
 
logical function reproduction_success_stochast (this, prob)
 Determine a stochastic outcome of this agent reproduction. Returns TRUE if the agent has reproduced successfully. More...
 
elemental subroutine emotional_memory_add_to_stack (this, v_hunger, v_defence_fear, v_reproduction, v_gos_label, v_gos_arousal, v_gos_repeated)
 Add emotional components into the memory stack. More...
 
elemental subroutine emotional_memory_add_gos_to_stack (this, v_gos_label, v_gos_arousal, v_gos_repeated)
 Add the current GOS label and/or arousal value and/or arousal repeat count into the emotional memory stack. More...
 
elemental subroutine emotional_memory_cleanup_stack (this)
 Cleanup and destroy the emotional memory stack. More...
 
elemental real(srp) function emotional_memory_hunger_get_mean (this, last)
 Get the average value of the hunger motivation state within the whole emotional memory stack. More...
 
elemental real(srp) function emotional_memory_actve_avoid_get_mean (this, last)
 Get the average value of the active avoidance (fear) motivation state within the whole emotional memory stack. More...
 
elemental real(srp) function emotional_memory_reproduct_get_mean (this, last)
 Get the average value of the reproductive motivation state within the whole emotional memory stack. More...
 
elemental real(srp) function emotional_memory_arousal_mean (this, last)
 Get the average value of the GOS arousal within the whole emotional memory stack. More...
 
subroutine gos_find_global_state (this)
 Find and set the Global Organismic State (GOS) of the agent based on the various available motivation values. The motivation values linked with the different stimuli compete with the current GOS and among themselves. More...
 
elemental subroutine, private gos_init_zero_state (this)
 Initialise GOS engine components to a zero state. Numeric values are set to commondata::missing or commondata::unknown, and the string label to "undefined". More...
 
elemental subroutine gos_agent_set_dead (this)
 Set the individual to be dead. Note that this function does not deallocate the individual agent object, this may be a separate destructor function. More...
 
elemental subroutine gos_reset_motivations_non_dominant (this)
 Reset all motivation states as not dominant with respect to the GOS. More...
 
elemental character(len=label_length) function gos_global_get_label (this)
 Get the current global organismic state (GOS). More...
 
elemental real(srp) function gos_get_arousal_level (this)
 Get the overall level of arousal. Arousal is the current level of the dominant motivation that has brought about the current GOS at the previous time step. More...
 
subroutine gos_attention_modulate_weights (this)
 Modulate the attention weights to suppress all perceptions alternative to the current GOS. This is done using the attention modulation interpolation curve. More...
 
elemental integer function perception_food_items_below_calculate (this)
 Calculate the number of food items in the perception object that are located below the actor agent. More...
 
elemental integer function perception_food_items_below_horiz_calculate (this, hz_lower, hz_upper)
 Calculate the number of food items in the perception object that are located below the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of this actor agent: [z+hz_lower, z+hz_upper]. More...
 
elemental real(srp) function perception_food_mass_below_calculate (this)
 Calculate the average mass of a food item from all the items in the current perception object that are below the actor agent. More...
 
elemental real(srp) function perception_food_mass_below_horiz_calculate (this, hz_lower, hz_upper)
 Calculate the average mass of a food item from all the items in the current perception object that are below the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of this actor agent: [z+hz_lower, z+hz_upper]. More...
 
elemental integer function perception_food_items_above_calculate (this)
 Calculate the number of food items in the perception object that are located above the actor agent. More...
 
elemental integer function perception_food_items_above_horiz_calculate (this, hz_lower, hz_upper)
 Calculate the number of food items in the perception object that are located above the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of this actor agent: [z-hz_upper, z-hz_lower]. More...
 
elemental real(srp) function perception_food_mass_above_calculate (this)
 Calculate the average mass of a food item from all the items in the current perception object that are above the actor agent. More...
 
elemental real(srp) function perception_food_mass_above_horiz_calculate (this, hz_lower, hz_upper)
 Calculate the average mass of a food item from all the items in the current perception object that are above the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of this actor agent: [z-hz_upper, z-hz_lower]. More...
 
elemental integer function perception_conspecifics_below_calculate (this)
 Calculate the number of conspecifics in the perception object that are located below the actor agent. More...
 
elemental integer function perception_conspecifics_above_calculate (this)
 Calculate the number of conspecifics in the perception object that are located above the actor agent. More...
 
elemental integer function perception_conspecifics_below_horiz_calculate (this, hz_lower, hz_upper)
 Calculate the number of conspecifics in the perception object that are located below the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of this actor agent: [z+hz_lower, z+hz_upper]. More...
 
elemental integer function perception_conspecifics_above_horiz_calculate (this, hz_lower, hz_upper)
 Calculate the number of conspecifics in the perception object that are located above the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of this actor agent: [z-hz_upper, z-hz_lower]. More...
 
elemental integer function perception_predator_below_calculate (this)
 Calculate the number of predators in the perception object that are located below the actor agent. More...
 
elemental integer function perception_predator_above_calculate (this)
 Calculate the number of predators in the perception object that are located above the actor agent. More...
 
elemental integer function perception_predator_below_horiz_calculate (this, hz_lower, hz_upper)
 Calculate the number of predators in the perception object that are located below the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of this actor agent: [z+hz_lower, z+hz_upper]. More...
 
elemental integer function perception_predator_above_horiz_calculate (this, hz_lower, hz_upper)
 Calculate the number of predators in the perception object that are located above the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of this actor agent: [z-hz_upper, z-hz_lower]. More...
 
elemental real(srp) function perception_food_dist_below_calculate (this)
 Calculate the average distance to all food items in the current perception object that are below the actor agent. More...
 
elemental real(srp) function perception_food_dist_above_calculate (this)
 Calculate the average distance to all food items in the current perception object that are above the actor agent. More...
 
elemental real(srp) function perception_consp_dist_below_calculate (this)
 Calculate the average distance to all conspecifics in the current perception object that are below the actor agent. More...
 
elemental real(srp) function perception_consp_dist_above_calculate (this)
 Calculate the average distance to all conspecifics in the current perception object that are above the actor agent. More...
 
elemental real(srp) function perception_predator_dist_below_calculate (this)
 Calculate the average distance to all predators in the current perception object that are below the actor agent. More...
 
elemental real(srp) function perception_predator_dist_above_calculate (this)
 Calculate the average distance to all predators in the current perception object that are above the actor agent. More...
 
real(srp) function predator_capture_probability_calculate_spatobj (this, this_predator, attack_rate, is_freezing, time_step_model)
 Calculate the probability of attack and capture of the this agent by the predator this_predator. This probability is a function of the distance between the predator and the agent and is calculated by the predator-class-bound procedure the_environment::predator::risk_fish(). Example call: More...
 
real(srp) function predator_capture_probability_calculate_pred (this, this_predator, is_freezing, time_step_model)
 Calculate the probability of attack and capture of the this agent by the predator this_predator. This probability is a function of the distance between the predator and the agent and is calculated by the predator-class-bound procedure the_environment::predator::risk_fish(). More...
 
real(srp) function predation_capture_probability_risk_wrapper (this, is_freezing)
 Calculate the overall direct predation risk for the agent, i.e. the probability of attack and capture by the nearest predator. More...
 
elemental real(srp) function get_prop_size (this)
 Get the body size property of a polymorphic object. The object can be of the following extension of the basic the_environment::spatial class: More...
 
elemental real(srp) function get_prop_mass (this)
 Get the body mass property of a polymorphic object. The object can be of the following extension of the basic the_environment::spatial class: More...
 

Variables

character(len= *), parameter, private modname = "(THE_NEUROBIO)"
 

Detailed Description

Definition of the decision-making and behavioural architecture.

THE_NEUROBIO module

This module defines the neurobiological architecture of the agent, from perception through representation, appraisal, motivation and emotion to the determination of the global organismic state and behaviour.

Function/Subroutine Documentation

◆ percept_food_create_init()

elemental subroutine the_neurobio::percept_food_create_init ( class(percept_food), intent(inout)  this,
integer, intent(in)  maximum_number_food_items 
)

Initiate an empty food perception object with known number of components.

Parameters
[in] maximum_number_food_items Maximum number of food items in the food perception object, normally equal to the partial food resource indexing order commondata::food_select_items_index_partial.

Implementation details

Create all food items in the perception array (create is elemental procedure).

Initialise all other components of the perception object.


Set the initial number of food items in the perception object to the maximum number using the function percept_food_number_seen below.
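The create-then-initialise pattern above relies on `create` being an elemental procedure, so one call initialises the whole perception array. A minimal standalone sketch of this idiom (the type and all names here are hypothetical stand-ins, not the module's actual code):

```fortran
program elemental_create_demo
  implicit none

  ! Hypothetical stand-in for a food perception component type.
  type :: percept_item_t
     real :: mass = -9999.0      ! MISSING-like default
     real :: dist = -9999.0
  end type percept_item_t

  type(percept_item_t), allocatable :: perceive_food(:)
  integer, parameter :: max_items = 4  ! cf. the partial indexing order

  ! Allocate to the known maximum number of components, then let the
  ! elemental create() act on the whole array in a single call.
  allocate( perceive_food(max_items) )
  call create( perceive_food )

  print *, perceive_food%dist   ! all components initialised

contains

  ! Elemental: declared with a scalar dummy, usable on whole arrays.
  elemental subroutine create(this)
    type(percept_item_t), intent(out) :: this
    this%mass = 0.0
    this%dist = 0.0
  end subroutine create

end program elemental_create_demo
```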

Definition at line 1367 of file m_neuro.f90.

◆ percept_food_number_seen()

subroutine the_neurobio::percept_food_number_seen ( class(percept_food), intent(inout)  this,
integer, intent(in)  number_set 
)

Set the total number of food items perceived (seen) in the food perception object. Do not reallocate the perception object components with respect to this new number yet.

Parameters
[in] number_set Set the number of food items in the perception object.

Definition at line 1409 of file m_neuro.f90.

◆ percept_food_make_fill_arrays()

subroutine the_neurobio::percept_food_make_fill_arrays ( class(percept_food), intent(inout)  this,
type(food_item), dimension(:), intent(in)  items,
real(srp), dimension(:), intent(in)  dist 
)

Make the food perception object, fill it with the actual data arrays.

Note
Note that the size and allocation is set by the init method.
Parameters
[in] items An array of food items that form the perception object.
[in] dist An array of the distances between the agent and each of the food items in the perception object.

Implementation details

First we check for non-conforming input arrays and re-init and reallocate the perception object, if needed, to the minimum value.

Report this issue to the log.

Second, fill the dynamic food perception object with the data from the input arrays. They should have conforming sizes now.
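The non-conforming input check described above can be sketched as follows; the array names and the warning text are illustrative only, not the module's actual implementation:

```fortran
program conforming_arrays_demo
  implicit none
  ! Hypothetical input arrays: food item masses and distances to the agent.
  real :: items_mass(4) = [ 1.0, 2.0, 3.0, 4.0 ]
  real :: dist(3)       = [ 0.1, 0.2, 0.3 ]
  integer :: n_use

  ! Non-conforming inputs: report the issue and fall back to the minimum
  ! common size before filling the perception object.
  if ( size(items_mass) /= size(dist) ) then
     print *, "WARNING: non-conforming input arrays"
  end if
  n_use = min( size(items_mass), size(dist) )

  print *, "filling the perception object with", n_use, "items"
end program conforming_arrays_demo
```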

Definition at line 1421 of file m_neuro.f90.

◆ percept_food_get_count_found()

elemental integer function the_neurobio::percept_food_get_count_found ( class(percept_food), intent(in)  this)

Get the number (count) of food items seen. Trivial.

Definition at line 1460 of file m_neuro.f90.

◆ percept_food_get_meansize_found()

elemental real(srp) function the_neurobio::percept_food_get_meansize_found ( class(percept_food), intent(in)  this)

Get the average size of food items seen. Trivial.

Definition at line 1470 of file m_neuro.f90.

◆ percept_food_get_meanmass_found()

elemental real(srp) function the_neurobio::percept_food_get_meanmass_found ( class(percept_food), intent(in)  this)

Get the average mass of food items seen. Trivial.

Definition at line 1484 of file m_neuro.f90.


◆ percept_food_get_meandist_found()

elemental real(srp) function the_neurobio::percept_food_get_meandist_found ( class(percept_food), intent(in)  this)

Get the average distance to the food items seen. Trivial.

If no food items are seen, the distance is undefined.

Definition at line 1500 of file m_neuro.f90.

◆ percept_food_destroy_deallocate()

elemental subroutine the_neurobio::percept_food_destroy_deallocate ( class(percept_food), intent(inout)  this)

Deallocate and delete a food perception object.

Definition at line 1514 of file m_neuro.f90.

◆ food_perception_get_visrange_objects()

subroutine the_neurobio::food_perception_get_visrange_objects ( class(perception), intent(inout)  this,
class(food_resource), intent(in)  food_resource_available,
integer, intent(in), optional  time_step_model 
)

Get available food items within the visual range of the agent, which the agent can perceive and therefore respond to. Food perception is packaged into the food perception object this%perceive_food for output.

Food perception is quite complex to implement as it requires determining individual food items within the current visual range of the agent. There are, however, potentially thousands (or millions) of food items in the food resource, each of them stochastic (e.g. they have different sizes), so the visual range differs for each item, and each agent should determine the food items in its proximity at numerous time steps of the model. This means repeating huge loops many times for each agent at each time step. This is approached by array segmentation: the perception object is obtained by partial indexing of a very limited number (=commondata::food_select_items_index_partial) of only the nearest food items, the agent's visual range is then determined for each of these nearest neighbouring food items, and finally those food items that individually fall within the visual range are included into the perception object.

Note
Note that there are three similar procedures that detect spatial objects within the visual range of the agent:

All these procedures were actually implemented using the first (the_neurobio::perception::see_food) as a template. All three implement partial indexing of the nearest spatial objects to accelerate computation of large arrays of spatial objects.

Parameters
[in] food_resource_available Global food resource object from which we select the neighbouring item components that are present within the visual range of the agent.
[in] time_step_model The current time step of the model.

Notable variables and parameters

  • dist_food_neighbours - temporary array of the distances to the neighbouring food items.
  • dist_food_index - temporary partial index vector for the distances to the neighbouring food items.
  • irradiance_agent_depth - local variable defining the irradiance (illumination) at the current depth of the agent. Needed to calculate the agent's visual range.
  • food_item_area - local variable defining the area of the food item. It is an array, area of each item in the food_resource_available and dist_foods. Needed to calculate the agent's visual range.
  • food_item_visual_range - local variable defining the visual range of the agent for detecting each of the food items (with known areas) at the agent's current depth.
  • food_items_percept_in_visrange - local sorted array of food objects that are within the visual range of the agent for output. The array should normally have the size of commondata::food_select_items_index_partial elements, but only the first food_items_n_visrange elements of it are actually within the visual range.
  • food_items_dist_sorted - temporary local sorted array of distances between the agent and each of the nearest neighbouring food items, sorted for output.
  • food_items_n_visrange - local number of elements of food_items_percept_in_visrange for output that are within the visual range of the agent.

Implementation details

Checks and preparations

Check optional time step parameter. If unset, use global commondata::global_time_step_model_current.

Initialise index and rank values. Uninitialised index arrays may result in invalid memory references in ARRAY_INDEX (it is not safe by design).

Copy food items array component from the food_resource_available class' the_environment::food_item's array.

Warning
Note that we cannot here call
     call dist_foods%position( food_resource_available%food%location() )
as the objects are of class the_environment::food_item, a higher level than the_environment::spatial.

Step 1

First, we determine up to the maximum order (fast partial indexing) of commondata::food_select_items_index_partial neighbouring food items that are in proximity of the agent. This is done using the the_environment::spatial::neighbours() backend procedure.

Step 2

Second, we select only those items within this set, which are within the visual range of the agent under the current conditions. To do this we, first, calculate the ambient illumination/irradiance level at the depth of the agent. Done using the the_environment::spatial::illumination() procedure.

Compute the array of "prey areas" for each of the food items (whole array or neighbouring food items only).

Compute the vector of the visual ranges for detecting each of the food items by the agent.

Step 3

Now we can get the pre-output array food_items_percept_in_visrange that contains the food objects available within the visual range of the agent. Also, we count the number of such items for the output parameter food_items_n_visrange.

Also, check if the food item is available (not eaten).

Here we also log warning if no food items found, when debugging (see commondata::is_debug).

Step 4

Finally, we can now create the output food perception object, including only food items that are within the current visual range of the agent. Init (create and allocate) the food perception object (at first empty) using the the_neurobio::percept_food::init().

Fill the output perception object with the values obtained at the step 3. This is done using the the_neurobio::percept_food::make() backend procedure.
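Steps 1-3 above can be sketched as a standalone program. Here the nearest-k selection is written out explicitly and the visual range is a constant, whereas the model computes the range per item from the ambient irradiance and the item area; all names are hypothetical stand-ins:

```fortran
program see_food_sketch
  implicit none
  ! k mimics the role of commondata::food_select_items_index_partial.
  integer, parameter :: n = 8, k = 4
  real    :: dist(n) = [ 5.0, 1.2, 7.5, 0.4, 3.3, 9.1, 2.2, 6.0 ]
  logical :: taken(n) = .false.
  integer :: idx(k), i, j, jmin
  real    :: visrange
  integer :: n_visrange

  ! Step 1: partial indexing -- find only the k nearest items, without
  ! fully sorting the whole distance array.
  do i = 1, k
     jmin = 0
     do j = 1, n
        if ( taken(j) ) cycle
        if ( jmin == 0 ) then
           jmin = j
        else if ( dist(j) < dist(jmin) ) then
           jmin = j
        end if
     end do
     taken(jmin) = .true.
     idx(i) = jmin
  end do

  ! Steps 2-3: keep only the indexed items that fall within the visual
  ! range (a constant here; the model computes it per food item).
  visrange = 3.0
  n_visrange = count( dist(idx) <= visrange )

  print *, "indices of the k nearest items:", idx
  print *, "items within the visual range :", n_visrange
end program see_food_sketch
```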

Definition at line 1553 of file m_neuro.f90.


◆ food_perception_is_seeing_food()

elemental logical function the_neurobio::food_perception_is_seeing_food ( class(perception), intent(in)  this)

Check if the agent sees any food items within its visual range.

Warning
Should be called after the see_food method as it is only an accessor get-function.
Returns
Returns TRUE if the agent has any food items in its perception object and FALSE otherwise.

Definition at line 1765 of file m_neuro.f90.

◆ food_perception_probability_capture_memory_object()

real(srp) function the_neurobio::food_perception_probability_capture_memory_object ( class(perception), intent(in)  this,
integer, intent(in), optional  last,
integer, intent(in), optional  time_step_model 
)

Calculate the probability of capture of a subjective representation of food item based on the data from the perceptual memory stack.

Parameters
[in] last Limit to only this number of the latest components in the history.
[in] time_step_model Optional time step of the model; if absent, it is obtained from the global variable commondata::global_time_step_model_current.
Returns
Capture probability of the "subjective" food item that has the size equal to the size of the average memorised food items (from the agent's perception memory stack) and located at an average distance of food items from the memory stack.

Implementation notes

subjective_food_item_average of the type the_environment::food_item is a subjective representation of the food item object built from the memory stack data.

First, check optional time step parameter. If unset, use global commondata::global_time_step_model_current.

Second, build the subjective food item subjective_food_item_average using the the_environment::food_item::make() method. The location of this subjective food item coincides with the location of the agent. This makes it possible to calculate the visibility (visual range) of the food items at the depth of the agent.

Then the capture probability is calculated using the type-bound method the_environment::food_item::capture_probability(). Importantly, the distance towards the food item is explicitly provided as the average distance from the memory stack calculated by the the_neurobio::memory_perceptual::get_food_mean_dist().

Finally, we add a random Gaussian error to the above objective value. Now we have obtained the stochastic subjective value of the capture probability for this food item including a Gaussian error. There is also a strong limitation for the subjective probability to be within the range [0.0, 1.0]. See also subjective_capture_prob() for a similar Gaussian error in subjective probability.
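The final step, adding a Gaussian error and constraining the subjective probability to [0.0, 1.0], can be sketched as a standalone program. The standard deviation value and the gauss_deviate() helper are illustrative assumptions, not the model's actual parameters:

```fortran
program subjective_probability_demo
  implicit none
  real :: p_objective, err, p_subjective

  ! Objective capture probability (an arbitrary illustrative value).
  p_objective = 0.62

  ! Add a Gaussian error, then clamp to the valid range [0.0, 1.0].
  call gauss_deviate( err, sd = 0.1 )  ! sd is an assumption, not the model's value
  p_subjective = min( 1.0, max( 0.0, p_objective + err ) )

  print *, "subjective capture probability:", p_subjective

contains

  ! Box-Muller transform: a zero-mean Gaussian deviate with the given sd.
  subroutine gauss_deviate(value, sd)
    real, intent(out) :: value
    real, intent(in)  :: sd
    real :: u1, u2
    call random_number(u1)
    call random_number(u2)
    u1 = max( u1, tiny(1.0) )          ! guard against log(0)
    value = sd * sqrt( -2.0*log(u1) ) * cos( 8.0*atan(1.0)*u2 )
  end subroutine gauss_deviate

end program subjective_probability_demo
```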

Definition at line 1779 of file m_neuro.f90.


◆ percept_stomach_create_init()

elemental subroutine the_neurobio::percept_stomach_create_init ( class(percept_stomach), intent(inout)  this)

Initiate an empty stomach capacity perception object.

First, assign the current stomach capacity to MISSING.

Definition at line 1858 of file m_neuro.f90.

◆ percept_stomach_get_avail_capacity()

elemental real(srp) function the_neurobio::percept_stomach_get_avail_capacity ( class(percept_stomach), intent(in)  this)

Get the currently available stomach volume.

Returns
the stomach capacity currently available for new food.

Definition at line 1868 of file m_neuro.f90.

◆ percept_stomach_update_avail_capacity()

subroutine the_neurobio::percept_stomach_update_avail_capacity ( class(percept_stomach), intent(inout)  this,
real(srp), intent(in)  current_volume 
)

Set and update the currently available value of the available stomach volume.

Parameters
[in] current_volume The new (updated) current volume of the stomach capacity.

Definition at line 1882 of file m_neuro.f90.

◆ percept_stomach_destroy_deallocate()

elemental subroutine the_neurobio::percept_stomach_destroy_deallocate ( class(percept_stomach), intent(inout)  this)

Destroy the stomach perception object and deallocate it.

Set the current value to commondata::missing.

Definition at line 1896 of file m_neuro.f90.

◆ percept_bodymass_create_init()

elemental subroutine the_neurobio::percept_bodymass_create_init ( class(percept_body_mass), intent(inout)  this)

Initiate an empty body mass perception object.

Assign the current body mass to commondata::missing.

Definition at line 1910 of file m_neuro.f90.

◆ percept_bodymass_get_current()

elemental real(srp) function the_neurobio::percept_bodymass_get_current ( class(percept_body_mass), intent(in)  this)

Get the current value of the body mass perception.

Returns
the current body mass value.

Definition at line 1920 of file m_neuro.f90.

◆ percept_bodymass_update_current()

subroutine the_neurobio::percept_bodymass_update_current ( class(percept_body_mass), intent(inout)  this,
real(srp), intent(in)  current 
)

Set and update the current body mass perception value.

Parameters
[in] current The new (updated) current body mass.

Definition at line 1933 of file m_neuro.f90.

◆ percept_bodymass_destroy_deallocate()

elemental subroutine the_neurobio::percept_bodymass_destroy_deallocate ( class(percept_body_mass), intent(inout)  this)

Destroy the body mass perception object and deallocate.

Set the current value to commondata::missing.

Definition at line 1947 of file m_neuro.f90.

◆ percept_energy_create_init()

elemental subroutine the_neurobio::percept_energy_create_init ( class(percept_energy), intent(inout)  this)

Initiate an empty energy perception object.

Assign the current energy to commondata::missing.

Definition at line 1961 of file m_neuro.f90.

◆ percept_energy_get_current()

elemental real(srp) function the_neurobio::percept_energy_get_current ( class(percept_energy), intent(in)  this)

Get the current value of the energy reserves.

Returns
the current energy reserve.

Definition at line 1971 of file m_neuro.f90.

◆ percept_energy_update_current()

subroutine the_neurobio::percept_energy_update_current ( class(percept_energy), intent(inout)  this,
real(srp), intent(in)  current 
)

Set and update the current energy perception value.

Parameters
[in] current The new (updated) current energy reserves.

Definition at line 1984 of file m_neuro.f90.

◆ percept_energy_destroy_deallocate()

elemental subroutine the_neurobio::percept_energy_destroy_deallocate ( class(percept_energy), intent(inout)  this)

Destroy the energy perception object and deallocate.

Definition at line 1997 of file m_neuro.f90.

◆ percept_age_create_init()

elemental subroutine the_neurobio::percept_age_create_init ( class(percept_age), intent(inout)  this)

Initiate an empty age perception object.

Assign the current age to commondata::unknown.

Definition at line 2011 of file m_neuro.f90.

◆ percept_age_get_current()

elemental integer function the_neurobio::percept_age_get_current ( class(percept_age), intent(in)  this)

Get the current value of the age.

Returns
the current age.

Definition at line 2021 of file m_neuro.f90.

◆ percept_age_update_current()

subroutine the_neurobio::percept_age_update_current ( class(percept_age), intent(inout)  this,
integer, intent(in)  current 
)

Set and update the current age perception value.

Parameters
[in] current The new (updated) current age.

Definition at line 2034 of file m_neuro.f90.

◆ percept_age_destroy_deallocate()

elemental subroutine the_neurobio::percept_age_destroy_deallocate ( class(percept_age), intent(inout)  this)

Destroy the age perception object and deallocate it.

Set the current value to commondata::unknown.

Definition at line 2047 of file m_neuro.f90.

◆ spatial_percept_set_cid()

subroutine the_neurobio::spatial_percept_set_cid ( class(spatial_percept_component), intent(inout)  this,
integer, intent(in), optional  id 
)

Set unique id for the conspecific perception component.

Parameters
[in] id Optional individual id number for the perception component.

Implementation details

HUGE_ID is a local parameter, the maximum unique id ever possible.

Check if conspecific cid is provided and if not, set random within the huge range.
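The optional-id logic can be sketched as a standalone program; HUGE_RANGE below is an arbitrary stand-in for the module's HUGE_ID parameter:

```fortran
program set_cid_demo
  implicit none
  integer :: cid

  call set_cid( cid, id = 42 )   ! explicit id provided
  print *, cid

  call set_cid( cid )            ! no id: random within the huge range
  print *, cid

contains

  subroutine set_cid(this_cid, id)
    integer, intent(out) :: this_cid
    integer, intent(in), optional :: id
    ! HUGE_RANGE is an arbitrary stand-in for the module's HUGE_ID.
    integer, parameter :: HUGE_RANGE = 1000000000
    real :: r
    if ( present(id) ) then
       this_cid = id
    else
       call random_number(r)
       this_cid = 1 + int( r * real(HUGE_RANGE - 1) )
    end if
  end subroutine set_cid

end program set_cid_demo
```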

Definition at line 2061 of file m_neuro.f90.

◆ spatial_percept_get_cid()

elemental integer function the_neurobio::spatial_percept_get_cid ( class(spatial_percept_component), intent(in)  this)

Get the unique id of the spatial perception component.

Returns
cid the individual id number of this perception component.

Definition at line 2083 of file m_neuro.f90.

◆ consp_percept_comp_create()

elemental subroutine the_neurobio::consp_percept_comp_create ( class(conspec_percept_comp), intent(inout)  this)

Create a single conspecific perception component at an undefined position with default properties.

Implementation details

We here just set an undefined location of the conspecific object using the standard interface function missing.

Set cid to UNKNOWN.

Warning
random id on create is now disabled to allow an elemental function, because random number generators are never pure. So take care to set cids elsewhere.

Then we set the conspecific size. Should it be MISSING or the grand average? this%consp_body_size = MISSING

Set the conspecific mass. The default mass of the conspecific is twice the minimum body mass. There is no upper limit on the body mass.

Note
These values are not very important as they are for default init only and will be overwritten by the actual values.

Init distance towards the conspecific, now with MISSING value.

Init the sex is male by default, it is arbitrary.

Definition at line 2100 of file m_neuro.f90.

◆ consp_percept_make()

subroutine the_neurobio::consp_percept_make ( class(conspec_percept_comp), intent(inout)  this,
type(spatial), intent(in)  location,
real(srp), intent(in), optional  size,
real(srp), intent(in), optional  mass,
real(srp), intent(in), optional  dist,
integer, intent(in), optional  cid,
logical, intent(in), optional  is_male 
)

Make a single conspecific perception component. This is a single conspecific located within the visual range of the agent.

Parameters
[in] location Location of the conspecific perception component, as a SPATIAL type container.
[in] size The optional conspecific body size as guessed by the agent. May or may not reflect the "true" size of the conspecific.
[in] mass The optional conspecific body mass as guessed by the agent. May or may not reflect the "true" mass of the conspecific.
[in] dist The distance towards this conspecific.
[in] cid Optional cid for the conspecific perception component. If not provided, set random.
[in] is_male Optional flag that the sex is male.

Implementation details

We here just set the location of the conspecific object using the standard interface function position.

If individual id is provided, set it. If not, set random.

Then we set the conspecific perception component body size. Check if the optional size is provided and leave it untouched if not.

Then we set the conspecific perception component body mass. Check if the optional mass is provided and leave it untouched if not.

Also set the distance towards the conspecific if provided. If not provided, ignore.

Definition at line 2134 of file m_neuro.f90.
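The flow above can be condensed into a Python pseudocode sketch (the model itself is Fortran; `MISSING`, `BODY_MASS_MIN` and the id range are illustrative stand-ins, not the actual commondata parameters):

```python
import random
from dataclasses import dataclass

MISSING = -9999.0    # stand-in for commondata::missing
BODY_MASS_MIN = 0.1  # hypothetical minimum body mass

@dataclass
class ConspecPerceptComp:
    x: float = MISSING
    y: float = MISSING
    depth: float = MISSING
    cid: int = -1
    body_size: float = MISSING
    body_mass: float = 2.0 * BODY_MASS_MIN  # default init value only
    dist: float = MISSING
    is_male: bool = True                    # arbitrary default

    def make(self, location, size=None, mass=None, dist=None,
             cid=None, is_male=None):
        # Set the location via the standard position interface.
        self.x, self.y, self.depth = location
        # If an individual id is provided, set it; if not, set random.
        self.cid = cid if cid is not None else random.randint(1, 10**6)
        # Optional size and mass are left untouched when absent.
        if size is not None:
            self.body_size = size
        if mass is not None:
            self.body_mass = mass
        # Distance towards the conspecific: ignore when not provided.
        if dist is not None:
            self.dist = dist
        if is_male is not None:
            self.is_male = is_male
```

The defaults mirror the create-time initialisation described above; `make` only overrides the components whose optional arguments are actually present.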

◆ consp_percept_get_size()

elemental real(srp) function the_neurobio::consp_percept_get_size ( class(conspec_percept_comp), intent(in)  this)

Get the conspecific perception component body size.

Definition at line 2199 of file m_neuro.f90.

◆ consp_percept_get_mass()

elemental real(srp) function the_neurobio::consp_percept_get_mass ( class(conspec_percept_comp), intent(in)  this)

Get the conspecific perception component body mass.

Definition at line 2209 of file m_neuro.f90.

◆ consp_percept_get_dist()

elemental real(srp) function the_neurobio::consp_percept_get_dist ( class(conspec_percept_comp), intent(in)  this)

Get the conspecific perception component distance.

Definition at line 2220 of file m_neuro.f90.

◆ consp_percept_sex_is_male_get()

elemental logical function the_neurobio::consp_percept_sex_is_male_get ( class(conspec_percept_comp), intent(in)  this)

Get the conspecific perception component sex flag (male).

Definition at line 2231 of file m_neuro.f90.

◆ consp_percept_sex_is_female_get()

elemental logical function the_neurobio::consp_percept_sex_is_female_get ( class(conspec_percept_comp), intent(in)  this)

Get the conspecific perception component sex flag (female).

Definition at line 2241 of file m_neuro.f90.

◆ percept_consp_create_init()

elemental subroutine the_neurobio::percept_consp_create_init ( class(percept_conspecifics), intent(inout)  this,
integer, intent(in)  maximum_number_conspecifics 
)

Create the conspecifics perception object; it is an array of conspecific perception components.

Parameters
[in] maximum_number_conspecifics The maximum number of conspecifics in the conspecifics perception object. Normally equal to the partial conspecific selection indexing order CONSP_SELECT_ITEMS_INDEX_PARTIAL.

Implementation details

Allocate the array of the conspecific perception components

And create all perception components (create is elemental).

Set the initial number of conspecifics within the visual range to the maximum number provided.

Definition at line 2256 of file m_neuro.f90.

◆ percept_consp_number_seen()

elemental subroutine the_neurobio::percept_consp_number_seen ( class(percept_conspecifics), intent(inout)  this,
integer, intent(in)  number_set 
)

Set the total number of conspecifics perceived (seen) in the conspecific perception object, but do not reallocate the conspecific perception components yet.

Parameters
[in] number_set Set the number of conspecifics in the perception object.

Definition at line 2283 of file m_neuro.f90.

◆ percept_consp_make_fill_arrays()

pure subroutine the_neurobio::percept_consp_make_fill_arrays ( class(percept_conspecifics), intent(inout)  this,
type(conspec_percept_comp), dimension(:), intent(in)  consps 
)

Make the conspecifics perception object, fill it with the actual arrays.

Note
Note that the size and allocation are set by the init method.
Parameters
[in] consps An array of conspecific perception components that form the perception object.

PROCNAME is the procedure name for logging and debugging (with MODNAME).

Implementation details

Fill the dynamic conspecific perception object with the data from the input array.

Definition at line 2297 of file m_neuro.f90.

◆ percept_consp_get_count_seen()

elemental integer function the_neurobio::percept_consp_get_count_seen ( class(percept_conspecifics), intent(in)  this)

Get the number (count) of conspecifics seen. Trivial.

Definition at line 2316 of file m_neuro.f90.

◆ percept_consp_destroy_deallocate()

elemental subroutine the_neurobio::percept_consp_destroy_deallocate ( class(percept_conspecifics), intent(inout)  this)

Deallocate and delete a conspecific perception object.

Definition at line 2326 of file m_neuro.f90.

◆ consp_perception_get_visrange_objects()

subroutine the_neurobio::consp_perception_get_visrange_objects ( class(perception), intent(inout)  this,
class(condition), dimension(:), intent(in)  consp_agents,
integer, intent(in), optional  time_step_model 
)

Get available conspecific perception objects within the visual range of the agent, which the agent can perceive and therefore respond to.

Note
Note that there are three similar procedures that detect spatial objects within the visual range of the agent:

All these procedures were actually implemented using the first (the_neurobio::perception::see_food) as a template. All three implement partial indexing of the nearest spatial objects to accelerate computation of large arrays of spatial objects.

Parameters
[in] consp_agents An array of spatial objects where we are looking for the nearest available perception objects (array).
[in] time_step_model The current time step of the model.

Notable variables and parameters

  • consp_sizes - local array for the body lengths of the agents.
  • consp_masses - local array for the body masses of the agents.
  • consp_alive - local array of logical indicators if the agents are alive (TRUE).
  • consp_sex_is_male - local array of the sex of the conspecifics.
  • MIN_DIST_SELF - exclude self from the neighbours. Because we cannot (easily) use a recursive reference to the individual agent class from this lower-order perception class, we have to pass some parameters of this agent as dummy parameters to the subroutine, e.g. the individual ID. If the ID is incorrect or not passed, the only way to distinguish self from other agents in the neighbourhood is by different location, i.e. the distance should be non-zero. This parameter sets the tolerance limit for the difference in distance for considering the neighbour not-self. It can possibly be equal to the parameter commondata::zero, or we can allow a higher discrepancy (this also might correct some errors).
  • agents_near - temporary array of nearby other conspecifics available to this agent.
  • dist_neighbours - temporary array of the distances to the neighbouring food items.
  • dist_index - temporary partial index vector for the distances to the neighbouring conspecifics.
  • irradiance_agent_depth - local variable defining the irradiance (illumination) at the current depth of the agent. Needed to calculate the agent's visual range.
  • sobject_area - local variable defining the conspecific area. Needed to calculate the agent's visual range.
  • sobject_visual_range - local variable defining the visual range of the agent for detecting each of the conspecifics (with known areas) at the agent's current depth.
  • sobjects_percept_in_visrange - local sorted array of conspecific perception components that are within the visual range of the agent for output. The array should normally have the size of commondata::consp_select_items_index_partial elements, but only the first sobjects_n_visrange elements of it are actually within the visual range.
  • sobjects_dist_sorted - temporary local sorted array of distances between the agent and each of the nearest neighbouring conspecifics, sorted for output.
  • consp_array_size - the size of the input arrays of object properties, local. Initially set from the size of the objects (class) array, but consp_sizes and consp_alive must have identical sizes.
  • sobjects_n_visrange - local number of elements of sobjects_percept_in_visrange for output that are within the visual range of the agent.
  • self_idx - local index of itself, needed to exclude self from debug messages and logs.

Implementation details

Checks and preparations

Initialise index and rank values. Uninitialised index arrays may result in invalid memory reference in ARRAY_INDEX (it is not safe by design).

Also zero-init self index.

Check optional time step parameter. If unset, use global commondata::global_time_step_model_current.

Set the size for all the internal arrays, that is equal to the consp_agents objects array size.

Copy conspecifics array from the input consp_agents class into agents_near.

Get local arrays for the conspecific sizes, alive status and sex using elemental array-based accessor functions.

Step 1

First, we get, up to the maximum order (fast partial indexing) of commondata::consp_select_items_index_partial, neighbouring conspecifics that are in proximity of this agent. Here we get the partial index vector for the input array of objects: dist_index. The calculation backend is the_environment::spatial::neighbours().

Step 2

Second, we select only those items within this set, which are within the visual range of the agent under the current conditions. To do this we, first, calculate the ambient illumination/irradiance level at the depth of the agent. Done using the the_environment::spatial::illumination() procedure.

Compute the array of "prey areas" for each conspecific.

Compute the vector of the visual ranges for detecting each of the conspecifics by the agent.

Step 3

Now we can get the pre-output array sobjects_percept_in_visrange that contains the conspecifics available within the visual range of the agent. Also, we count their number for the output parameter sobjects_n_visrange.

Include only agents that are not identical to oneself. TODO: Use the individual ID, but it would have to be passed as an additional dummy parameter.

Include only alive agents.

Index of itself, will exclude from min. values.

Here we also log a warning if no conspecifics are found, when debugging. If the self index self_idx is non-zero, the next-from-self value (usually 2) is output.

Step 4

Finally, we can now create the output conspecific perception object, including only conspecifics that are within the current visual range of the agent. Init (create and allocate) the conspecific perception object, at first empty, using the_neurobio::percept_conspecifics::init().

Fill the output perception object with the values obtained at the step 3. This is done using the the_neurobio::percept_conspecifics::make() backend procedure.

Definition at line 2347 of file m_neuro.f90.

Here is the call graph for this function:
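The four steps above can be sketched in Python pseudocode (not the Fortran implementation; the partial-index order and the self-distance tolerance are illustrative constants):

```python
import math

CONSP_SELECT_ITEMS_INDEX_PARTIAL = 20  # hypothetical partial-index order
MIN_DIST_SELF = 1e-6                   # tolerance to exclude self by distance

def consp_in_visrange(agent_pos, conspecifics, alive, visual_range):
    """Steps 1-3: rank neighbours by distance, then keep only live,
    non-self conspecifics that fall within their visual range."""
    # Step 1: partial index -- keep only the nearest few neighbours.
    order = sorted(range(len(conspecifics)),
                   key=lambda i: math.dist(agent_pos, conspecifics[i]))
    order = order[:CONSP_SELECT_ITEMS_INDEX_PARTIAL]
    # Steps 2-3: visual-range filter, excluding self and dead agents.
    seen = []
    for i in order:
        d = math.dist(agent_pos, conspecifics[i])
        if d < MIN_DIST_SELF:     # near-zero distance => self, exclude
            continue
        if not alive[i]:          # include only alive agents
            continue
        if d <= visual_range(i):  # per-object visual range (Step 2)
            seen.append(i)
    return seen
```

Step 4 (building the perception object from the surviving indices) is a straightforward copy and is omitted here.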

◆ consp_perception_is_seeing_conspecifics()

elemental logical function the_neurobio::consp_perception_is_seeing_conspecifics ( class(perception), intent(in)  this)

Check if the agent sees any conspecifics within the visual range.

Warning
Should be called after the see_consp method as it is only an accessor get-function.
Note
There is little sense in implementing this accessor procedure in the generic perception (up-level) class: the perceptions are not derivatives of a common class and are independent, so we would have to implement agent-bound perception methods anyway.

Definition at line 2614 of file m_neuro.f90.

◆ spatialobj_percept_comp_create()

elemental subroutine the_neurobio::spatialobj_percept_comp_create ( class(spatialobj_percept_comp), intent(inout)  this)

Create a single arbitrary spatial object perception component at an undefined position with default properties.

Implementation notes

Just set an undefined location of the object using standard interface function the_environment::spatial::missing().

Set cid to commondata::unknown.

Warning
Random id on create is now disabled to allow an elemental procedure, because random number generation is never pure. So take care to set the cid elsewhere.

Then we set the object size to commondata::missing.

Init distance towards the object, initially also commondata::missing.

Definition at line 2630 of file m_neuro.f90.

◆ spatialobj_percept_make()

subroutine the_neurobio::spatialobj_percept_make ( class(spatialobj_percept_comp), intent(inout)  this,
type(spatial), intent(in)  location,
real(srp), intent(in), optional  size,
real(srp), intent(in), optional  dist,
integer, intent(in), optional  cid 
)

Make a single arbitrary spatial object perception component.

Parameters
[in] location Location of the spatial object perception component, as a the_environment::spatial type container.
[in] size This is the optional object size.
[in] dist The distance towards this object.
[in] cid Optional cid for the object, e.g. the number within an array.

Implementation notes

Set the location of the object using the standard interface function the_environment::spatial::position().

If the individual id is provided, set it. If not, set random.

Set the object perception component size. Only a nonzero size is accepted.

Also set the distance towards the object if provided. If not provided, ignore.

Definition at line 2653 of file m_neuro.f90.

◆ spatialobj_percept_get_size()

elemental real(srp) function the_neurobio::spatialobj_percept_get_size ( class(spatialobj_percept_comp), intent(in)  this)

Get an arbitrary spatial object perception component size.

Definition at line 2696 of file m_neuro.f90.

◆ spatialobj_percept_get_dist()

elemental real(srp) function the_neurobio::spatialobj_percept_get_dist ( class(spatialobj_percept_comp), intent(in)  this)

Get the distance to an arbitrary spatial object perception component.

Definition at line 2707 of file m_neuro.f90.

◆ spatialobj_percept_visibility_visual_range()

real(srp) function the_neurobio::spatialobj_percept_visibility_visual_range ( class(spatialobj_percept_comp), intent(in)  this,
real(srp), intent(in), optional  object_area,
real(srp), intent(in), optional  contrast,
integer, intent(in), optional  time_step_model 
)

Calculate the visibility range of this spatial object. Wrapper to the visual_range function. This function calculates the distance from which this object can be seen by a visual object (e.g. predator or prey).

Warning
The visual_range procedures use meters for units; this wrapper auto-converts to cm.
A generic function also accepting vectors of these objects cannot be implemented because only elemental object-bound array functions are allowed by the standard. This function cannot be elemental, so the passed-object dummy argument must always be scalar.
Parameters
[in] object_area is the optional area of the spatial object. This parameter can be necessary because the area can be calculated differently for different types of objects, e.g. fish using commondata::length2sidearea_fish() or food items using commondata::carea(). If the parameter is absent, the default object area uses the fish backend.
[in] contrast is an optional inherent visual contrast of the spatial object. The default contrast of all objects is defined by the commondata::preycontrast_default parameter.
[in] time_step_model optional time step of the model; if absent, gets the current time step as defined by the value of commondata::global_time_step_model_current.
Returns
The maximum distance from which this object can be seen.

Implementation details

Check optional contrast parameter. If unset, use global commondata::preycontrast_default.

Check optional time step parameter. If unset, use global commondata::global_time_step_model_current.

Second, check if the object area (object_area) parameter is provided. If not, calculate area from the sobj_size component assuming it is a fish. This is logical because the_neurobio::spatialobj_percept_comp class is mainly used in the perception of conspecifics and predators.

Calculate ambient illumination / irradiance at the depth of this object at the given time step.

Return visual range to see this spatial object: its visibility range.

Definition at line 2727 of file m_neuro.f90.

Here is the call graph for this function:
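The defaulting logic for the optional arguments can be sketched as follows (Python pseudocode; the area, irradiance, and visual_range functions are crude placeholders, not the model's actual formulas, and all constants are invented):

```python
PREYCONTRAST_DEFAULT = 0.6          # stand-in for the commondata default
GLOBAL_TIME_STEP_MODEL_CURRENT = 1  # stand-in for the global time step
M2CM = 100.0                        # visual_range works in m, output in cm

def length2sidearea_fish(size):
    # Placeholder for commondata::length2sidearea_fish().
    return 0.1 * size ** 2

def illumination_at_depth(time_step):
    return 500.0  # placeholder irradiance at the object's depth

def visual_range(irradiance, area, contrast):
    return (irradiance * area * contrast) ** 0.5  # placeholder model

def visibility_visual_range(size, object_area=None, contrast=None,
                            time_step_model=None):
    # Optional contrast: default from the global parameter.
    if contrast is None:
        contrast = PREYCONTRAST_DEFAULT
    # Optional time step: default from the current global time step.
    if time_step_model is None:
        time_step_model = GLOBAL_TIME_STEP_MODEL_CURRENT
    # Optional area: if absent, compute from size assuming a fish.
    if object_area is None:
        object_area = length2sidearea_fish(size)
    irradiance = illumination_at_depth(time_step_model)
    # The backend works in meters; auto-convert the result to cm.
    return visual_range(irradiance, object_area, contrast) * M2CM
```

The point is the order of the checks, which matches the implementation details above: contrast, then time step, then object area, then irradiance, then the visual range itself.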

◆ percept_predator_create_init()

elemental subroutine the_neurobio::percept_predator_create_init ( class(percept_predator), intent(inout)  this,
integer, intent(in)  maximum_number_predators 
)

Create the predator perception object; it is an array of spatial perception components.

Parameters
[in] maximum_number_predators The maximum number of predators in the perception object. Normally equal to the partial predator selection indexing order PREDATOR_SELECT_ITEMS_INDEX_PARTIAL.

Implementation notes

Allocate the array of the predator's spatial perception components.

Allocate the array of the predator attack rates.

And create all perception components (create is elemental).

Fill the predator's attack rates arrays with commondata::missing values.

Set the initial number of predators within the visual range to the maximum number provided.

Definition at line 2799 of file m_neuro.f90.

◆ percept_predator_number_seen()

elemental subroutine the_neurobio::percept_predator_number_seen ( class(percept_predator), intent(inout)  this,
integer, intent(in)  number_set 
)

Set the total number of predators perceived (seen) in the predator perception object, but do not reallocate the predator perception components yet.

Parameters
[in] number_set Set the number of predators in the perception object.

Definition at line 2834 of file m_neuro.f90.

◆ percept_predator_make_fill_arrays()

pure subroutine the_neurobio::percept_predator_make_fill_arrays ( class(percept_predator), intent(inout)  this,
type(spatialobj_percept_comp), dimension(:), intent(in)  preds,
real(srp), dimension(:), intent(in), optional  attack_rate 
)

Make the predator perception object, fill it with the actual arrays.

Note
Note that the size and allocation is set by the the_neurobio::percept_predator::init() method.
Parameters
[in] preds An array of predator (spatial, the_neurobio::spatialobj_percept_comp) perception components that form the perception object.

Implementation notes

Fill the dynamic predator perception object with the data from the input array.

The arrays for the body sizes and attack rates of the predators are set only if these arrays are present among the dummy arguments to this procedure. If they are not provided, default values are used as defined by the commondata::predator_attack_rate_default parameter.

Definition at line 2849 of file m_neuro.f90.
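The optional attack-rate handling might look like this (Python sketch; the class layout and the value of PREDATOR_ATTACK_RATE_DEFAULT are invented for illustration):

```python
PREDATOR_ATTACK_RATE_DEFAULT = 0.9  # hypothetical default attack rate

class PerceptPredator:
    def __init__(self, n_max):
        # Size/allocation is fixed by the init method, as noted above.
        self.preds = [None] * n_max
        self.attack_rates = [None] * n_max

    def make(self, preds, attack_rate=None):
        """Fill the perception object from the input arrays."""
        n = len(preds)
        self.preds[:n] = preds
        if attack_rate is not None:
            # Attack rates are set only when present among the arguments.
            self.attack_rates[:n] = attack_rate[:n]
        else:
            # Otherwise fall back to the default parameter.
            self.attack_rates[:n] = [PREDATOR_ATTACK_RATE_DEFAULT] * n
```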

◆ percept_predator_set_attack_rate_vector()

pure subroutine the_neurobio::percept_predator_set_attack_rate_vector ( class(percept_predator), intent(inout)  this,
real(srp), dimension(:), intent(in)  attack_rate 
)

Set an array of the attack rates for the predator perception object.

Definition at line 2881 of file m_neuro.f90.

◆ percept_predator_set_attack_rate_scalar()

pure subroutine the_neurobio::percept_predator_set_attack_rate_scalar ( class(percept_predator), intent(inout)  this,
real(srp), intent(in)  attack_rate 
)

Set a uniform (scalar) attack rate for all elements of the predator perception object.

Definition at line 2892 of file m_neuro.f90.

◆ percept_predator_get_count_seen()

elemental integer function the_neurobio::percept_predator_get_count_seen ( class(percept_predator), intent(in)  this)

Get the number (count) of predators seen. Trivial.

Definition at line 2903 of file m_neuro.f90.

◆ percept_predator_destroy_deallocate()

elemental subroutine the_neurobio::percept_predator_destroy_deallocate ( class(percept_predator), intent(inout)  this)

Deallocate and delete a predator perception object.

Definition at line 2913 of file m_neuro.f90.

◆ predator_perception_get_visrange_objects()

subroutine the_neurobio::predator_perception_get_visrange_objects ( class(perception), intent(inout)  this,
class(predator), dimension(:), intent(in)  spatl_agents,
integer, intent(in), optional  time_step_model 
)

Get available predators perception objects within the visual range of the agent, which the agent can perceive and therefore respond to.

Note
Note that there are three similar procedures that detect spatial objects within the visual range of the agent:

All these procedures were actually implemented using the first (the_neurobio::perception::see_food) as a template. All three implement partial indexing of the nearest spatial objects to accelerate computation of large arrays of spatial objects.

Note
This procedure also used the conspecific perception as a template, but removed extra tests for "self".
Parameters
[in] spatl_agents An array of spatial objects where we are looking for the nearest available perception objects (array).
[in] time_step_model The current time step of the model.

Notable variables and parameters

  • MIN_DIST_SELF - exclude self from the neighbours. Because we cannot (easily) use a recursive reference to the individual agent class from this lower-order perception class, we have to pass some parameters of this agent as dummy parameters to the subroutine, e.g. the individual ID. If the ID is incorrect or not passed, the only way to distinguish self from other agents in the neighbourhood is by different location, i.e. the distance should be non-zero. This parameter sets the tolerance limit for the difference in distance for considering the neighbour not-self. It can possibly be equal to the parameter commondata::zero, or one can allow a higher discrepancy (this also might correct some errors).
  • agents_near - temporary array of predators in proximity to this agent.
  • dist_neighbours - temporary array of the distances to the neighbouring predators.
  • dist_index - temporary partial index vector for the distances to the neighbouring predators.
  • irradiance_agent_depth - local variable defining the irradiance (illumination) at the current depth of the agent. Needed to calculate the agent's visual range.
  • spatl_sizes - local copy of the body lengths of the predators.
  • sobject_area - local variable defining the predator area. Needed to calculate the agent's visual range.
  • sobject_visual_range - local variable defining the visual range of the agent for detecting each of the predators (with known areas) at the agent's current depth.
  • sobjects_percept_in_visrange - local sorted array of the perception components that are within the visual range of the agent for output. The array should normally have the size of commondata::pred_select_items_index_partial elements, but only the first sobjects_n_visrange elements of it are actually within the visual range.
  • pred_attack_rates_in_visrange - local variable containing the attack rates of the predators that are within the visual range of the agent. The array should normally have the size of commondata::pred_select_items_index_partial elements, but only the first sobjects_n_visrange elements of it are actually within the visual range.
  • index_max_size - the size of the input arrays of object properties, local. Initially set from the size of the objects (class) array, but spatl_sizes must have an identical size.
  • sobjects_n_visrange - local number of elements of sobjects_percept_in_visrange for output that are within the visual range of the agent.

Implementation details

Checks and preparations

Initialise index and rank values. Uninitialised index arrays may result in invalid memory reference in ARRAY_INDEX (it is not safe by design).

Check optional time step parameter. If unset, use global commondata::global_time_step_model_current.

This is the maximum size of the index. If the number of spatial objects is huge, we use partial indexing of the neighbours array; the index size is then equal to the partial indexing parameter commondata::pred_select_items_index_partial. However, if the number of potential neighbouring objects is smaller than the partial index size, we use full indexing. In the latter case index_max_size is equal to the actual size of the neighbouring objects array.

Note
Distinguishing between the partial indexing parameter and the maximum size of the index makes sense only in cases where a small number of neighbours can be expected. We normally have large populations of agents and a huge number of food items, well exceeding the partial indexing parameter commondata::pred_select_items_index_partial. However, the number of predators can be smaller than this.

Copy predators array from the input spatl_agents class into agents_near.

Get an array of the body sizes of the predators using array-based elemental function (This now works ok in Intel Fortran 17).

Get an array of the attack rates of the predators using array-based elemental function.

Step 1

First, we get, up to the maximum order (fast partial indexing) of commondata::pred_select_items_index_partial, neighbouring predators that are in proximity of this agent. Here we get the partial index vector for the input array of objects: dist_index. Partial indexing is the most typical case, as we normally have quite a large number of agents within the population and food items within the habitat. However, predators can be quite infrequent within the habitat, and their number can be smaller than the maximum indexing parameter commondata::pred_select_items_index_partial. Hence the check is much more important here than in the similar procedures for food items and conspecifics. The neighbours are computed using the the_environment::spatial::neighbours() procedure.

Here we check if the number of neighbouring agents is smaller than the partial indexing parameter commondata::pred_select_items_index_partial and if yes, do full indexing.

However, if the number of potential neighbouring objects is big, do partial indexing.
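The full-vs-partial indexing decision amounts to the following (Python sketch; the partial-index order is an illustrative constant, and plain sorting stands in for ARRAY_INDEX):

```python
import heapq

PRED_SELECT_ITEMS_INDEX_PARTIAL = 20  # hypothetical partial-index order

def build_distance_index(distances):
    """Full indexing for few neighbours, partial indexing otherwise."""
    n = len(distances)
    index_max_size = min(n, PRED_SELECT_ITEMS_INDEX_PARTIAL)
    if n < PRED_SELECT_ITEMS_INDEX_PARTIAL:
        # Few potential neighbours (typical for predators): full index.
        order = sorted(range(n), key=distances.__getitem__)
    else:
        # Many neighbours: rank only the nearest index_max_size items.
        order = heapq.nsmallest(index_max_size, range(n),
                                key=distances.__getitem__)
    return order, index_max_size
```

Both branches return the indices of the nearest objects in ascending order of distance; only the amount of work differs.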

Step 2

Second, we select only those items within this set, which are within the visual range of the agent under the current conditions. To do this we, first, calculate the ambient illumination/irradiance level at the depth of the agent. Done using the the_environment::spatial::illumination() procedure.

Compute the array of "prey areas" for each predator. So far prey_contrast is not set, so use the default value commondata::individual_visual_contrast_default.

Compute the vector of the visual ranges for detecting each of the predators by the agent.

Step 3

Now we can get the pre-output array sobjects_percept_in_visrange that contains the predators available within the visual range of the agent. Also, we count their number for the output parameter sobjects_n_visrange.

The inherent attack rates of each of the predators within the visual range are also collected here into the pred_attack_rates_in_visrange array.

Here we also log warning if no objects found, when debugging. (see commondata::is_debug).

Step 4

Finally, we can now create the output predator perception object, including only predators that are within the current visual range of the agent. Init (create and allocate) the predator perception object, at first empty, with the_neurobio::percept_predator::init().

Fill the output perception object with the values obtained at the step 3 using the the_neurobio::percept_predator::make() method.

Note that if the commondata::agent_can_assess_predator_attack_rate parameter is set to TRUE, the agents can access and assess the inherent attack rate of the predator. This can be for example possible if the agent can assess the hunger level of the predator. The attack rate is then set from the array of the inherent attack rates of the predators pred_attack_rates_in_visrange. May need to add a predator perception error to this value, but it is not implemented yet.

Note that if the commondata::agent_can_assess_predator_attack_rate parameter is set to FALSE, the agents cannot access the inherent attack rate of the predator. The attack rate is then fixed from the default parameter commondata::predator_attack_rate_default value.

Definition at line 2937 of file m_neuro.f90.

Here is the call graph for this function:

◆ predator_perception_is_seeing_predators()

elemental logical function the_neurobio::predator_perception_is_seeing_predators ( class(perception), intent(in)  this)

Check if the agent sees any predators within the visual range.

Warning
Should be called after the the_neurobio::perception::see_food() method as it is just an accessor function.

Definition at line 3226 of file m_neuro.f90.

◆ percept_light_create_init()

elemental subroutine the_neurobio::percept_light_create_init ( class(percept_light), intent(inout)  this)

Make an empty light perception component. Really necessary only when perception objects are all allocatable.

Definition at line 3243 of file m_neuro.f90.

◆ percept_light_get_current()

elemental real(srp) function the_neurobio::percept_light_get_current ( class(percept_light), intent(in)  this)

Get the current perception of the illumination.

Definition at line 3252 of file m_neuro.f90.

◆ percept_light_set_current()

subroutine the_neurobio::percept_light_set_current ( class(percept_light), intent(inout)  this,
integer, intent(in)  timestep,
real(srp), intent(in)  depth 
)

Set the current light level into the perception component.

Here we calculate the ambient illumination/irradiance level at the current depth of the agent.

Note
The is_stochastic logical parameter is TRUE in light_surface, which sets a stochastic illumination level at the surface and therefore also at the agent's current depth.

Definition at line 3262 of file m_neuro.f90.
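As a rough sketch of this calculation (Python; the exponential depth decay and all constants are assumptions for illustration, the model's actual light backend lives in the_environment module):

```python
import math
import random

LIGHT_SURFACE_BASE = 500.0  # hypothetical surface irradiance
LIGHT_ATTENUATION = 0.002   # hypothetical attenuation coefficient

def light_surface(timestep, is_stochastic=True):
    """Surface light; stochastic scatter as in the note above."""
    level = LIGHT_SURFACE_BASE
    if is_stochastic:
        level *= random.uniform(0.8, 1.2)
    return level

def light_at_depth(timestep, depth, is_stochastic=True):
    # Beer-Lambert-style exponential decay with depth: an assumption
    # here; the real decay law is defined in the_environment module.
    return light_surface(timestep, is_stochastic) * math.exp(
        -LIGHT_ATTENUATION * depth)
```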

◆ percept_light_destroy_deallocate()

elemental subroutine the_neurobio::percept_light_destroy_deallocate ( class(percept_light), intent(inout)  this)

Destroy / deallocate light perception component. Really necessary only when perception objects are all allocatable.

Definition at line 3283 of file m_neuro.f90.

◆ light_perception_get_object()

subroutine the_neurobio::light_perception_get_object ( class(perception), intent(inout)  this,
integer, intent(in), optional  time_step_model 
)

Get light perception objects into the individual PERCEPTION object layer.

Parameters
[in] time_step_model The current time step of the model.

Local copy of the time step of the model.

Check optional time step parameter. If unset, use global commondata::global_time_step_model_current.

Allocate and init the perception object (needed only when it is allocatable)

Definition at line 3293 of file m_neuro.f90.

◆ percept_depth_create_init()

elemental subroutine the_neurobio::percept_depth_create_init ( class(percept_depth), intent(inout)  this)

Make an empty depth perception component. Really necessary only when perception objects are all allocatable.

Definition at line 3325 of file m_neuro.f90.

◆ percept_depth_get_current()

elemental real(srp) function the_neurobio::percept_depth_get_current ( class(percept_depth), intent(in)  this)

Get the current perception of the depth.

Definition at line 3334 of file m_neuro.f90.

◆ percept_depth_set_current()

subroutine the_neurobio::percept_depth_set_current ( class(percept_depth), intent(inout)  this,
real(srp), intent(in)  cdepth 
)

Set the current depth level into the perception component.

Definition at line 3344 of file m_neuro.f90.

◆ percept_depth_destroy_deallocate()

elemental subroutine the_neurobio::percept_depth_destroy_deallocate ( class(percept_depth), intent(inout)  this)

Destroy / deallocate depth perception component. Really necessary only when perception objects are all allocatable.

Definition at line 3355 of file m_neuro.f90.

◆ percept_reprfac_create_init()

elemental subroutine the_neurobio::percept_reprfac_create_init ( class(percept_reprfact), intent(inout)  this)

Make an empty reproductive factor perception component. Really necessary only when perception objects are all allocatable.

Definition at line 3369 of file m_neuro.f90.

◆ percept_reprfac_get_current()

elemental real(srp) function the_neurobio::percept_reprfac_get_current ( class(percept_reprfact), intent(in)  this)

Get the current perception of the reproductive factor.

Definition at line 3378 of file m_neuro.f90.

◆ percept_reprfac_set_current()

subroutine the_neurobio::percept_reprfac_set_current ( class(percept_reprfact), intent(inout)  this,
real(srp), intent(in)  reprfac 
)

Set the current reproductive factor level into perception component.

Definition at line 3388 of file m_neuro.f90.

◆ percept_reprfac_destroy_deallocate()

elemental subroutine the_neurobio::percept_reprfac_destroy_deallocate ( class(percept_reprfact), intent(inout)  this)

Destroy / deallocate reproductive factor perception component. Really necessary only when perception objects are all allocatable.

Definition at line 3399 of file m_neuro.f90.

◆ depth_perception_get_object()

subroutine the_neurobio::depth_perception_get_object ( class(perception), intent(inout)  this)

Get depth perception objects into the individual PERCEPTION object layer.

Allocate and init the perception object (needed only when it is allocatable)

Definition at line 3413 of file m_neuro.f90.

◆ stomach_perception_get_object()

subroutine the_neurobio::stomach_perception_get_object ( class(perception), intent(inout)  this)

Get the stomach capacity perception objects into the individual PERCEPTION object layer.

Allocate and init the perception object (needed only when it is allocatable)

Calculate the available stomach capacity, i.e. maximum stomach mass minus current stomach content, and set the value into the stomach perception object. Body mass and maxstomcap are from the CONDITION layer.

Definition at line 3427 of file m_neuro.f90.

◆ bodymass_perception_get_object()

subroutine the_neurobio::bodymass_perception_get_object ( class(perception), intent(inout)  this)

Get the body mass perception objects into the individual PERCEPTION object layer.

Allocate and init the perception object (needed only when it is allocatable)

Get the body mass from the individual, put it to the perception object.

Definition at line 3446 of file m_neuro.f90.

◆ energy_perception_get_object()

subroutine the_neurobio::energy_perception_get_object ( class(perception), intent(inout)  this)

Get the energy reserves perception objects into the individual PERCEPTION object layer.

Allocate and init the perception object (needed only when it is allocatable)

Get the energy reserves from the individual, put it to the perception object.

Definition at line 3461 of file m_neuro.f90.

◆ age_perception_get_object()

subroutine the_neurobio::age_perception_get_object ( class(perception), intent(inout)  this)

Get the age perception objects into the individual PERCEPTION object layer.

Allocate and init the perception object (needed only when it is allocatable).

Get the age from the individual, put it to the perception object.

Definition at line 3477 of file m_neuro.f90.

◆ repfac_perception_get_object()

subroutine the_neurobio::repfac_perception_get_object ( class(perception), intent(inout)  this)

Get the reproductive factor perception objects into the individual PERCEPTION object layer.

Allocate and init the perception object (needed only when it is allocatable).

Get the the_hormones::hormones::reproductive_factor() from the individual, put it to the perception object.

Definition at line 3493 of file m_neuro.f90.

◆ percept_memory_add_to_stack()

elemental subroutine the_neurobio::percept_memory_add_to_stack ( class(memory_perceptual), intent(inout)  this,
real(srp), intent(in)  light,
real(srp), intent(in)  depth,
real(srp), intent(in)  food,
real(srp), intent(in)  foodsize,
real(srp), intent(in)  fooddist,
integer, intent(in)  consp,
integer, intent(in)  pred,
real(srp), intent(in)  stom,
real(srp), intent(in)  bdmass,
real(srp), intent(in)  energ,
real(srp), intent(in)  reprfac 
)

Add perception components into the memory stack.

Parameters
[in] light The parameters of the subroutine are the actual values that are added to the perceptual memory stack arrays.

Each of the memory stack components corresponds to the respective dummy parameter. So arrays are updated at each step.
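The update logic amounts to a fixed-length rolling buffer per perception component: each call pushes the new values and drops the oldest ones once the stack is full. A minimal Python sketch (the model itself is Fortran; the history size and the component subset here are illustrative, the real size is a commondata parameter):

```python
from collections import deque

HISTORY_SIZE = 3  # illustrative; the real size is a commondata parameter

class PerceptualMemory:
    """Rolling perceptual memory stack: one fixed-length history per
    perception component; pushing a new value drops the oldest one."""
    def __init__(self, size=HISTORY_SIZE):
        self.light = deque(maxlen=size)
        self.depth = deque(maxlen=size)
        self.food = deque(maxlen=size)

    def add_to_stack(self, light, depth, food):
        # Each dummy argument updates its respective memory array.
        self.light.append(light)
        self.depth.append(depth)
        self.food.append(food)

mem = PerceptualMemory()
for step in range(5):
    mem.add_to_stack(light=100.0 - step, depth=10.0 + step, food=step)

print(list(mem.food))  # only the 3 latest steps survive: [2, 3, 4]
```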

Definition at line 3513 of file m_neuro.f90.

◆ percept_memory_cleanup_stack()

elemental subroutine the_neurobio::percept_memory_cleanup_stack ( class(memory_perceptual), intent(inout)  this)

Cleanup and destroy the perceptual memory stack.

The cleanup procedure uses whole-array assignment.

Definition at line 3550 of file m_neuro.f90.

◆ percept_memory_food_get_total()

elemental integer function the_neurobio::percept_memory_food_get_total ( class(memory_perceptual), intent(in)  this)

Get the total number of food items within the whole perceptual memory stack.

Returns
Total count of food items in the memory stack.

Calculate the overall sum excluding missing values (masked).

Definition at line 3572 of file m_neuro.f90.

◆ percept_memory_food_get_mean_n()

elemental real(srp) function the_neurobio::percept_memory_food_get_mean_n ( class(memory_perceptual), intent(in)  this,
integer, intent(in), optional  last 
)

Get the average number of food items per single time step within the whole perceptual memory stack.

Note
There are several similar procedures with very similar implementations.
Parameters
[in] last Limit to only this number of latest components in history.
Returns
Mean count of food items in the memory stack.

Local copy of optional last

Implementation notes

Check if we are given the parameter requesting the latest history size. If the parameter last is absent or bigger than the array size, the whole stack array is used.

Calculate the average excluding missing values (masked) using commondata::average().
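The clamping of the optional last parameter and the masked averaging can be illustrated with a short Python sketch (the MISSING code is a hypothetical stand-in; the real model uses commondata::average() in Fortran):

```python
MISSING = -9999.0  # hypothetical stand-in for the commondata missing-value code

def memory_mean(history, last=None):
    """Mean over the latest `last` elements, excluding missing values.

    If `last` is absent or exceeds the stack size, the whole array is
    used; missing values are masked out, mirroring commondata::average().
    """
    if last is None or last > len(history):
        last = len(history)
    window = history[-last:]
    valid = [v for v in window if v != MISSING]
    return sum(valid) / len(valid) if valid else 0.0

hist = [2.0, MISSING, 4.0, 6.0]
print(memory_mean(hist))          # masked mean over the whole stack: 4.0
print(memory_mean(hist, last=2))  # mean over the 2 latest elements: 5.0
```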

Definition at line 3595 of file m_neuro.f90.

◆ percept_memory_food_mean_n_split()

elemental subroutine the_neurobio::percept_memory_food_mean_n_split ( class(memory_perceptual), intent(in)  this,
integer, intent(in), optional  window,
integer, intent(in), optional  split_val,
real(srp), intent(out)  older,
real(srp), intent(out)  newer 
)

Get the average number of food items per single time step within the perceptual memory stack, split to the first (older) and second (newer) parts. The whole memory stack ('sample') is split by the split_val parameter and two means are calculated: before the split_val and after it.

Note
There are several similar procedures with very similar implementations.
Parameters
[in] window The whole memory window that is analysed; if not present, the whole memory stack is used.
[in] split_val The split value for the separation of the older and newer averages; if not present, the memory window is simply split evenly in two halves.
[out] older The output average number of the food items in the first (older) part of the memory window.
[out] newer The output average number of the food items in the second (newer) part of the memory window.

Implementation details

First, check optional parameters: the memory window window and the split value split_val. If either is not provided, defaults are used.

(Also, a check is made so that a window exceeding the history stack length is reduced to the whole memory size.)

A sanity check is also done: if the split value happens to exceed the window parameter, it is reduced to the default 1/2 of the window.

Second, the older and the newer output average values are calculated. Here is the illustration of the calculation:

   Such 'window' and 'split_val'
   values...

            |<----- window ----->|
   +--------+--------------------+
   +        |        :|:         +
   +--------+--------------------+
                      ^ split_val


   ... result in these means:

   +--------+---------------------+
   +        | mean for | mean for +
   +        | 'older'  | 'newer'  +
   +--------+---------------------+

Definition at line 3648 of file m_neuro.f90.

◆ percept_memory_food_get_mean_size()

elemental real(srp) function the_neurobio::percept_memory_food_get_mean_size ( class(memory_perceptual), intent(in)  this,
integer, intent(in), optional  last 
)

Get the average size of food item per single time step within the whole perceptual memory stack.

Note
There are several similar procedures with very similar implementations.
Parameters
[in] last Limit to only this number of latest components in history.
Returns
Mean size of food items in the memory stack.

Local copy of optional last

Implementation notes

Check if we are given the parameter requesting the latest history size. If the parameter last is absent or bigger than the array size, the whole stack array is used.

Calculate the average excluding missing values (masked) using commondata::average().

Definition at line 3739 of file m_neuro.f90.

◆ percept_memory_food_mean_size_split()

elemental subroutine the_neurobio::percept_memory_food_mean_size_split ( class(memory_perceptual), intent(in)  this,
integer, intent(in), optional  window,
integer, intent(in), optional  split_val,
real(srp), intent(out)  older,
real(srp), intent(out)  newer 
)

Get the average size of food items per single time step within the perceptual memory stack, split to the first (older) and second (newer) parts. The whole memory stack ('sample') is split by the split_val parameter and two means are calculated: before the split_val and after it.

Note
There are several similar procedures with very similar implementations.
Parameters
[in] window The whole memory window that is analysed; if not present, the whole memory stack is used.
[in] split_val The split value for the separation of the older and newer averages; if not present, the memory window is simply split evenly in two halves.
[out] older The output average sizes of the food items in the first (older) part of the memory window.
[out] newer The output average sizes of the food items in the second (newer) part of the memory window.

Implementation details

First, check optional parameters: the memory window window and the split value split_val. If either is not provided, defaults are used.

(Also, a check is made so that a window exceeding the history stack length is reduced to the whole memory size.)

A sanity check is also done: if the split value happens to exceed the window parameter, it is reduced to the default 1/2 of the window.

Second, the older and the newer output average values are calculated. Here is the illustration of the calculation:

   Such 'window' and 'split_val'
   values...

            |<----- window ----->|
   +--------+--------------------+
   +        |        :|:         +
   +--------+--------------------+
                      ^ split_val


   ... result in these means:

   +--------+---------------------+
   +        | mean for | mean for +
   +        | 'older'  | 'newer'  +
   +--------+---------------------+

Definition at line 3792 of file m_neuro.f90.

◆ percept_memory_food_get_mean_dist()

elemental real(srp) function the_neurobio::percept_memory_food_get_mean_dist ( class(memory_perceptual), intent(in)  this,
integer, intent(in), optional  last,
logical, intent(in), optional  undef_ret_null 
)

Get the average distance to food item per single time step within the whole perceptual memory stack.

Note
There are several similar procedures with very similar implementations.
Parameters
[in] last Limit to only this number of latest components in history.
[in] undef_ret_null Optional flag whether an undefined value (zero sample size) should return a zero mean value; if absent, it is set to TRUE and a zero mean is returned. Note that this behaviour is the opposite of the standard commondata::average(). This is because this function is mainly for the perception memory, where a zero value is appropriate in the absence of any food items.
Returns
Mean distance to food items in the memory stack.

Local copies of optionals

Implementation notes

Check if we are given the parameter requesting the latest history size. If the parameter last is absent or bigger than the array size, the whole stack array is used.

Calculate the average excluding missing values (masked) using commondata::average().

Definition at line 3883 of file m_neuro.f90.

◆ percept_memory_food_mean_dist_split()

elemental subroutine the_neurobio::percept_memory_food_mean_dist_split ( class(memory_perceptual), intent(in)  this,
integer, intent(in), optional  window,
integer, intent(in), optional  split_val,
real(srp), intent(out)  older,
real(srp), intent(out)  newer 
)

Get the average distance to food items per single time step within the perceptual memory stack, split to the first (older) and second (newer) parts. The whole memory stack ('sample') is split by the split_val parameter and two means are calculated: before the split_val and after it.

Note
There are several similar procedures with very similar implementations.
Parameters
[in] window The whole memory window that is analysed; if not present, the whole memory stack is used.
[in] split_val The split value for the separation of the older and newer averages; if not present, the memory window is simply split evenly in two halves.
[out] older The output average distance to the food items in the first (older) part of the memory window.
[out] newer The output average distance to the food items in the second (newer) part of the memory window.

Implementation details

First, check optional parameters: the memory window window and the split value split_val. If either is not provided, defaults are used.

(Also, a check is made so that a window exceeding the history stack length is reduced to the whole memory size.)

A sanity check is also done: if the split value happens to exceed the window parameter, it is reduced to the default 1/2 of the window.

Second, the older and the newer output average values are calculated. Here is the illustration of the calculation:

   Such 'window' and 'split_val'
   values...

            |<----- window ----->|
   +--------+--------------------+
   +        |        :|:         +
   +--------+--------------------+
                      ^ split_val


   ... result in these means:

   +--------+---------------------+
   +        | mean for | mean for +
   +        | 'older'  | 'newer'  +
   +--------+---------------------+

Definition at line 3948 of file m_neuro.f90.

◆ percept_memory_consp_get_mean_n()

elemental real(srp) function the_neurobio::percept_memory_consp_get_mean_n ( class(memory_perceptual), intent(in)  this,
integer, intent(in), optional  last 
)

Get the average number of conspecifics per single time step within the whole perceptual memory stack.

Parameters
[in] last Limit to only this number of latest components in history.
Returns
Mean count of conspecifics in the memory stack.

Local copy of optional last

Implementation notes

Check if we are given the parameter requesting the latest history size. If the parameter last is absent or bigger than the array size, the whole stack array is used.

Calculate the average excluding missing values (masked) using commondata::average().

Definition at line 4030 of file m_neuro.f90.

◆ percept_memory_predators_get_total()

elemental integer function the_neurobio::percept_memory_predators_get_total ( class(memory_perceptual), intent(in)  this)

Get the total number of predators within the whole perceptual memory stack.

Returns
Total count of predators in the memory stack.

Calculate the overall sum excluding missing values (masked).

Definition at line 4070 of file m_neuro.f90.

◆ percept_memory_predators_get_mean()

elemental real(srp) function the_neurobio::percept_memory_predators_get_mean ( class(memory_perceptual), intent(in)  this,
integer, intent(in), optional  last 
)

Get the average number of predators per single time step within the whole perceptual memory stack.

Note
There are several similar procedures with very similar implementations.
Parameters
[in] last Limit to only this number of latest components in the history.
Returns
Mean count of predators in the memory stack.

Local copy of optional last

History stack size. We determine it from the size of the actual array rather than HISTORY_SIZE_PERCEPTION for further safety.

Check if we are given the parameter requesting the latest history size. If the parameter last is absent or bigger than the array size, the whole stack array is used.

Calculate the average excluding missing values (masked).

Definition at line 4093 of file m_neuro.f90.

◆ percept_memory_predators_mean_split()

elemental subroutine the_neurobio::percept_memory_predators_mean_split ( class(memory_perceptual), intent(in)  this,
integer, intent(in), optional  window,
integer, intent(in), optional  split_val,
real(srp), intent(out)  older,
real(srp), intent(out)  newer 
)

Get the average number of predators per single time step within the perceptual memory stack, split to the first (older) and second (newer) parts. The whole memory stack ('sample') is split by the split_val parameter and two means are calculated: before the split_val and after it.

Note
There are several similar procedures with very similar implementations.
Parameters
[in] window The whole memory window that is analysed; if not present, the whole memory stack is used.
[in] split_val The split value for the separation of the older and newer averages; if not present, the memory window is simply split evenly in two halves.
[out] older The output average number of predators in the first (older) part of the memory window.
[out] newer The output average number of predators in the second (newer) part of the memory window.

Implementation details

First, check optional parameters: the memory window window and the split value split_val. If either is not provided, defaults are used.

(Also, a check is made so that a window exceeding the history stack length is reduced to the whole memory size.)

A sanity check is also done: if the split value happens to exceed the window parameter, it is reduced to the default 1/2 of the window.

Second, the older and the newer output average values are calculated. Here is the illustration of the calculation:

   Such 'window' and 'split_val'
   values...

            |<----- window ----->|
   +--------+--------------------+
   +        |        :|:         +
   +--------+--------------------+
                      ^ split_val


   ... result in these means:

   +--------+---------------------+
   +        | mean for | mean for +
   +        | 'older'  | 'newer'  +
   +--------+---------------------+

Definition at line 4145 of file m_neuro.f90.

◆ perception_objects_add_memory_stack()

elemental subroutine the_neurobio::perception_objects_add_memory_stack ( class(perception), intent(inout)  this)

Add the various perception objects to the memory stack object. This procedure is called after all the perceptual components (light, depth, food, conspecifics, predators, etc.) are collected (using set object-bound subroutines) into the perception bundle, so all the values are known and ready to be used.

Now collect all perception variables into the whole memory stack object.

Definition at line 4230 of file m_neuro.f90.

◆ perception_objects_get_all_environmental()

subroutine the_neurobio::perception_objects_get_all_environmental ( class(perception), intent(inout)  this)

A single umbrella subroutine to get all environmental perceptions: light, depth. This procedure invokes the corresponding perception-getting calls.

See also the_neurobio::perception::perceptions_inner().

Definition at line 4260 of file m_neuro.f90.

◆ perception_objects_get_all_inner()

subroutine the_neurobio::perception_objects_get_all_inner ( class(perception), intent(inout)  this)

A single umbrella subroutine wrapper to get all inner perceptions: stomach, body mass, energy, age. It invokes all the corresponding perception-getting procedures.

See also the_neurobio::perception::perceptions_environ().

Splitting between the procedures for getting the inner and outer perceptions is for convenience only, this inner perceptions subroutine has no other parameters.

Warning
It would not be easy to implement such a wrapper for the outer perceptions because the population and various environmental objects are not yet available at this object level.
Note
Templates for outer environmental perceptions:
`call proto_parents%individual(ind)%see_food( food_resource_available = habitat_safe%food, time_step_model = 1 )`

`call proto_parents%individual(ind)%see_consp( consp_agents = proto_parents%individual, time_step_model = 1 )`

`call proto_parents%individual(ind)%see_pred( spatl_agents = predators, time_step_model = 1 )`
`call proto_parents%individual(ind)%feel_light(timestep)`
`call proto_parents%individual(ind)%feel_depth()`

Definition at line 4299 of file m_neuro.f90.

◆ perception_objects_init_agent()

elemental subroutine, private the_neurobio::perception_objects_init_agent ( class(perception), intent(inout)  this)
private

Initialise all the perception objects for the current agent. Do not fill perception objects with the real data yet.

Init all perception objects within the agent.

Init and cleanup perceptual memory stack at start.

Definition at line 4313 of file m_neuro.f90.

Here is the caller graph for this function:

◆ perception_objects_destroy()

elemental subroutine the_neurobio::perception_objects_destroy ( class(perception), intent(inout)  this,
logical, intent(in), optional  clean_memory 
)

Destroy and deallocate all perception objects.

Parameters
[in] clean_memory Logical flag to cleanup the perceptual memory stack.

Use the destroy method for all perception objects within the agent.

Init and cleanup perceptual memory stack if clean_memory is set to TRUE.

Definition at line 4335 of file m_neuro.f90.

◆ perception_predation_risk_objective()

elemental real(srp) function the_neurobio::perception_predation_risk_objective ( class(perception), intent(in)  this)

Calculate the risk of predation as being perceived / assessed by this agent.

Note
It can be placed either to PERCEPTION (which might seem more logical as it is basically the perception of predation risk by the agent) or APPRAISAL level class. Here it is in the APPRAISAL because it is actually used here. It may also be safer here as we need completed perception objects and perception memory stack to assess the objective predation risk.
Returns
assessment of the predation risk based on both the perception object and its linked perceptual memory component.

Notable parameters

WEIGHT_DIRECT is the relative weight given to the immediate perception of predators over the predators counts in the memory stack. Obtained from global parameters (commondata::predation_risk_weight_immediate).

MEM_WIND is the size of the memory window when assessing the predator risk, only this number of the latest elements from the memory stack is taken into account. So we further weight the direct threat over the background risk when making the decision.

Note
Note that we take into account the whole memory size (commondata::history_size_perception).

Implementation details

Here we analyse the predator perception object and memory stack to get the predation risk perception value, that will be fed into the sigmoid function via neuro_resp at the next step. Perception of the predation risk is the weighted sum of the total number of predators in the memory stack and the current count of predators within the visual range of the agent. The actual calculation is done by the common backend that is used for objective and subjective assessment of risk: the_neurobio::predation_risk_backend().

Definition at line 4372 of file m_neuro.f90.

Here is the call graph for this function:

◆ predation_risk_backend()

elemental real(srp) function the_neurobio::predation_risk_backend ( integer, intent(in)  pred_count,
real(srp), intent(in)  pred_memory_mean,
real(srp), intent(in), optional  weight_direct 
)

Simple computational backend for the risk of predation that is used in objective risk function the_neurobio::perception_predation_risk_objective() and the subjective risk function.

Parameters
[in] pred_count The number of predators in the current perception object. This is an estimate of the direct risk of predation $ r_{d} $.
[in] pred_memory_mean The mean number of predators in the memory window. The size of the memory window is not set here. It is an estimate of the indirect risk of predation $ r_{id} $.
[in] weight_direct An optional weighting factor for the immediate risk (the number of predators in the current perception object), $ \omega $. If not provided, the default value is set by the commondata::predation_risk_weight_immediate parameter.

Implementation details

First, check if the optional direct risk weighting factor ( $ \omega $ ) is provided as a dummy parameter. If not provided, use the default value that is set by the commondata::predation_risk_weight_immediate parameter.

Second, calculate the predation risk as a weighted sum of the direct risk (number of immediately perceived predators, $ r_{d} $) and indirect risk (average number of predators in the memory, $ r_{id} $):

\[ R = r_{d} \cdot \omega + r_{id} \cdot (1 - \omega) \]
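A minimal Python sketch of this weighted sum (the default weight value here is illustrative; the real default is the commondata::predation_risk_weight_immediate parameter and the model itself is Fortran):

```python
PREDATION_RISK_WEIGHT_IMMEDIATE = 0.7  # illustrative default, not the real parameter value

def predation_risk(pred_count, pred_memory_mean, weight_direct=None):
    """R = r_d * w + r_id * (1 - w): weighted sum of the direct risk
    (immediately perceived predators) and the indirect, memory-based risk."""
    if weight_direct is None:
        weight_direct = PREDATION_RISK_WEIGHT_IMMEDIATE
    return pred_count * weight_direct + pred_memory_mean * (1.0 - weight_direct)

print(predation_risk(2, 0.5))  # 2*0.7 + 0.5*0.3 = 1.55
```

Increasing weight_direct shifts the assessment toward the immediate threat; decreasing it emphasises the background risk remembered in the stack.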

Definition at line 4414 of file m_neuro.f90.

Here is the caller graph for this function:

◆ perception_components_attention_weights_init()

elemental subroutine the_neurobio::perception_components_attention_weights_init ( class(percept_components_motiv), intent(inout)  this,
real(srp), intent(in), optional  all_vals_fix,
logical, intent(in), optional  all_one,
real(srp), intent(in), optional  weight_light,
real(srp), intent(in), optional  weight_depth,
real(srp), intent(in), optional  weight_food_dir,
real(srp), intent(in), optional  weight_food_mem,
real(srp), intent(in), optional  weight_conspec,
real(srp), intent(in), optional  weight_pred_dir,
real(srp), intent(in), optional  weight_predator,
real(srp), intent(in), optional  weight_stomach,
real(srp), intent(in), optional  weight_bodymass,
real(srp), intent(in), optional  weight_energy,
real(srp), intent(in), optional  weight_age,
real(srp), intent(in), optional  weight_reprfac 
)

Initialise the attention components of the emotional state to their default parameter values. Attention sets weights to individual perceptual components when the overall weighted sum is calculated. The default weights are parameters defined in COMMONDATA.

Warning
The number and nature of the attention components is equal to the number of perceptual components, they agree 1 to 1.
Note
The perception weights (weight_ parameters) are not passed as an array to (a) allow for an elemental procedure and (b) allow disabling attention components at init when weights are not provided (i.e. set to zero).
The precedence order of the parameters is all_vals_fix, all_one and then the individual weight_ parameters, i.e. if all_vals_fix is provided, all others are ignored (see the return statements in the if-present-test blocks).
Parameters
[in] all_vals_fix Optional parameter setting all weights equal to a specific fixed value.
[in] all_one Optional logical parameter setting all weights to 1.0, so the perceptual components go in unchanged form (weight = 1) into the weighted sum (overall primary motivation value).
[in] weight_light, weight_depth, weight_food_dir, weight_food_mem, weight_conspec, weight_pred_dir, weight_predator, weight_stomach, weight_bodymass, weight_energy, weight_age, weight_reprfac Optional attention weights for the specific perception components.
Note
Any of these weights that is absent is set to zero.

Local copies of the optional parameters.

If all_vals_fix is set, set all weights to this fixed value.

Note
We do not have option to set all values to an array to be able to have this procedure elemental.

Return after setting values.

If all_one is present and set to TRUE, init all attention weights to 1.0

Return after setting values.

Set individual attention weights.

If neither all_vals_fix nor all_one is provided, the attention weights are set from the individual dummy parameters of this procedure (any absent weight defaults to zero).
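This precedence logic can be sketched in Python (a simplified stand-in for the elemental Fortran subroutine; the returned dictionary stands in for the attention components of the motivational state):

```python
def attention_weights_init(all_vals_fix=None, all_one=None, **weights):
    """Precedence order: all_vals_fix first, then all_one, then the
    individual weight_ parameters; absent individual weights become zero."""
    components = ("light", "depth", "food_dir", "food_mem", "conspec",
                  "pred_dir", "predator", "stomach", "bodymass",
                  "energy", "age", "reprfac")
    if all_vals_fix is not None:
        return {c: all_vals_fix for c in components}  # return after setting
    if all_one:
        return {c: 1.0 for c in components}           # return after setting
    # Neither shortcut given: use individual weights, defaulting to zero.
    return {c: weights.get("weight_" + c, 0.0) for c in components}

w = attention_weights_init(weight_light=0.5, weight_stomach=0.2)
print(w["light"], w["stomach"], w["depth"])  # 0.5 0.2 0.0
```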

Definition at line 4477 of file m_neuro.f90.

◆ perception_components_neuronal_response_init_set()

subroutine the_neurobio::perception_components_neuronal_response_init_set ( class(percept_components_motiv), intent(inout)  this,
class(appraisal), intent(inout)  this_agent,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_light,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_depth,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_food_dir,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_food_mem,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_conspec,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_pred_dir,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_predator,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_stomach,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_bodymass,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_energy,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_age,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_reprfac,
real(srp), intent(in), optional  param_gerror_cv_light,
real(srp), intent(in), optional  param_gerror_cv_depth,
real(srp), intent(in), optional  param_gerror_cv_food_dir,
real(srp), intent(in), optional  param_gerror_cv_food_mem,
real(srp), intent(in), optional  param_gerror_cv_conspec,
real(srp), intent(in), optional  param_gerror_cv_pred_dir,
real(srp), intent(in), optional  param_gerror_cv_predator,
real(srp), intent(in), optional  param_gerror_cv_stomach,
real(srp), intent(in), optional  param_gerror_cv_bodymass,
real(srp), intent(in), optional  param_gerror_cv_energy,
real(srp), intent(in), optional  param_gerror_cv_age,
real(srp), intent(in), optional  param_gerror_cv_reprfac,
character(len=*), intent(in), optional  param_gene_label_light,
character(len=*), intent(in), optional  param_gene_label_depth,
character(len=*), intent(in), optional  param_gene_label_food_dir,
character(len=*), intent(in), optional  param_gene_label_food_mem,
character(len=*), intent(in), optional  param_gene_label_conspec,
character(len=*), intent(in), optional  param_gene_label_pred_dir,
character(len=*), intent(in), optional  param_gene_label_predator,
character(len=*), intent(in), optional  param_gene_label_stomach,
character(len=*), intent(in), optional  param_gene_label_bodymass,
character(len=*), intent(in), optional  param_gene_label_energy,
character(len=*), intent(in), optional  param_gene_label_age,
character(len=*), intent(in), optional  param_gene_label_reprfac 
)

Set and calculate individual perceptual components for this motivational state using the neuronal response function, for this_agent.

Note
The this_agent has intent [inout], so it can be changed as a result of this procedure; gene labels are set for the genes involved in the neuronal response.
TODO: huge parameter list - ugly coding, try to fix.
This procedure uses labelled if constructs with an inline call of the neuronal response function neuro_resp; unlike this, the intent [in] procedure perception_components_neuronal_response_calculate uses inner subroutines.
Parameters
[in,out] this_agent The actor agent.
[in] param_gp_matrix_light Boolean G x P matrices: input structure of the fixed parameters that define the boolean genotype x phenotype matrices for each perceptual component of motivational state defined in commondata.
Warning
There should be exactly as many param_g_p_matrix parameters as perceptual components for this motivation (the_neurobio::percept_components_motiv).
The dimensionality of the parameter arrays must be exactly the same as in commondata. This is why assumed shape arrays (:,:) are not used here.
Parameters
[in]param_gp_matrix_lightboolean G x P matrix for light;
[in]param_gp_matrix_depthboolean G x P matrix for depth;
[in]param_gp_matrix_food_dirboolean G x P matrix for direct food
[in]param_gp_matrix_food_memboolean G x P matrix for number of food items in memory;
[in]param_gp_matrix_conspecboolean G x P matrix for number of conspecifics;
[in]param_gp_matrix_pred_dirboolean G x P matrix for direct predation risk;
[in]param_gp_matrix_predatorboolean G x P matrix for number of predators;
[in]param_gp_matrix_stomachboolean G x P matrix for stomach contents;
[in]param_gp_matrix_bodymassboolean G x P matrix for body mass;
[in]param_gp_matrix_energyboolean G x P matrix for energy reserves;
[in]param_gp_matrix_ageboolean G x P matrix for age;
[in]param_gp_matrix_reprfacboolean G x P matrix for reproductive factor.
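The role of a boolean G x P matrix can be illustrated with a minimal sketch. This is not the model code: all names here are hypothetical, the real matrices and the neuronal response machinery are defined in commondata and the_neurobio. It only shows the gating idea: a fixed-shape logical matrix selects which genes contribute to each perceptual component, and fixed (non-assumed-shape) dimensions let the compiler catch dimensionality mismatches.

```fortran
! Illustrative sketch only, NOT the AHA Model code. Shows how a fixed
! boolean genotype x phenotype (G x P) matrix can gate which genes
! contribute to each perceptual component. Names are hypothetical.
program gp_matrix_sketch
  implicit none
  integer, parameter :: srp = selected_real_kind(6)
  integer, parameter :: n_genes = 4, n_components = 2
  ! Fixed-shape matrix: assumed-shape (:,:) is deliberately avoided so
  ! that a wrong dimensionality is a compile-time error, as the Warning
  ! above explains for the real param_gp_matrix_* parameters.
  logical, dimension(n_genes, n_components) :: gp_matrix
  real(srp), dimension(n_genes) :: gene_values
  real(srp) :: component_signal
  integer :: j

  gp_matrix = reshape( [ .true., .false., .true., .false.,   &
                         .false., .true., .false., .true. ], &
                       [ n_genes, n_components ] )
  gene_values = [ 0.2_srp, 0.5_srp, 0.1_srp, 0.8_srp ]

  do j = 1, n_components
    ! Only genes flagged .true. in column j affect component j.
    component_signal = sum( gene_values, mask = gp_matrix(:, j) )
    print *, "component", j, "signal =", component_signal
  end do
end program gp_matrix_sketch
```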
[in] param_gerror_cv_light ### Coefficients of variation parameters ### Input structures that define the coefficient of variation for the Gaussian perception error parameters for each of the perceptual components. Normally they are also defined in commondata.
[in] param_gerror_cv_light coefficient of variation for light;
[in] param_gerror_cv_depth coefficient of variation for depth;
[in] param_gerror_cv_food_dir coefficient of variation for direct food;
[in] param_gerror_cv_food_mem coefficient of variation for number of food items in memory;
[in] param_gerror_cv_conspec coefficient of variation for number of conspecifics;
[in] param_gerror_cv_pred_dir coefficient of variation for direct predation risk;
[in] param_gerror_cv_predator coefficient of variation for number of predators;
[in] param_gerror_cv_stomach coefficient of variation for stomach contents;
[in] param_gerror_cv_bodymass coefficient of variation for body mass;
[in] param_gerror_cv_energy coefficient of variation for energy reserves;
[in] param_gerror_cv_age coefficient of variation for age;
[in] param_gerror_cv_reprfac coefficient of variation for reproductive factor.
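How such a coefficient of variation translates into a Gaussian perception error can be sketched as follows. This is an illustration under stated assumptions, not the model's procedure: the AHA Model uses its own random deviate routines, and the assumption here is simply that the error standard deviation scales as sigma = CV * true value. The Box-Muller transform stands in for the real Gaussian generator.

```fortran
! Illustrative sketch only, NOT the AHA Model code. Adds a Gaussian
! perception error to a "true" stimulus value given a coefficient of
! variation (CV), assuming sigma = CV * true_value. The Box-Muller
! transform here stands in for the model's own random deviate routines.
program perception_error_sketch
  implicit none
  integer, parameter :: srp = selected_real_kind(6)
  real(srp), parameter :: pi = 3.1415926_srp
  real(srp) :: true_value, cv, u1, u2, gauss, perceived

  true_value = 10.0_srp   ! "objective" stimulus value (e.g. light)
  cv = 0.1_srp            ! e.g. the value behind param_gerror_cv_light

  call random_number(u1)
  call random_number(u2)
  u1 = max( u1, tiny(1.0_srp) )   ! guard against log(0)
  gauss = sqrt( -2.0_srp * log(u1) ) * cos( 2.0_srp * pi * u2 )

  ! Perceived value ~ Normal(mean = true value, sd = CV * true value).
  perceived = true_value + gauss * ( cv * true_value )
  print *, "true =", true_value, " perceived =", perceived
end program perception_error_sketch
```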
[in] param_gene_label_light label for light;
[in] param_gene_label_depth label for depth;
[in] param_gene_label_food_dir label for direct food;
[in] param_gene_label_food_mem label for number of food items in memory;
[in] param_gene_label_conspec label for number of conspecifics;
[in] param_gene_label_pred_dir label for direct predation risk;
[in] param_gene_label_predator label for number of predators;
[in] param_gene_label_stomach label for stomach contents;
[in] param_gene_label_bodymass label for body mass;
[in] param_gene_label_energy label for energy reserves;
[in] param_gene_label_age label for age;
[in] param_gene_label_reprfac label for reproductive factor.
[in]param_gene_label_stomachparam_gene_label_light label for light;
[in]param_gene_label_depthlabel for depth;
[in]param_gene_label_food_dirlabel for direct food;
[in]param_gene_label_food_memlabel for number of food items in memory;
[in]param_gene_label_conspeclabel for number of conspecific;
[in]param_gene_label_pred_dirlabel for direct predation risk;
[in]param_gene_label_predatorlabel for number of predators;
[in]param_gene_label_stomachlabel for stomach contents;
[in]param_gene_label_bodymasslabel for body mass;
[in]param_gene_label_energylabel for energy reserves;
[in]param_gene_label_agelabel for age;
[in]param_gene_label_reprfaclabel for reproductive factor;
[in]param_gene_label_bodymassparam_gene_label_light label for light;
[in]param_gene_label_depthlabel for depth;
[in]param_gene_label_food_dirlabel for direct food;
[in]param_gene_label_food_memlabel for number of food items in memory;
[in]param_gene_label_conspeclabel for number of conspecific;
[in]param_gene_label_pred_dirlabel for direct predation risk;
[in]param_gene_label_predatorlabel for number of predators;
[in]param_gene_label_stomachlabel for stomach contents;
[in]param_gene_label_bodymasslabel for body mass;
[in]param_gene_label_energylabel for energy reserves;
[in]param_gene_label_agelabel for age;
[in]param_gene_label_reprfaclabel for reproductive factor;
[in]param_gene_label_energyparam_gene_label_light label for light;
[in]param_gene_label_depthlabel for depth;
[in]param_gene_label_food_dirlabel for direct food;
[in]param_gene_label_food_memlabel for number of food items in memory;
[in]param_gene_label_conspeclabel for number of conspecific;
[in]param_gene_label_pred_dirlabel for direct predation risk;
[in]param_gene_label_predatorlabel for number of predators;
[in]param_gene_label_stomachlabel for stomach contents;
[in]param_gene_label_bodymasslabel for body mass;
[in]param_gene_label_energylabel for energy reserves;
[in]param_gene_label_agelabel for age;
[in]param_gene_label_reprfaclabel for reproductive factor;
[in]param_gene_label_ageparam_gene_label_light label for light;
[in]param_gene_label_depthlabel for depth;
[in]param_gene_label_food_dirlabel for direct food;
[in]param_gene_label_food_memlabel for number of food items in memory;
[in]param_gene_label_conspeclabel for number of conspecific;
[in]param_gene_label_pred_dirlabel for direct predation risk;
[in]param_gene_label_predatorlabel for number of predators;
[in]param_gene_label_stomachlabel for stomach contents;
[in]param_gene_label_bodymasslabel for body mass;
[in]param_gene_label_energylabel for energy reserves;
[in]param_gene_label_agelabel for age;
[in]param_gene_label_reprfaclabel for reproductive factor;
[in]param_gene_label_reprfacparam_gene_label_light label for light;
[in]param_gene_label_depthlabel for depth;
[in]param_gene_label_food_dirlabel for direct food;
[in]param_gene_label_food_memlabel for number of food items in memory;
[in]param_gene_label_conspeclabel for number of conspecific;
[in]param_gene_label_pred_dirlabel for direct predation risk;
[in]param_gene_label_predatorlabel for number of predators;
[in]param_gene_label_stomachlabel for stomach contents;
[in]param_gene_label_bodymasslabel for body mass;
[in]param_gene_label_energylabel for energy reserves;
[in]param_gene_label_agelabel for age;
[in]param_gene_label_reprfaclabel for reproductive factor;

Local copies of genotype x phenotype boolean matrices.

Local copies of the Gaussian perception error coefficients of variation.

Implementation notes

We check the input boolean G x P matrices and calculate the perceptual components of this motivation state only when the corresponding boolean matrix is provided as a parameter. We also check the corresponding variance/CV parameter and reset it to deterministic (zero variance) if it is not provided as a dummy parameter.

Warning
There should be exactly as many param_g_p_matrix and param_gerror_cv parameters as there are perceptual components for this motivation (the_neurobio::percept_components_motiv).
  • calculate the perceptual component for light.
Note
The function is almost the same as in the_neurobio::appraisal, but it does set the label, so this_agent has intent [inout].
  • calculate the perceptual component for depth.
  • calculate the perceptual component for food_dir.
  • calculate the perceptual component for food_mem.
  • calculate the perceptual component for conspec.
  • calculate the perceptual component for direct predation.
  • calculate the perceptual component for predator.
  • calculate the perceptual component for stomach.
  • calculate the perceptual component for bodymass.
  • calculate the perceptual component for energy.
  • calculate the perceptual component for age.
  • calculate the perceptual component for reproduct. factor.
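The guard logic described in the implementation notes (compute a component only when its G x P matrix is supplied, and fall back to zero variance when the CV parameter is absent) can be sketched as a minimal, self-contained Fortran program. All names, kinds, and array dimensions here are illustrative, not the module's actual code:

```fortran
! Hypothetical sketch of the optional-argument guard pattern: a
! perceptual component is computed only when its G x P matrix is
! present, and the CV defaults to zero (deterministic) when omitted.
program gp_guard_sketch
  implicit none
  integer, parameter :: SRP = kind(1.0)   ! placeholder precision kind
  logical :: gp_light(2,2)
  gp_light = .true.
  call component(param_gp_matrix=gp_light)                        ! CV omitted: deterministic
  call component(param_gp_matrix=gp_light, param_gerror_cv=0.1_SRP)
contains
  subroutine component(param_gp_matrix, param_gerror_cv)
    logical,   intent(in), optional :: param_gp_matrix(:,:)
    real(SRP), intent(in), optional :: param_gerror_cv
    real(SRP) :: cv
    if (.not. present(param_gp_matrix)) return   ! matrix absent: skip this component
    if (present(param_gerror_cv)) then
      cv = param_gerror_cv
    else
      cv = 0.0_SRP                               ! reset to zero variance
    end if
    print *, "component computed, cv =", cv
  end subroutine component
end program gp_guard_sketch
```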

Definition at line 4668 of file m_neuro.f90.

◆ perception_components_neuronal_response_calculate()

subroutine the_neurobio::perception_components_neuronal_response_calculate ( class(percept_components_motiv), intent(inout)  this,
class(appraisal), intent(in)  this_agent,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_light,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_depth,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_food_dir,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_food_mem,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_conspec,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_pred_dir,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_predator,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_stomach,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_bodymass,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_energy,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_age,
logical, dimension(max_nalleles,n_chromosomes), intent(in), optional  param_gp_matrix_reprfac,
real(srp), intent(in), optional  param_gerror_cv_light,
real(srp), intent(in), optional  param_gerror_cv_depth,
real(srp), intent(in), optional  param_gerror_cv_food_dir,
real(srp), intent(in), optional  param_gerror_cv_food_mem,
real(srp), intent(in), optional  param_gerror_cv_conspec,
real(srp), intent(in), optional  param_gerror_cv_pred_dir,
real(srp), intent(in), optional  param_gerror_cv_predator,
real(srp), intent(in), optional  param_gerror_cv_stomach,
real(srp), intent(in), optional  param_gerror_cv_bodymass,
real(srp), intent(in), optional  param_gerror_cv_energy,
real(srp), intent(in), optional  param_gerror_cv_age,
real(srp), intent(in), optional  param_gerror_cv_reprfac,
real(srp), intent(in), optional  perception_override_light,
real(srp), intent(in), optional  perception_override_depth,
real(srp), intent(in), optional  perception_override_food_dir,
real(srp), intent(in), optional  perception_override_food_mem,
real(srp), intent(in), optional  perception_override_conspec,
real(srp), intent(in), optional  perception_override_pred_dir,
real(srp), intent(in), optional  perception_override_predator,
real(srp), intent(in), optional  perception_override_stomach,
real(srp), intent(in), optional  perception_override_bodymass,
real(srp), intent(in), optional  perception_override_energy,
real(srp), intent(in), optional  perception_override_age,
real(srp), intent(in), optional  perception_override_reprfac 
)

Calculate the individual perceptual components for this motivational state using the neuronal response function, for the actor agent this_agent.

Note
The this_agent has intent [in], so is unchanged as a result of this procedure. Unlike the above intent [inout] procedure, this accepts optional perception parameters that override those stored in this_agent data structure. This is done for calculating representation expectancies from possible behaviour.
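As an illustration of the override mechanism, here is a hedged call sketch for evaluating the expectancy from a prospective behaviour. The variable names (expectancy, agent, stomach_increment) and the stomach_content method are hypothetical, not taken from the module:

```fortran
! Hypothetical sketch: evaluate the perceptual components that would
! follow from eating a food item by overriding only the stomach
! perception. All other perception values come from the unchanged
! this_agent; G x P matrices and CVs fall back to commondata defaults.
call expectancy%perception_components_neuronal_response_calculate(        &
       this_agent = agent,                                                &
       perception_override_stomach = agent%stomach_content()              &
                                     + stomach_increment )
```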
Parameters
[in] this_agent The actor agent.
### Boolean G x P matrices ###
Input structure of the fixed parameters that define the boolean genotype x phenotype matrices for each perceptual component of motivational state defined in commondata.
Warning
There should be exactly as many param_g_p_matrix parameters as perceptual components for this motivation (the_neurobio::percept_components_motiv).
The dimensionality of the parameter arrays must be exactly the same as in commondata. This is why assumed shape arrays (:,:) are not used here.
Parameters
[in] param_gp_matrix_light boolean G x P matrix for light;
[in] param_gp_matrix_depth boolean G x P matrix for depth;
[in] param_gp_matrix_food_dir boolean G x P matrix for direct food;
[in] param_gp_matrix_food_mem boolean G x P matrix for number of food items in memory;
[in] param_gp_matrix_conspec boolean G x P matrix for number of conspecifics;
[in] param_gp_matrix_pred_dir boolean G x P matrix for direct predation risk;
[in] param_gp_matrix_predator boolean G x P matrix for number of predators;
[in] param_gp_matrix_stomach boolean G x P matrix for stomach contents;
[in] param_gp_matrix_bodymass boolean G x P matrix for body mass;
[in] param_gp_matrix_energy boolean G x P matrix for energy reserves;
[in] param_gp_matrix_age boolean G x P matrix for age;
[in] param_gp_matrix_reprfac boolean G x P matrix for reproductive factor.
### Coefficients of variation parameters ###
Input structures that define the coefficient of variation for the Gaussian perception error parameters for each of the perceptual components. Normally they are also defined in commondata.
Parameters
[in] param_gerror_cv_light coefficient of variation for light;
[in] param_gerror_cv_depth coefficient of variation for depth;
[in] param_gerror_cv_food_dir coefficient of variation for direct food;
[in] param_gerror_cv_food_mem coefficient of variation for number of food items in memory;
[in] param_gerror_cv_conspec coefficient of variation for number of conspecifics;
[in] param_gerror_cv_pred_dir coefficient of variation for direct predation risk;
[in] param_gerror_cv_predator coefficient of variation for number of predators;
[in] param_gerror_cv_stomach coefficient of variation for stomach contents;
[in] param_gerror_cv_bodymass coefficient of variation for body mass;
[in] param_gerror_cv_energy coefficient of variation for energy reserves;
[in] param_gerror_cv_age coefficient of variation for age;
[in] param_gerror_cv_reprfac coefficient of variation for reproductive factor.
[in]param_gerror_cv_depth### Coefficients of variation parameters ### Input structures that define the coefficient of variation for the Gaussian perception error parameters for each of the perceptual components. Normally they are also defined in commondata.
[in]param_gerror_cv_lightcoefficient of variation for light;
[in]param_gerror_cv_depthcoefficient of variation for depth;
[in]param_gerror_cv_food_dircoefficient of variation for direct food;
[in]param_gerror_cv_food_memcoefficient of variation for number of food items in memory;
[in]param_gerror_cv_conspeccoefficient of variation for number of conspecific;
[in]param_gerror_cv_pred_dircoefficient of variation for direct predation risk;
[in]param_gerror_cv_predatorcoefficient of variation for number of predators;
[in]param_gerror_cv_stomachcoefficient of variation for stomach contents;
[in]param_gerror_cv_bodymasscoefficient of variation for body mass;
[in]param_gerror_cv_energycoefficient of variation for energy reserves;
[in]param_gerror_cv_agecoefficient of variation for age;
[in]param_gerror_cv_reprfaccoefficient of variation for reproductive factor.
[in]param_gerror_cv_food_dir### Coefficients of variation parameters ### Input structures that define the coefficient of variation for the Gaussian perception error parameters for each of the perceptual components. Normally they are also defined in commondata.
[in]param_gerror_cv_lightcoefficient of variation for light;
[in]param_gerror_cv_depthcoefficient of variation for depth;
[in]param_gerror_cv_food_dircoefficient of variation for direct food;
[in]param_gerror_cv_food_memcoefficient of variation for number of food items in memory;
[in]param_gerror_cv_conspeccoefficient of variation for number of conspecific;
[in]param_gerror_cv_pred_dircoefficient of variation for direct predation risk;
[in]param_gerror_cv_predatorcoefficient of variation for number of predators;
[in]param_gerror_cv_stomachcoefficient of variation for stomach contents;
[in]param_gerror_cv_bodymasscoefficient of variation for body mass;
[in]param_gerror_cv_energycoefficient of variation for energy reserves;
[in]param_gerror_cv_agecoefficient of variation for age;
[in]param_gerror_cv_reprfaccoefficient of variation for reproductive factor.
[in]param_gerror_cv_food_mem### Coefficients of variation parameters ### Input structures that define the coefficient of variation for the Gaussian perception error parameters for each of the perceptual components. Normally they are also defined in commondata.
[in]param_gerror_cv_lightcoefficient of variation for light;
[in]param_gerror_cv_depthcoefficient of variation for depth;
[in]param_gerror_cv_food_dircoefficient of variation for direct food;
[in]param_gerror_cv_food_memcoefficient of variation for number of food items in memory;
[in]param_gerror_cv_conspeccoefficient of variation for number of conspecific;
[in]param_gerror_cv_pred_dircoefficient of variation for direct predation risk;
[in]param_gerror_cv_predatorcoefficient of variation for number of predators;
[in]param_gerror_cv_stomachcoefficient of variation for stomach contents;
[in]param_gerror_cv_bodymasscoefficient of variation for body mass;
[in]param_gerror_cv_energycoefficient of variation for energy reserves;
[in]param_gerror_cv_agecoefficient of variation for age;
[in]param_gerror_cv_reprfaccoefficient of variation for reproductive factor.
### Perception overrides (fake perceptions) ###
Optional parameters that override the perception values of this_agent before they are passed through the neuronal response function. This makes it possible to pass arbitrary perception values to the neuronal response in order to assess the expectancies of different behaviours for this agent.
Warning
The data types of the perception override values must agree with the init_val parameter of the neuronal response function, i.e. be real. However, the accessor (get) functions of the respective perception objects (PERCEPT_) sometimes have integer type. In such cases, use an inline real conversion function when calling this procedure.
Parameters
[in] perception_override_light: perception override for light;
[in] perception_override_depth: perception override for depth;
[in] perception_override_food_dir: perception override for direct food (convert from integer);
[in] perception_override_food_mem: perception override for the number of food items in memory;
[in] perception_override_conspec: perception override for the number of conspecifics (convert from integer);
[in] perception_override_pred_dir: perception override for direct predation risk;
[in] perception_override_predator: perception override for the number of predators;
[in] perception_override_stomach: perception override for stomach contents;
[in] perception_override_bodymass: perception override for body mass;
[in] perception_override_energy: perception override for energy reserves;
[in] perception_override_age: perception override for age (convert from integer);
[in] perception_override_reprfac: perception override for reproductive factor.
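The warning above, that integer-valued perception accessors must be converted to real inline, can be illustrated with a minimal standalone sketch. The variable names and the kind parameter SRP here follow the naming pattern of the module but are illustrative only, not the exact API:

```fortran
program demo_override_conversion
  implicit none
  ! SRP mimics the model's standard real precision kind (assumption).
  integer, parameter :: SRP = selected_real_kind(6)
  integer   :: food_items_count
  real(SRP) :: perception_override_food_dir
  ! Suppose an integer "get" accessor of the food perception object
  ! returned this count:
  food_items_count = 3
  ! Inline conversion to the real kind expected by init_val:
  perception_override_food_dir = real(food_items_count, SRP)
  print *, perception_override_food_dir
end program demo_override_conversion
```

The same inline `real(..., SRP)` conversion applies to the conspecific count and age overrides, which are also backed by integer accessors.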

Implementation notes

The input boolean G x P matrices are checked, and the perceptual component of this motivation state is calculated only when the corresponding boolean matrix is provided as a parameter. The corresponding variance/CV parameter is also checked, and is reset to deterministic (zero variance) if it is not provided as a dummy parameter.

Warning
There should be exactly as many param_g_p_matrix_ and param_gerror_cv_ parameters as perceptual components for this motivation (the_neurobio::percept_components_motiv).
  • calculate the perceptual component for light.
  • calculate the perceptual component for depth.
  • calculate the perceptual component for food_dir.
  • calculate the perceptual component for food_mem.
  • calculate the perceptual component for conspec.
  • calculate the perceptual component for direct predation.
  • calculate the perceptual component for predator.
  • calculate the perceptual component for stomach.
  • calculate the perceptual component for bodymass.
  • calculate the perceptual component for energy.
  • calculate the perceptual component for age.
  • calculate the perceptual component for reproductive factor.
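The optional-argument guard described in the implementation notes can be sketched as follows for a single component. This is a minimal illustration with hypothetical names (demo_guard, percept_light), not the actual module code:

```fortran
! Sketch of the per-component guard: the component is computed only if
! its G x P matrix is present, and the CV defaults to zero (deterministic)
! if the CV dummy parameter is absent.
module demo_guard
  implicit none
  integer, parameter :: SRP = selected_real_kind(6)  ! assumed real kind
contains
  subroutine percept_light(param_gp_matrix_light, param_gerror_cv_light)
    logical,   optional, intent(in) :: param_gp_matrix_light(:,:)
    real(SRP), optional, intent(in) :: param_gerror_cv_light
    real(SRP) :: cv_light
    ! Skip this perceptual component entirely if no G x P matrix given.
    if (.not. present(param_gp_matrix_light)) return
    if (present(param_gerror_cv_light)) then
      cv_light = param_gerror_cv_light
    else
      cv_light = 0.0_SRP   ! no CV provided: deterministic, zero variance
    end if
    ! ... compute the perceptual component using cv_light ...
  end subroutine percept_light
end module demo_guard
```

The same pattern repeats for each of the twelve perceptual components listed above.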

Definition at line 5233 of file m_neuro.f90.

◆ state_motivation_light_get()

elemental real(srp) function the_neurobio::state_motivation_light_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal light effect component.

Returns
function value: light

Definition at line 5753 of file m_neuro.f90.

◆ state_motivation_depth_get()

elemental real(srp) function the_neurobio::state_motivation_depth_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal depth effect component.

Returns
function value: depth

Definition at line 5765 of file m_neuro.f90.

◆ state_motivation_food_dir_get()

elemental real(srp) function the_neurobio::state_motivation_food_dir_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal directly seen food effect component.

Returns
function value: food

Definition at line 5777 of file m_neuro.f90.

◆ state_motivation_food_mem_get()

elemental real(srp) function the_neurobio::state_motivation_food_mem_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal food items from past memory effect component.

Returns
function value: food

Definition at line 5789 of file m_neuro.f90.

◆ state_motivation_conspec_get()

elemental real(srp) function the_neurobio::state_motivation_conspec_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal conspecifics effect component.

Returns
function value: conspecifics

Definition at line 5801 of file m_neuro.f90.

◆ state_motivation_pred_dir_get()

elemental real(srp) function the_neurobio::state_motivation_pred_dir_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal direct predation effect component.

Returns
function value: direct predation risk

Definition at line 5813 of file m_neuro.f90.

◆ state_motivation_predator_get()

elemental real(srp) function the_neurobio::state_motivation_predator_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal predators effect component.

Returns
function value: predators

Definition at line 5825 of file m_neuro.f90.

◆ state_motivation_stomach_get()

elemental real(srp) function the_neurobio::state_motivation_stomach_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal stomach effect component.

Returns
function value: stomach

Definition at line 5837 of file m_neuro.f90.

◆ state_motivation_bodymass_get()

elemental real(srp) function the_neurobio::state_motivation_bodymass_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal body mass effect component.

Returns
function value: body mass

Definition at line 5849 of file m_neuro.f90.

◆ state_motivation_energy_get()

elemental real(srp) function the_neurobio::state_motivation_energy_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal energy reserves effect component.

Returns
function value: energy reserves

Definition at line 5861 of file m_neuro.f90.

◆ state_motivation_age_get()

elemental real(srp) function the_neurobio::state_motivation_age_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal age effect component.

Returns
function value: age

Definition at line 5873 of file m_neuro.f90.

◆ state_motivation_reprfac_get()

elemental real(srp) function the_neurobio::state_motivation_reprfac_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the state neuronal reproductive factor effect component.

Returns
function value: reproductive factor

Definition at line 5885 of file m_neuro.f90.

◆ state_motivation_motivation_prim_get()

elemental real(srp) function the_neurobio::state_motivation_motivation_prim_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the root state, get the overall primary motivation value (before modulation).

Returns
function value: primary motivation

Definition at line 5897 of file m_neuro.f90.

◆ state_motivation_motivation_get()

elemental real(srp) function the_neurobio::state_motivation_motivation_get ( class(state_motivation_base), intent(in)  this)

Standard "get" function for the root state, get the overall final motivation value (after modulation).

Returns
function value: final motivation

Definition at line 5909 of file m_neuro.f90.

◆ state_motivation_is_dominant_get()

elemental logical function the_neurobio::state_motivation_is_dominant_get ( class(state_motivation_base), intent(in)  this)

Check if the root state is the dominant state in GOS.

Returns
TRUE if this motivational state is dominant in the GOS, and FALSE otherwise.

Definition at line 5920 of file m_neuro.f90.

◆ state_motivation_fixed_label_get()

elemental character(len=label_length) function the_neurobio::state_motivation_fixed_label_get ( class(state_motivation_base), intent(in)  this)

Get the fixed label for this motivational state. Note that the label is fixed and cannot be changed.

Returns
Returns the fixed label for this motivation state.

Definition at line 5934 of file m_neuro.f90.

◆ state_motivation_attention_weights_transfer()

pure subroutine the_neurobio::state_motivation_attention_weights_transfer ( class(state_motivation_base), intent(inout)  this,
class(state_motivation_base), intent(in)  copy_from 
)

Transfer attention weights between two motivation state components. The main use of this subroutine would be to transfer attention from the actor agent's main motivation's attention component to the behaviour's GOS expectancy object.

Note
Note that the procedure behaviour_root_attention_weights_transfer, which performs this transfer, does not use this procedure and transfers the objects directly.

Definition at line 5952 of file m_neuro.f90.

◆ perception_component_maxval()

elemental real(srp) function the_neurobio::perception_component_maxval ( class(percept_components_motiv), intent(in)  this)

Calculate the maximum value over all the perceptual components.

Returns
the maximum value among all the perceptual components.

Definition at line 5976 of file m_neuro.f90.

◆ state_motivation_percept_maxval()

elemental real(srp) function the_neurobio::state_motivation_percept_maxval ( class(state_motivation_base), intent(in)  this)

Calculate the maximum value over all the perceptual components of this motivational state component.

Note
Used in motivation_primary_calc procedure.
Returns
the maximum value among all the perceptual components.

Definition at line 6002 of file m_neuro.f90.

◆ state_motivation_calculate_prim()

elemental real(srp) function the_neurobio::state_motivation_calculate_prim ( class(state_motivation_base), intent(in)  this,
real(srp), intent(in), optional  maxvalue 
)

Calculate the level of primary motivation for this specific emotional state component.

Note
Used in motivation_primary_calc procedure.
Parameters
[in] maxvalue The maximum value across all appraisal perception components, needed to standardise and rescale the latter to the range 0:1 before they are summed up.
Returns
The value of the primary motivation for this motivation component.

Local parameters defining 0.0 and 1.0 for rescale.

Local copy of optional maxvalue.

Normally we rescale all values within the perceptual motivation components coming from the appraisal level into a [0..1] range within the agent, so that they are comparable across the motivations. To do this we need the maximum perception value over all perception objects: maxvalue. Normally maxvalue is an input parameter taking account of all motivational state components. But if it is not provided, we calculate a local maximum for this motivational component only.

Calculate the primary motivation for this motivational state by summing up (averaging) all the perceptual components for this motivation weighted by their respective attention weights; components are *rescaled* from the potential global range 0:maxvalue to the range 0:1.

Note
maxvalue should normally be the maximum value for all available motivation states, not just this. TODO: make maxvalue a structure reflecting motivational components.
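The rescale-and-sum logic above can be sketched in Python (an illustrative sketch only, not the Fortran implementation; the component values and attention weights below are hypothetical):

```python
# Sketch of the primary motivation calculation: each perceptual
# component is rescaled from the range 0:maxvalue to 0:1, then
# summed with its attention weight.

def primary_motivation(components, weights, maxvalue=None):
    """Weighted sum of perceptual components rescaled to [0, 1]."""
    # If no global maximum is supplied, fall back to the local
    # maximum over this motivational component only.
    if maxvalue is None:
        maxvalue = max(components)
    return sum(w * (c / maxvalue) for c, w in zip(components, weights))

# Hypothetical perception values and attention weights:
components = [0.2, 0.8, 0.4]
weights = [0.5, 0.3, 0.2]
print(primary_motivation(components, weights, maxvalue=0.8))
```

Note that when maxvalue is omitted the local fallback makes motivations incomparable across states, which is why the documentation recommends passing the global maximum.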

Definition at line 6016 of file m_neuro.f90.

◆ perception_component_motivation_init_zero()

elemental subroutine the_neurobio::perception_component_motivation_init_zero ( class(percept_components_motiv), intent(inout)  this)

Initialise perception components for a motivation state object.

Definition at line 6099 of file m_neuro.f90.

◆ state_hunger_zero()

elemental subroutine the_neurobio::state_hunger_zero ( class(state_hunger), intent(inout)  this)

Init and cleanup hunger motivation object. The only difference from the base root STATE_MOTIVATION_BASE is that it sets a unique label.

Definition at line 6120 of file m_neuro.f90.

◆ state_fear_defence_zero()

elemental subroutine the_neurobio::state_fear_defence_zero ( class(state_fear_defence), intent(inout)  this)

Init and cleanup fear state motivation object. The only difference from the base root STATE_MOTIVATION_BASE is that it sets a unique label.

Definition at line 6152 of file m_neuro.f90.

◆ state_reproduce_zero()

elemental subroutine the_neurobio::state_reproduce_zero ( class(state_reproduce), intent(inout)  this)

Init and cleanup reproductive motivation object. The only difference from the base root STATE_MOTIVATION_BASE is that it sets a unique label.

Definition at line 6184 of file m_neuro.f90.

◆ motivation_init_all_zero()

elemental subroutine the_neurobio::motivation_init_all_zero ( class(motivation), intent(inout)  this)

Init the expectancy components to a zero state.

Expectancy components.

Also set the private and fixed "number of motivational states" constant; we obviously have 3 motivations.

Definition at line 6214 of file m_neuro.f90.

◆ motivation_reset_gos_indicators()

elemental subroutine the_neurobio::motivation_reset_gos_indicators ( class(motivation), intent(inout)  this)

Reset all GOS indicators for this motivation object.

Reset dominant status to FALSE for all motivational states.

Definition at line 6230 of file m_neuro.f90.

◆ motivation_max_perception_calc()

elemental real(srp) function the_neurobio::motivation_max_perception_calc ( class(motivation), intent(in)  this)

Calculate maximum value of the perception components across all motivations.

Returns
Returns the maximum value of the perception components across all motivations.

Definition at line 6243 of file m_neuro.f90.

◆ motivation_return_final_as_vector()

pure real(srp) function, dimension(:), allocatable the_neurobio::motivation_return_final_as_vector ( class(motivation), intent(in)  this)

Return the vector of final motivation values for all motivational state components.

Definition at line 6260 of file m_neuro.f90.

◆ motivation_maximum_value_motivation_finl()

elemental real(srp) function the_neurobio::motivation_maximum_value_motivation_finl ( class(motivation), intent(in)  this)

Calculate the maximum value of the final motivations across all motivational state components.

Parameters
[in] this self.
Returns
Maximum final motivation.

An equivalent "manual" form not using the finals function:

      maxvalue = maxval( [ this%hunger%motivation_finl, this%fear_defence%motivation_finl, this%reproduction%motivation_finl ] )

Definition at line 6274 of file m_neuro.f90.

◆ motivation_val_is_maximum_value_motivation_finl()

elemental logical function the_neurobio::motivation_val_is_maximum_value_motivation_finl ( class(motivation), intent(in)  this,
real(srp), intent(in)  test_value 
)

Checks if the test value is the maximum final motivation value across all motivational state components.

Note
This is a scalar form, inputs a single scalar value for testing.

Definition at line 6290 of file m_neuro.f90.

◆ motivation_val_is_maximum_value_motivation_finl_o()

elemental logical function the_neurobio::motivation_val_is_maximum_value_motivation_finl_o ( class(motivation), intent(in)  this,
class(state_motivation_base), intent(in)  test_motivation 
)

Checks if the test value is the maximum final motivation value across all motivational state components.

Note
This is object form, inputs a whole motivation state object

Definition at line 6309 of file m_neuro.f90.

◆ motivation_primary_sum_components()

elemental subroutine the_neurobio::motivation_primary_sum_components ( class(motivation), intent(inout)  this,
real(srp), intent(in), optional  max_val 
)

Calculate the primary motivations from motivation-specific perception appraisal components. The primary motivations are motivation values before the modulation takes place.

Parameters
[in] max_val optional parameter that sets the maximum perception value for rescaling all perceptions to a common currency.
Note
Needed to standardise and rescale the appraisal perception components to the range 0:1 before they are summed up.

Implementation notes

  • Rescale all values within the perceptual motivation components coming from the appraisal level into a [0..1] range within the agent, so that they are comparable across the motivations. To do this we need the maximum perception value over all perception objects: appmaxval.

If the maximum rescale perception is provided as a parameter, use it.

  • If the parameter value is not provided, calculate maximum rescale perceptions from the currently available perception components of each specific motivational state.

Definition at line 6328 of file m_neuro.f90.

◆ motivation_modulation_absent()

elemental subroutine the_neurobio::motivation_modulation_absent ( class(motivation), intent(inout)  this)

Produce modulation of the primary motivations, that result in the final motivation values (_finl). In this subroutine, modulation is absent, so the final motivation values are equal to the primary motivations.

Here the final motivations are just equal to their primary values

Definition at line 6385 of file m_neuro.f90.

◆ appraisal_init_zero_cleanup_all()

elemental subroutine, private the_neurobio::appraisal_init_zero_cleanup_all ( class(appraisal), intent(inout)  this)
private

Initialise and cleanup all appraisal object components and sub-objects.

Init and clean all motivational components.

Also cleanup the emotional memory stack.

Definition at line 6448 of file m_neuro.f90.

Here is the caller graph for this function:

◆ appraisal_agent_set_dead()

elemental subroutine the_neurobio::appraisal_agent_set_dead ( class(appraisal), intent(inout)  this)

Set the individual to be dead. Note that this function does not deallocate the individual agent object, this may be a separate destructor function.

The dies method is implemented at the following levels of the agent object hierarchy (upper overrides the lower level):

Note
This method overrides the the_genome::individual_genome::dies() method, nullifying all reproductive, neurobiological and behavioural objects.
The dies method is implemented at the gos_global level to allow "cleaning" of all neurobiological objects when dies is called while performing the behaviours upwards in the object hierarchy.
  • Set the agent "dead";
  • emptify reproduction objects;
  • emptify all neurobiological objects.

Definition at line 6477 of file m_neuro.f90.

◆ appraisal_perceptual_comps_motiv_neur_response_calculate()

subroutine the_neurobio::appraisal_perceptual_comps_motiv_neur_response_calculate ( class(appraisal), intent(inout)  this)

Get the perceptual components of all motivational states by passing perceptions via the neuronal response function.

Warning
Here we use an intent(inout) procedure that does change the actor agent: it sets the labels for the genes. This procedure is therefore used only for initialisation and not in prediction.

Implementation notes

Call the neuronal response initialisation procedure the_neurobio::percept_components_motiv::motivation_components_init() for all the motivation states:

Definition at line 6493 of file m_neuro.f90.

◆ appraisal_primary_motivations_calculate()

elemental subroutine the_neurobio::appraisal_primary_motivations_calculate ( class(appraisal), intent(inout)  this,
real(srp), intent(in), optional  rescale_max_motivation 
)

Calculate primary motivations from perceptual components of each motivation state.

Parameters
[in] rescale_max_motivation maximum motivation value for rescaling all motivational components for comparison across all motivation and perceptual components and behaviour units.

Implementation notes

  • Check if the maximum motivation value for rescale is provided as a parameter.

Check if global maximum motivation across all behaviours and perceptual components is provided for rescaling.

  • If not, use local maximum value for this behaviour only.

Finally, the primary motivation values are calculated using the the_neurobio::motivation::motivation_primary_calc() method.

Definition at line 6644 of file m_neuro.f90.

◆ appraisal_motivation_modulation_non_genetic()

subroutine the_neurobio::appraisal_motivation_modulation_non_genetic ( class(appraisal), intent(inout)  this,
logical, intent(in), optional  no_modulation 
)

Produce modulation of the primary motivations, that result in the final motivation values (_finl). Modulation here is non-genetic and involves a fixed transformation of the primary motivation values.

Parameters
[in] no_modulation chooses if genetic modulation is calculated at all; if set to TRUE, then genetic modulation is not calculated and the final motivational values are just equal to the primary motivations.

Notable variables and parameters

AGE_ARRAY_ORDINATE is the interpolation grid ordinate. Its first and last values are set as 0.0 and 1.0, and the middle is defined by the parameter commondata::reprod_modulation_devel_w2.

      htintrpl.exe [ 7000, 8555, 11666 ] [ 0, 0.10, 1.0 ]

Implementation notes

First, initialise the final motivation values from the no modulation method the_neurobio::motivation::modulation_none().

Then check if developmental or genetic (or any other) modulation is disabled by the parameters commondata::modulation_appraisal_disable_all.

Check if the no_genetic_modulation parameter is set to TRUE and if yes, return without further processing.

Developmental modulation of reproductive factor

Reproductive factor the_hormones::hormones::reproductive_factor() is accumulated by the sex hormone level whenever the agent is growing. Such accumulation can increase motivation for reproduction. However, reproduction is not possible in young and small agents. Therefore, this procedure implements a developmental modulation of the reproductive factor: reproductive motivation the_neurobio::state_reproduce is weighted out while the agent does not reach a target body length and age. This weighting is defined by nonlinear interpolation using the abscissa array AGE_ARRAY_ABSCISSA and ordinate AGE_ARRAY_ORDINATE. Such weighting, thus, allows non-zero reproductive motivation only when the agent reaches the age exceeding the first abscissa value AGE_ARRAY_ABSCISSA, age > L/2, as here the weighting factor exceeds zero. Furthermore, when the age of the agent exceeds the last value of AGE_ARRAY_ABSCISSA, the weighting factor is equal to 1.0, so reproductive motivation is not limited any more.

Interpolation plots can be saved in the debug mode using this plotting command: commondata::debug_interpolate_plot_save().
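The developmental weighting described above can be sketched in Python. This is an illustrative sketch only: the model uses nonlinear interpolation over the grid, while plain linear interpolation stands in for it here, and the grid values are taken from the htintrpl.exe example above, not from the actual parameter set:

```python
# Sketch of the developmental weighting of reproductive motivation:
# the weighting factor grows from 0 to 1 as the agent's age moves
# across the interpolation grid, so young agents have their
# reproductive motivation weighted out.

AGE_ARRAY_ABSCISSA = [7000.0, 8555.0, 11666.0]   # grid abscissa (ages)
AGE_ARRAY_ORDINATE = [0.0, 0.10, 1.0]            # grid ordinate (weights)

def reproduction_weight(age):
    """Weighting factor in [0, 1] applied to reproductive motivation."""
    xs, ys = AGE_ARRAY_ABSCISSA, AGE_ARRAY_ORDINATE
    if age <= xs[0]:
        return ys[0]          # too young: reproduction fully weighted out
    if age >= xs[-1]:
        return ys[-1]         # past the grid: motivation not limited
    for i in range(len(xs) - 1):
        if xs[i] <= age <= xs[i + 1]:
            t = (age - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])

print(reproduction_weight(6000))   # young agent: weight 0.0
print(reproduction_weight(12000))  # past the grid: weight 1.0
```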

Definition at line 6686 of file m_neuro.f90.

◆ appraisal_motivation_modulation_genetic()

subroutine the_neurobio::appraisal_motivation_modulation_genetic ( class(appraisal), intent(inout)  this,
logical, intent(in), optional  no_genetic_modulation 
)

Produce modulation of the primary motivations, that result in the final motivation values (_finl). Modulation involves effects of such characteristics of the agent as body mass and age on the primary motivations (hunger, active and passive avoidance and reproduction) mediated by the genome effects. Here the genome determines the coefficients that set the degree of the influence of the agent's characteristics on the motivations.

Parameters
[in] no_genetic_modulation chooses if genetic modulation is calculated at all; if set to TRUE, then genetic modulation is not calculated and the final motivational values are just equal to the primary motivations.

Implementation notes

First, initialise the final motivation values from the no modulation method the_neurobio::motivation::modulation_none().

Then check if developmental or genetic (or any other) modulation is disabled by the parameters commondata::modulation_appraisal_disable_all.

Check if the no_genetic_modulation parameter is set to TRUE and if yes, return without further processing.

Sex modulation of the reproduction motivation state for male.

  • First, use the sigmoid function and genome to set the phenotypic value of the gamma modulation coefficient that mediates the effect of the agent's sex on reproductive motivation.
  • Second, add the modulation factor to the actual motivation value. If the agent is male, then its reproductive motivation is increased by an additive component that is an asymptotic() function of the genome based modulation_gamma parameter. The maximum modulatory increase ever possible is the double value of the raw primary motivation.

Sex modulation of the reproduction motivation state for female.

  • First, use the sigmoid function and genome to set the phenotypic value of the gamma modulation coefficient that mediates the effect of the agent's sex on reproductive motivation.
  • Second, add the modulation factor to the actual motivation value. If the agent is female, then its reproductive motivation is increased by an additive component that is an asymptotic() function of the genome based modulation_gamma parameter. The maximum modulatory increase ever possible is the double value of the raw primary motivation.

The values are logged in the debug mode.

Definition at line 6786 of file m_neuro.f90.

Here is the call graph for this function:

◆ appraisal_add_final_motivations_memory()

elemental subroutine the_neurobio::appraisal_add_final_motivations_memory ( class(appraisal), intent(inout)  this)

Add individual final emotional state components into the emotional memory stack. This is a wrapper to the the_neurobio::memory_emotional::add_to_memory method.

Definition at line 6934 of file m_neuro.f90.

◆ reproduce_do_probability_reproduction_calc()

real(srp) function the_neurobio::reproduce_do_probability_reproduction_calc ( class(appraisal), intent(in)  this,
real(srp), intent(in), optional  weight_baseline,
logical, intent(in), optional  allow_immature 
)

Calculate the instantaneous probability of successful reproduction.

Note
Note that this function is bound to class the_neurobio::appraisal rather than the_neurobio::reproduce. Probability of successful reproduction is a dynamic property of this agent that depends on the nearby conspecifics and their sex and size/mass.
Parameters
[in] weight_baseline is the weighting factor for the baseline probability of successful reproduction $ \varphi $ (see details below).
[in] allow_immature a logical switch that allows calculation (non-zero probability) if the agent is not ready to reproduce as determined by the the_body::reproduction::is_ready_reproduce() method. Normally, immature agents (for which this method returns FALSE) have zero probability of reproduction. The default is FALSE, i.e. not to allow reproduction to immature agents.
Returns
instantaneous probability of successful reproduction.

Implementation details

The probability of successful reproduction depends on the number of conspecifics of the same and the opposite sex within the this agent's visual range. So the starting point here is the number of conspecifics within the current conspecifics perception object.

First, determine if the hormonal system of the agent is ready for reproduction using the_body::reproduction::is_ready_reproduce().

If this agent is not ready to reproduce, a zero probability of reproduction is returned. However, if the optional parameter allow_immature is explicitly set to TRUE, this check is not done and the probability of reproduction is calculated as follows.

Second, determine if there are any conspecifics in the perception; if there are none, reproduction is impossible. Return zero probability straight away in such a case.

Next, extract the number of conspecifics n_conspecifics_perception from the perception object.

Also, initialise the number of same- and opposite-sex conspecifics (integer counters) as well as the total mass of same-sex conspecifics (real) to zero.

Third, determine how many of the conspecifics in perception have the same and the opposite sex. Calculate total mass of same sex conspecifics.

Additionally, check if the number of opposite-sex agents is zero. In such a case zero probability of reproduction is obviously returned.

Fourth, calculate the baseline probability of reproduction. This probability is proportional to the proportions of the same- and opposite-sex agents within the visual range.

\[ p_{0} = \frac{N_{os}}{1+N_{ss}} ; 0 \leq p_{0}\leq 1, \]

where $ N_{os} $ is the number of the opposite-sex agents, $ N_{ss} $ is the number of same-sex agents. We also adjust the baseline probability of successful reproduction by a parameter factor $ \varphi $, so that this probability never reaches 1:

\[ p_{0} = \frac{N_{os}}{1+N_{ss}} \cdot \varphi \]

For example, if there is only one agent of the opposite sex and no same-sex in proximity the baseline probability of reproduction is 1/(1+0) = 1.0 (note that the this agent also adds to the same-sex count, hence "1+..."). If there are 3 opposite-sex agents and 3 same-sex agents, the baseline probability is calculated as 3/(1+3) = 0.75. This doesn't take account of the $ \varphi $ multiplier factor.
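The baseline calculation above can be sketched in Python (an illustrative sketch; the value of the $ \varphi $ weighting factor below is hypothetical):

```python
# Sketch of the baseline reproduction probability
# p0 = N_os / (1 + N_ss) * phi, where N_os and N_ss are the counts
# of opposite- and same-sex conspecifics within the visual range
# (the "1 +" accounts for the `this` agent itself).

def baseline_reproduction_probability(n_opposite, n_same, phi=0.9):
    """Baseline probability of successful reproduction."""
    if n_opposite == 0:
        return 0.0            # no opposite-sex agents: impossible
    return n_opposite / (1.0 + n_same) * phi

# The worked examples from the text, with the phi multiplier
# factored out (phi = 1):
print(baseline_reproduction_probability(1, 0, phi=1.0))  # 1.0
print(baseline_reproduction_probability(3, 3, phi=1.0))  # 0.75
```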

Fifth, to get the final successful reproduction probability, the baseline value $ p_{0} $ is multiplied by a function $ \Phi $ that depends on the relative body mass of the this agent with respect to all the same-sex agents in proximity.

\[ p_{rep} = p_{0} \cdot \Phi(\Delta \overline{m_{i}}), 0 \leq p_{rep} \leq 1 , \]

where $ p_{rep} $ is the final probability of successful reproduction. This is done to model direct within-sex competition for mates. Therefore, if the this agent is smaller than all the other same-sex agents here, the probability of successful reproduction significantly reduces. On the other hand, if the agent is larger than all the others, this probability would increase. The form of the $ \Phi $ function is calculated on the bases of the ratio of the this agent body mass to the average body mass of all same sex agents within the visual range:

\[ \Delta \overline{m_{i}} = \frac{ M }{ \overline{m_{i}} } . \]

Note that if there are no same sex agents (i.e. intra-sexual competition is absent) the probability of reproduction takes the baseline value $ p_{0} $ :

\[ p_{rep} = p_{0} . \]

No debug interpolation plot is produced in such a degenerate case.

The $ \Phi(\Delta \overline{m_{i}}) $ function itself is obtained from a nonlinear interpolation of grid values defined by the parameter arrays commondata::probability_reproduction_delta_mass_abscissa and commondata::probability_reproduction_delta_mass_ordinate.

So the final reproduction probability value is obtained by multiplication of the baseline value by the $ \Phi $ function. The final probability of reproduction value is limited to lie within the range $ 0 \leq p_{rep} \leq 1 \cdot \varphi $.

Interpolation plots can be saved in the debug mode using this plotting command: commondata::debug_interpolate_plot_save().

Definition at line 6950 of file m_neuro.f90.

◆ reproduction_success_stochast()

logical function the_neurobio::reproduction_success_stochast ( class(appraisal), intent(in)  this,
real(srp), intent(in), optional  prob 
)

Determine a stochastic outcome of this agent reproduction. Returns TRUE if the agent has reproduced successfully.

Parameters
[in] prob optional fixed probability of reproduction to override.
Returns
TRUE if reproduction is successful.
Warning
This function cannot be made elemental/pure due to random number call.

Implementation details

Check if prob is present, if not, the probability of reproduction is calculated based on the perception objects of the actor agent this using the probability_reproduction() method.

Determine the reproduction success stochastically based on the probability of reproduction (prob_here) value.

Definition at line 7194 of file m_neuro.f90.

◆ emotional_memory_add_to_stack()

elemental subroutine the_neurobio::emotional_memory_add_to_stack ( class(memory_emotional), intent(inout)  this,
real(srp), intent(in)  v_hunger,
real(srp), intent(in)  v_defence_fear,
real(srp), intent(in)  v_reproduction,
character(*), intent(in), optional  v_gos_label,
real(srp), intent(in), optional  v_gos_arousal,
integer, intent(in), optional  v_gos_repeated 
)

Add emotional components into the memory stack.

Parameters
The parameters of the subroutine are the actual values that are added to the emotional memory stack arrays:
[in] v_hunger value for hunger;
[in] v_defence_fear value for fear state;
[in] v_reproduction value for reproduction;
[in] v_gos_label value for GOS label;
[in] v_gos_arousal value for GOS arousal value;
[in] v_gos_repeated value for repeated counter for GOS.

Each of the memory stack components corresponds to the respective dummy parameter. These arrays are updated at each step (mandatory procedure arguments):

  • v_hunger
  • v_defence_fear
  • v_reproduction

However, GOS parameters are optional and updated only if provided for invocation of this method (optional arguments):

  • v_gos_label;
  • v_gos_arousal;
  • v_gos_repeated.
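The push-to-stack behaviour can be sketched in Python. This is a minimal sketch under assumptions: it models only the three mandatory components and assumes a fixed-length rolling history where the oldest record is discarded; the HISTORY_SIZE constant is hypothetical and not part of the original:

```python
# Minimal sketch of a fixed-length emotional memory stack: the
# mandatory components are pushed every step and the oldest record
# drops off once the history is full.

HISTORY_SIZE = 5   # hypothetical history length

class MemoryEmotional:
    def __init__(self):
        self.hunger = []
        self.defence_fear = []
        self.reproduction = []

    def _push(self, stack, value):
        stack.append(value)
        if len(stack) > HISTORY_SIZE:
            del stack[0]      # discard the oldest record

    def add_to_memory(self, v_hunger, v_defence_fear, v_reproduction):
        self._push(self.hunger, v_hunger)
        self._push(self.defence_fear, v_defence_fear)
        self._push(self.reproduction, v_reproduction)

m = MemoryEmotional()
for step in range(7):
    m.add_to_memory(0.1 * step, 0.2, 0.3)
print(m.hunger)  # only the latest HISTORY_SIZE hunger values remain
```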

Definition at line 7227 of file m_neuro.f90.

◆ emotional_memory_add_gos_to_stack()

elemental subroutine the_neurobio::emotional_memory_add_gos_to_stack ( class(memory_emotional), intent(inout)  this,
character(*), intent(in), optional  v_gos_label,
real(srp), intent(in), optional  v_gos_arousal,
integer, intent(in), optional  v_gos_repeated 
)

Add the current GOS label or/and arousal value and/or arousal repeat count into the emotional memory stack.

Parameters
[in] v_gos_label Text label for the current GOS.
[in] v_gos_arousal The maximum motivation (arousal) value for the current GOS.
[in] v_gos_repeated

Implementation notes

GOS label is added to the memory stack.

GOS arousal is added to the memory stack.

The GOS repeated counter (gos_repeated) is added to the memory stack.

Definition at line 7275 of file m_neuro.f90.

◆ emotional_memory_cleanup_stack()

elemental subroutine the_neurobio::emotional_memory_cleanup_stack ( class(memory_emotional), intent(inout)  this)

Cleanup and destroy the emotional memory stack.

cleanup procedure uses whole array assignment to the commondata::missing values.

Definition at line 7302 of file m_neuro.f90.

◆ emotional_memory_hunger_get_mean()

elemental real(srp) function the_neurobio::emotional_memory_hunger_get_mean ( class(memory_emotional), intent(in)  this,
integer, intent(in), optional  last 
)

Get the average value of the hunger motivation state within the whole emotional memory stack.

Returns
Mean hunger motivation value in the memory stack.
Parameters
[in] last Limit to only this number of latest components in the history.

Implementation notes

  • Check if we are given the parameter requesting the latest history size. If the last parameter is absent or bigger than the array size, get the whole stack array.

Calculate the average excluding missing values (masked) within the subarray of interest.
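The windowed masked-mean logic above can be sketched in Python (an illustrative sketch; the MISSING sentinel value below is hypothetical and stands in for commondata::missing):

```python
# Sketch of the "mean over the latest part of the memory stack"
# logic: limit the window to the last `last` records (or the whole
# stack if `last` is absent or bigger than the array size) and
# exclude missing values from the average.

MISSING = -9999.0   # hypothetical missing-value sentinel

def memory_mean(stack, last=None):
    """Mean of the latest `last` records, excluding MISSING values."""
    if last is None or last > len(stack):
        window = stack            # use the whole stack array
    else:
        window = stack[-last:]    # only the latest records
    values = [v for v in window if v != MISSING]
    return sum(values) / len(values) if values else MISSING

hunger_history = [0.2, MISSING, 0.4, 0.6, 0.8]
print(memory_mean(hunger_history))          # mean of 0.2, 0.4, 0.6, 0.8
print(memory_mean(hunger_history, last=2))  # mean of 0.6, 0.8
```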

Definition at line 7320 of file m_neuro.f90.

◆ emotional_memory_actve_avoid_get_mean()

elemental real(srp) function the_neurobio::emotional_memory_actve_avoid_get_mean ( class(memory_emotional), intent(in)  this,
integer, intent(in), optional  last 
)

Get the average value of the fear state motivation state within the whole emotional memory stack.

Returns
Mean fear state motivation value in the memory stack.
Parameters
[in] last Limit to only this number of latest components in the history.

Implementation notes

  • Check if we are given the parameter requesting the latest history size. If the last parameter is absent or bigger than the array size, get the whole stack array.

Calculate the average excluding missing values (masked) within the subarray of interest.

Definition at line 7361 of file m_neuro.f90.

◆ emotional_memory_reproduct_get_mean()

elemental real(srp) function the_neurobio::emotional_memory_reproduct_get_mean ( class(memory_emotional), intent(in)  this,
integer, intent(in), optional  last 
)

Get the average value of the reproductive motivation state within the whole emotional memory stack.

Returns
Mean reproductive motivation value in the memory stack.
Parameters
[in] last Limit to only this number of latest components in the history.

Implementation notes

  • Check if we are given the parameter requesting the latest history size. If the last parameter is absent or bigger than the array size, get the whole stack array.

Calculate the average excluding missing values (masked) within the subarray of interest.

Definition at line 7403 of file m_neuro.f90.

◆ emotional_memory_arousal_mean()

elemental real(srp) function the_neurobio::emotional_memory_arousal_mean ( class(memory_emotional), intent(in)  this,
integer, intent(in), optional  last 
)

Get the average value of the GOS arousal within the whole emotional memory stack.

Returns
Mean GOS arousal value in the memory stack.
Parameters
[in] last Limit to only this number of latest components in the history.

Implementation notes

  • Check if we are given the parameter requesting the latest history size. If the last parameter is absent or bigger than the array size, get the whole stack array.

Calculate the average excluding missing values (masked) within the subarray of interest.

Definition at line 7445 of file m_neuro.f90.

◆ gos_find_global_state()

subroutine the_neurobio::gos_find_global_state ( class(gos_global), intent(inout)  this)

Find and set the Global Organismic State (GOS) of the agent based on the various available motivation values. The motivation values linked with the different stimuli compete with the current GOS and among themselves.

General principle

The GOS competition threshold is a function of the current GOS arousal level: if it is very low, it would be very difficult to switch to a different GOS. However, if the current GOS has a high arousal, then switching to a competing motivation is relatively easy: a very small motivational surplus is enough for winning the competition with the current GOS.

Note
GOS generation is a little changed in the new generation model.
  1. We try to avoid constant switching of the GOS by requiring that the difference between motivational components should exceed some threshold value; if it does not, the old GOS is retained. So minor fluctuations in the stimulus field are ignored. The threshold is a dynamic parameter, so it can also be zero.
  2. The threshold is inversely related to the absolute value of the motivations compared: when the motivations are low, the threshold is big; when their values approach 1, the threshold approaches zero. So low motivations have relatively little effect.

Implementation details

Notable class data members

Public attribute of the GOS_GLOBAL class: gos_arousal keeps the current level of the GOS arousal ($ G $, see below). If the GOS does switch as a result of competition with the other motivational states, gos_arousal gets the value of the winning (maximum) motivation; if the GOS does not switch, the gos_arousal value dissipates spontaneously to a lower value and the gos_repeated attribute of GOS_GLOBAL gets the successive number of repetitions of the same out-of-competition GOS state.

Notable local variables

Local variable arousal_new is the maximum level of motivation among all new incoming motivations, $ A $. It is this motivation value that competes with the current GOS arousal value ($ G $, the gos_arousal public attribute of the GOS_GLOBAL class).

Local variable gos_dthreshold is a dynamic threshold factor for GOS change $ \Delta $ (see below). It determines the threshold that a new competing motivation has to exceed to win the competition with the previous (and still current up to this point) motivation.

GOS competition

The GOS competition threshold is a function of the current GOS arousal level $ G $: if it is very low, a relatively high competing motivation is needed to win the competition; if it is high, a very small difference is enough. The global organismic state will switch to a competing state only if its maximum motivation $ A $ exceeds the current GOS's arousal level $ G $ by more than $ \Delta $ units of $ G $:

\[ A - G > \Delta \cdot G . \]

Here the $ \Delta $ threshold factor is set by a nonparametric function that is calculated from nonlinear interpolation of the grid values:

So if the agent currently has a low GOS arousal G=0.1, it requires a competing state of at least A=0.155 to win (with $ \Delta $ =0.55: 0.155 = 0.1 + 0.1 * 0.55). However, if the agent has a high GOS arousal G=0.8, almost any exceeding motivation (>0.808) will win. The actual values of the nonparametric interpolation function are obtained by nonlinear interpolation of the grid values defined by the MOTIVATION_COMPET_THRESHOLD_CURVE_ parameter arrays.
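The competition rule can be sketched as follows. This is a hedged Python illustration: the grid values and the piecewise-linear interpolation are simplifying assumptions, whereas the model interpolates the MOTIVATION_COMPET_THRESHOLD_CURVE_ parameter arrays nonlinearly.

```python
# Hypothetical grid: current arousal G (abscissa) -> threshold factor Delta.
GRID_G     = [0.0, 0.2, 0.5, 1.0]
GRID_DELTA = [1.0, 0.55, 0.2, 0.01]

def interp(x, xs, ys):
    """Plain piecewise-linear interpolation (model uses nonlinear)."""
    for i in range(1, len(xs)):
        if x <= xs[i]:
            t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + t * (ys[i] - ys[i - 1])
    return ys[-1]

def gos_switches(arousal_new, gos_arousal):
    """True if the maximum competing motivation A wins over the current
    GOS arousal G, i.e. A - G > Delta * G."""
    delta = interp(gos_arousal, GRID_G, GRID_DELTA)
    return arousal_new - gos_arousal > delta * gos_arousal
```

With these illustrative grid values, a high current arousal is easy to outcompete, while a low current arousal requires a large motivational surplus.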

Note
In this implementation, the exact type of the competing motivation is not considered in the GOS competition procedure. For example, the current hunger GOS competes with all motivations, including itself. Consider an agent that is starving and has a high level of the hunger GOS. At each step this hunger competes with all other motivations, including hunger itself. If hunger continues to increase, then at many time steps the new level of hunger can outcompete the current hunger, and the GOS switches from hunger to ... hunger. There can also be situations when the current GOS hunger wins the competition against all other motivations several times, dissipates, and is then outcompeted by hunger again, which would lead to a relatively long streak of the same GOS. This mechanism precludes switching out of (and losing) a continuously high but still appropriate motivational state.

The interpolation plots are saved in the debug mode to a disk file using an external command by the commondata::debug_interpolate_plot_save() procedure.

Warning
Enabling plotting can produce a huge number of plots and should normally be disabled.

Once the dynamic threshold is calculated, we can compare each of the competing motivation levels with the current arousal. If the maximum value of these motivations exceeds the current arousal by more than the threshold $ \Delta $ factor, the GOS switches to the new motivation. If not, we are still left with the previous GOS.

Threshold not exceeded

If the maximum competing motivation does not exceed the threshold, we are left with the old GOS. However, the current arousal is reduced spontaneously using a simple linear or some non-linear dissipation pattern based on the %gos_repeated parameter, which sets the number of repeated occurrences of the same (current) GOS. First, increment the GOS repeat counter.

And spontaneously decrease, dissipate, the current arousal level. Spontaneous dissipation of arousal is implemented by multiplying the current level by a factor within the range [0.0..1.0] that can depend on the number of times this GOS is repeated.

Note
Note that the dissipation function is local to this procedure. Either arousal_decrease_factor_fixed (a fixed value) or arousal_decrease_factor_nonpar (nonlinear, nonparametric, based on nonlinear interpolation) can be used (see plot aha_gos_arousal_dissipation.svg).
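A minimal sketch of the dissipation step. The factor value and the repeat-dependent pattern below are illustrative assumptions, not the model's actual parameters:

```python
# Sketch of spontaneous arousal dissipation when the GOS does not switch:
# multiply the current arousal by a factor within [0.0, 1.0].  The factor
# can be fixed, or can depend on the number of repetitions of the same GOS
# (gos_repeated); both values below are hypothetical.
AROUSAL_DECREASE_FACTOR_FIXED = 0.9

def dissipate(gos_arousal, gos_repeated, fixed=True):
    """Return the dissipated arousal level."""
    if fixed:
        factor = AROUSAL_DECREASE_FACTOR_FIXED
    else:
        # Hypothetical nonparametric pattern: decay strengthens with repeats.
        factor = max(0.5, 1.0 - 0.05 * gos_repeated)
    return gos_arousal * factor
```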

Threshold is exceeded

If the maximum competing motivation exceeds the threshold, we get to a new GOS. That is, the highest among the competing motivations defines the new GOS.

Note
Note that gos_repeated is initialised to 1.0 at gos_reset.

Check hunger

Reset all motivations to non-dominant.

Set new GOS for hunger...

Check fear_defence

Reset all motivations to non-dominant.

Set new GOS for fear_defence...

Check reproduction

Reset all motivations to non-dominant.

Set new GOS for reproduction...

Other finalising procedures

Add the current GOS parameters to the emotional memory stack

Note
Note that the memory stack arrays are defined in APPRAISAL and cleaned/initialised in init_appraisal.

Finally recalculate the attention weights for all the states' perception components using attention_modulate(). The dominant GOS state will now get its default attention weights whereas all non-dominant states will get modulated values, i.e. values recalculated from a non-linear interpolation based attention modulation curve.

Definition at line 7508 of file m_neuro.f90.

Here is the call graph for this function:

◆ gos_init_zero_state()

elemental subroutine, private the_neurobio::gos_init_zero_state ( class(gos_global), intent(inout)  this)
private

Initialise GOS engine components to a zero state. The values are set to commondata::missing, commondata::unknown, string to "undefined".

Note
the GOS arousal value is initialised to commondata::missing, which is a big negative value. Therefore, any competing motivation initially wins in the the_neurobio::gos_find_global_state() procedure. There seems to be no point in initialising the arousal to 0.0.

Definition at line 7787 of file m_neuro.f90.

Here is the caller graph for this function:

◆ gos_agent_set_dead()

elemental subroutine the_neurobio::gos_agent_set_dead ( class(gos_global), intent(inout)  this)

Set the individual to be dead. Note that this function does not deallocate the individual agent object; that may be done by a separate destructor function.

The dies method is implemented at the following levels of the agent object hierarchy (upper overrides the lower level):

Note
This method overrides the the_genome::individual_genome::dies() method, nullifying all reproductive, neurobiological and behavioural objects.
The dies method is implemented at the gos_global level to allow "cleaning" of all neurobiological objects when dies is called while performing the behaviours, upwards in the object hierarchy.
  • Set the agent "dead";
  • empty the reproduction objects;
  • empty all neurobiological objects.

Definition at line 7818 of file m_neuro.f90.

◆ gos_reset_motivations_non_dominant()

elemental subroutine the_neurobio::gos_reset_motivations_non_dominant ( class(gos_global), intent(inout)  this)

Reset all motivation states as not dominant with respect to the GOS.

Note
This subroutine is used in the_neurobio::gos_find_global_state().

Implementation notes

Reset dominant status to FALSE for all motivational states calling the the_neurobio::motivation::gos_ind_reset().

Also reset the number of GOS repeated occurrences to 1.

Definition at line 7832 of file m_neuro.f90.

◆ gos_global_get_label()

elemental character(len=label_length) function the_neurobio::gos_global_get_label ( class(gos_global), intent(in)  this)

Get the current global organismic state (GOS).

Returns
Global organismic state label.

Check which of the currently implemented motivational state components (STATE_) has the dominant flag. Can call motivation-type-bound function %is_dominant().

Note
Only one component can be "dominant".
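The dominance check can be sketched as follows (state labels below are illustrative, not the model's STATE_ identifiers):

```python
# Sketch of the GOS label lookup: exactly one motivational state should
# carry the dominant flag; return its label, or "undefined" if none is set.
def gos_label(states):
    """states: dict mapping state label -> is_dominant flag."""
    for label, dominant in states.items():
        if dominant:
            return label
    return "undefined"
```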

Definition at line 7847 of file m_neuro.f90.

◆ gos_get_arousal_level()

elemental real(srp) function the_neurobio::gos_get_arousal_level ( class(gos_global), intent(in)  this)

Get the overall level of arousal. Arousal is the current level of the dominant motivation that has brought about the current GOS at the previous time step.

Definition at line 7882 of file m_neuro.f90.

◆ gos_attention_modulate_weights()

subroutine the_neurobio::gos_attention_modulate_weights ( class(gos_global), intent(inout)  this)

Modulate the attention weights to suppress all perceptions alternative to the current GOS. This is done using the attention modulation interpolation curve.

Warning
This subroutine is called from within gos_find() and should normally not be called separately.

Implementation details

Overview

Each of the perceptions is weighted by an attention factor. The attention factor is in turn modulated (weighted) by the current Global Organismic State (GOS). When the current arousal is relatively high, all irrelevant perceptions are effectively filtered out (weighted by near-zero) and do not (largely) contribute to the GOS at the next time step. For example, the agent just does not "see" food items when it is in a high fear state.

Thus, perception is weighted by the attention suppression factor separately for each motivational (emotional) state according to the scheme below:

Attention suppression

Also see Cognitive architecture section.

Specific details

First, we calculate the attention weight given to all non-dominant perceptions via nonlinear interpolation. Interpolation is based on the grid defined by two parameters: ATTENTION_MODULATION_CURVE_ABSCISSA and ATTENTION_MODULATION_CURVE_ORDINATE.

Note
Interpolation plot can be produced using this command, assuming the plotting tools are installed on the system.
          htintrpl.exe [0.0, 0.3, 0.5, 1.0] [1.0, 0.98, 0.9, 0.0] [2]

Interpolation plots can be saved in the debug mode using this plotting command: commondata::debug_interpolate_plot_save().

Warning
Involves a huge number of plots; should normally be disabled.

Second, we reset the attention weights for the dominant GOS state to their default parameter values whereas for all other states, to the recalculated percept_w modulated/weighted value. The the_neurobio::percept_components_motiv::attention_init() method is used to adjust the attention weights.
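A sketch of the suppression-weight lookup, using the grid values shown in the htintrpl command above. The model interpolates these nonlinearly; plain linear interpolation is used here for brevity:

```python
# Attention modulation sketch: the weight applied to all non-dominant
# perceptions is read from a curve of GOS arousal -> suppression weight.
ABSCISSA = [0.0, 0.3, 0.5, 1.0]   # GOS arousal level
ORDINATE = [1.0, 0.98, 0.9, 0.0]  # attention weight for non-dominant states

def attention_weight(arousal):
    """Linear interpolation of the suppression weight at a given arousal."""
    for i in range(1, len(ABSCISSA)):
        if arousal <= ABSCISSA[i]:
            t = (arousal - ABSCISSA[i - 1]) / (ABSCISSA[i] - ABSCISSA[i - 1])
            return ORDINATE[i - 1] + t * (ORDINATE[i] - ORDINATE[i - 1])
    return ORDINATE[-1]
```

At zero arousal non-dominant perceptions keep their full weight; at maximum arousal they are suppressed to zero, so the agent effectively does not "see" irrelevant stimuli.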

Definition at line 7899 of file m_neuro.f90.

◆ perception_food_items_below_calculate()

elemental integer function the_neurobio::perception_food_items_below_calculate ( class(perception), intent(in)  this)

Calculate the number of food items in the perception object that are located below the actor agent.

Returns
The number of food items within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any food items in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one food item in the perception object. Calculate food items within the perception that are below the agent.

Definition at line 8112 of file m_neuro.f90.

◆ perception_food_items_below_horiz_calculate()

elemental integer function the_neurobio::perception_food_items_below_horiz_calculate ( class(perception), intent(in)  this,
real(srp), intent(in)  hz_lower,
real(srp), intent(in)  hz_upper 
)

Calculate the number of food items in the perception object that are located below the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of the this actor agent: [z+hz_lower, z+hz_upper].

Parameters
[in] hz_lower The upper limit for the vertical horizon
[in] hz_upper The lower limit for the vertical horizon
Returns
The number of food items within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any food items in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one food item in the perception object. Loop through the food items within the latter and calculate the total number.
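The horizon-restricted count can be sketched as follows (a Python stand-in for the Fortran loop, assuming depth increases downwards):

```python
# Illustrative count of perceived food items within a vertical horizon
# below the agent, [z + hz_lower, z + hz_upper], where z is the agent's
# depth position.
def count_below_horizon(agent_depth, item_depths, hz_lower, hz_upper):
    """Number of items whose depth falls within the relative horizon."""
    return sum(1 for d in item_depths
               if agent_depth + hz_lower <= d <= agent_depth + hz_upper)
```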

Definition at line 8144 of file m_neuro.f90.

◆ perception_food_mass_below_calculate()

elemental real(srp) function the_neurobio::perception_food_mass_below_calculate ( class(perception), intent(in)  this)

Calculate the average mass of a food item from all the items in the current perception object that are below the actor agent.

Returns
Average mass of food items within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the return average mass and the counter for calculating the average both to zero.

Then, check if the agent has any food items in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one food item in the perception object. Calculation of the average mass of the food items below is done by concurrent looping through the food items within the perception object.

This is done by checking the condition:

   if ( food_item .below. this ) ...
Note
This uses the user defined operator .below. that is implemented in the_environment module.

The average mass of the food items is calculated using the food items mass values returned by the function get_mass (the_environment::food_item::get_mass()) .

Final average value is calculated, obviously, by division of the total mass by the total count. In case the count is zero, also return zero mean.
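The averaging logic above can be sketched as follows. This is an illustrative Python stand-in: the Fortran code uses the user defined operator .below. and the_environment::food_item::get_mass(), while here a simple depth comparison over (depth, mass) tuples takes their place:

```python
# Sketch of the "average mass below" calculation: accumulate mass and
# count over items that are below the agent; return zero for an empty set.
def mean_mass_below(agent_depth, items):
    """items: list of (depth, mass) tuples; mean mass of items below."""
    total, count = 0.0, 0
    for depth, mass in items:
        if depth > agent_depth:          # stands in for: item .below. agent
            total += mass
            count += 1
    return total / count if count > 0 else 0.0
```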

Definition at line 8189 of file m_neuro.f90.

◆ perception_food_mass_below_horiz_calculate()

elemental real(srp) function the_neurobio::perception_food_mass_below_horiz_calculate ( class(perception), intent(in)  this,
real(srp), intent(in)  hz_lower,
real(srp), intent(in)  hz_upper 
)

Calculate the average mass of a food item from all the items in the current perception object that are below the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of the this actor agent: [z+hz_lower, z+hz_upper].

Parameters
[in] hz_lower The upper limit for the vertical horizon
[in] hz_upper The lower limit for the vertical horizon
Returns
Average mass of food items within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the return average mass and the counter for calculating the average both to zero.

Then, check if the agent has any food items in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one food item in the perception object. Calculation of the average mass of the food items below is done by concurrent looping through the food items within the perception object.

The average mass of the food items is calculated using the food items mass values returned by the function get_mass (the_environment::food_item::get_mass()) .

Final average value is calculated, obviously, by division of the total mass by the total count. In case the count is zero, also return zero mean.

Definition at line 8251 of file m_neuro.f90.

◆ perception_food_items_above_calculate()

elemental integer function the_neurobio::perception_food_items_above_calculate ( class(perception), intent(in)  this)

Calculate the number of food items in the perception object that are located above the actor agent.

Returns
The number of food items within the perception object that are located above (over) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any food items in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one food item in the perception object. Calculate food items within the perception that are above the agent.

Definition at line 8313 of file m_neuro.f90.

◆ perception_food_items_above_horiz_calculate()

elemental integer function the_neurobio::perception_food_items_above_horiz_calculate ( class(perception), intent(in)  this,
real(srp), intent(in)  hz_lower,
real(srp), intent(in)  hz_upper 
)

Calculate the number of food items in the perception object that are located above the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of the this actor agent: [z-hz_upper, z-hz_lower].

Parameters
[in] hz_lower The upper limit for the vertical horizon
[in] hz_upper The lower limit for the vertical horizon
Returns
The number of food items within the perception object that are located above (over) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any food items in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one food item in the perception object. Loop through the food items within the latter and calculate the total number.

Definition at line 8345 of file m_neuro.f90.

◆ perception_food_mass_above_calculate()

elemental real(srp) function the_neurobio::perception_food_mass_above_calculate ( class(perception), intent(in)  this)

Calculate the average mass of a food item from all the items in the current perception object that are above the actor agent.

Implementation details

First, initialise the return average mass and the counter for calculating the average both to zero.

Then, check if the agent has any food items in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one food item in the perception object. Calculation of the average mass of the food items above is done by concurrent looping through the food items within the perception object.

This is done by checking the condition:

   if ( food_item .above. this ) ...
Note
This uses the user defined operator .above. that is implemented in the_environment module.

The average mass of the food items is calculated using the food items mass values returned by the function get_mass (the_environment::food_item::get_mass()) .

Final average value is calculated, obviously, by division of the total mass by the total count. In case the count is zero, also return zero mean.

Definition at line 8390 of file m_neuro.f90.

◆ perception_food_mass_above_horiz_calculate()

elemental real(srp) function the_neurobio::perception_food_mass_above_horiz_calculate ( class(perception), intent(in)  this,
real(srp), intent(in)  hz_lower,
real(srp), intent(in)  hz_upper 
)

Calculate the average mass of a food item from all the items in the current perception object that are above the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of the this actor agent: [z-hz_upper, z-hz_lower].

Parameters
[in] hz_lower The upper limit for the vertical horizon
[in] hz_upper The lower limit for the vertical horizon
Returns
Average mass of food items within the perception object that are located above (over) the actor agent.

Implementation details

First, initialise the return average mass and the counter for calculating the average both to zero.

Then, check if the agent has any food items in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one food item in the perception object. Calculation of the average mass of the food items above is done by concurrent looping through the food items within the perception object.

The average mass of the food items is calculated using the food items mass values returned by the function get_mass (the_environment::food_item::get_mass()) .

Final average value is calculated, obviously, by division of the total mass by the total count. In case the count is zero, also return zero mean.

Definition at line 8450 of file m_neuro.f90.

◆ perception_conspecifics_below_calculate()

elemental integer function the_neurobio::perception_conspecifics_below_calculate ( class(perception), intent(in)  this)

Calculate the number of conspecifics in the perception object that are located below the actor agent.

Returns
The number of conspecifics within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any conspecifics in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one conspecific in the perception object. Loop through the conspecifics and count their total number.

Definition at line 8512 of file m_neuro.f90.

◆ perception_conspecifics_above_calculate()

elemental integer function the_neurobio::perception_conspecifics_above_calculate ( class(perception), intent(in)  this)

Calculate the number of conspecifics in the perception object that are located above the actor agent.

Returns
The number of conspecifics within the perception object that are located above (over) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any conspecifics in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one conspecific in the perception object. Loop through the conspecifics and count their total number.

Definition at line 8539 of file m_neuro.f90.

◆ perception_conspecifics_below_horiz_calculate()

elemental integer function the_neurobio::perception_conspecifics_below_horiz_calculate ( class(perception), intent(in)  this,
real(srp), intent(in)  hz_lower,
real(srp), intent(in)  hz_upper 
)

Calculate the number of conspecifics in the perception object that are located below the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of the this actor agent: [z+hz_lower, z+hz_upper].

Parameters
[in] hz_lower The upper limit for the vertical horizon
[in] hz_upper The lower limit for the vertical horizon
Returns
The number of conspecifics within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any conspecifics in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one conspecific in the perception object. Loop through the conspecifics within the latter and calculate their number.

Definition at line 8569 of file m_neuro.f90.

◆ perception_conspecifics_above_horiz_calculate()

elemental integer function the_neurobio::perception_conspecifics_above_horiz_calculate ( class(perception), intent(in)  this,
real(srp), intent(in)  hz_lower,
real(srp), intent(in)  hz_upper 
)

Calculate the number of conspecifics in the perception object that are located above the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of the this actor agent: [z-hz_upper, z-hz_lower].

Parameters
[in] hz_lower The upper limit for the vertical horizon
[in] hz_upper The lower limit for the vertical horizon
Returns
The number of conspecifics within the perception object that are located above (over) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any conspecifics in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one conspecific in the perception object. Loop through the conspecifics within and calculate their number.

Definition at line 8613 of file m_neuro.f90.

◆ perception_predator_below_calculate()

elemental integer function the_neurobio::perception_predator_below_calculate ( class(perception), intent(in)  this)

Calculate the number of predators in the perception object that are located below the actor agent.

Returns
The number of predators within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any predators in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one predator in the perception object. Loop through the predators and count their total number.

Definition at line 8654 of file m_neuro.f90.

◆ perception_predator_above_calculate()

elemental integer function the_neurobio::perception_predator_above_calculate ( class(perception), intent(in)  this)

Calculate the number of predators in the perception object that are located above the actor agent.

Returns
The number of predators within the perception object that are located above (over) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any predators in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one predator in the perception object. Loop through the predators and count their total number.

Definition at line 8681 of file m_neuro.f90.

◆ perception_predator_below_horiz_calculate()

elemental integer function the_neurobio::perception_predator_below_horiz_calculate ( class(perception), intent(in)  this,
real(srp), intent(in)  hz_lower,
real(srp), intent(in)  hz_upper 
)

Calculate the number of predators in the perception object that are located below the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of the this actor agent: [z+hz_lower, z+hz_upper].

Parameters
[in] hz_lower The upper limit for the vertical horizon
[in] hz_upper The lower limit for the vertical horizon
Returns
The number of predators within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any predators in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one predator in the perception object. Loop through the predators within the latter and calculate their number.

Definition at line 8711 of file m_neuro.f90.

◆ perception_predator_above_horiz_calculate()

elemental integer function the_neurobio::perception_predator_above_horiz_calculate ( class(perception), intent(in)  this,
real(srp), intent(in)  hz_lower,
real(srp), intent(in)  hz_upper 
)

Calculate the number of predators in the perception object that are located above the actor agent within a specific vertical horizon [hz_lower,hz_upper]. The horizon limits are relative, in that they start from the depth position of the this actor agent: [z-hz_upper, z-hz_lower].

Parameters
[in] hz_lower The upper limit for the vertical horizon
[in] hz_upper The lower limit for the vertical horizon
Returns
The number of predators within the perception object that are located above (over) the actor agent.

Implementation details

First, initialise the counter to zero.

Then, check if the agent has any predators in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one predator in the perception object. Loop through the predators within the perception and calculate their number.

Definition at line 8755 of file m_neuro.f90.

◆ perception_food_dist_below_calculate()

elemental real(srp) function the_neurobio::perception_food_dist_below_calculate ( class(perception), intent(in)  this)

Calculate the average distance to all food items in the current perception object that are below the actor agent.

Returns
The average distance to food items within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the return average and the counter to zero.

Then, check if the agent has any food items in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one food item in the perception object. Calculation of the average distance to the food items below is done by concurrent looping through the food items within the perception object and calculating the distance from the agent.

This is done by checking the condition:

   if ( food_item .below. this ) ...

Final average value is calculated, obviously, by division of the total distance by the count. In case the count is zero, also return commondata::missing mean. Note that zero is not returned here because zero distance to food item would result in the highest probability of capture which is not what is intended (zero probability should be invoked for null food items).
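The distance averaging rule can be sketched as follows. This is an illustrative Python stand-in using vertical distance only (the model uses full spatial distance); MISSING mimics commondata::missing:

```python
# Sketch of the "average distance below" rule: unlike the mass and count
# procedures, an empty set yields MISSING (not zero), because a zero
# distance would imply the highest capture probability.
MISSING = -9999.0

def mean_dist_below(agent_depth, item_depths):
    """Mean vertical distance to items below the agent, or MISSING if none."""
    dists = [d - agent_depth for d in item_depths if d > agent_depth]
    return sum(dists) / len(dists) if dists else MISSING
```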

Definition at line 8796 of file m_neuro.f90.

◆ perception_food_dist_above_calculate()

elemental real(srp) function the_neurobio::perception_food_dist_above_calculate ( class(perception), intent(in)  this)

Calculate the average distance to all food items in the current perception object that are above the actor agent.

Returns
The average distance to food items within the perception object that are located above (over) the actor agent.

Implementation details

First, initialise the return average and the counter to zero.

Then, check if the agent has any food items in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one food item in the perception object. Calculation of the average distance to the food items above is done by concurrent looping through the food items within the perception object and calculating the distance from the agent.

This is done by checking the condition:

   if ( food_item .above. this ) ...

Final average value is calculated, obviously, by division of the total distance by the count. In case the count is zero, also return commondata::missing mean. Note that zero is not returned here because zero distance to food item would result in the highest probability of capture which is not what is intended (zero probability should be invoked for null food items).

Definition at line 8852 of file m_neuro.f90.

◆ perception_consp_dist_below_calculate()

elemental real(srp) function the_neurobio::perception_consp_dist_below_calculate ( class(perception), intent(in)  this)

Calculate the average distance to all conspecifics in the current perception object that are below the actor agent.

Returns
The average distance to conspecifics within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the return average and the counter to zero.

Then, check if the agent has any conspecifics in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one conspecific in the perception object. Calculation of the average distance to the conspecifics below is done by concurrent looping through the conspecifics within the perception object and calculating the distance from the agent.

This is done by checking the condition:

   if ( consp .below. this ) ...

Final average value is calculated, obviously, by division of the total distance by the count. In case the count is zero, also return commondata::missing mean.

Definition at line 8908 of file m_neuro.f90.

◆ perception_consp_dist_above_calculate()

elemental real(srp) function the_neurobio::perception_consp_dist_above_calculate ( class(perception), intent(in)  this)

Calculate the average distance to all conspecifics in the current perception object that are above the actor agent.

Returns
The average distance to conspecifics within the perception object that are located above (over) the actor agent.

Implementation details

First, initialise the return average and the counter to zero.

Then, check if the agent has any conspecifics in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one conspecific in the perception object. Calculation of the average distance to the conspecifics above is done by concurrent looping through the conspecifics within the perception object and calculating the distance from the agent.

This is done by checking the condition:

   if ( conspecific .above. this ) ...

The final average is calculated by dividing the total distance by the count. If the count is zero, the commondata::missing value is returned as the mean.

Definition at line 8961 of file m_neuro.f90.

◆ perception_predator_dist_below_calculate()

elemental real(srp) function the_neurobio::perception_predator_dist_below_calculate ( class(perception), intent(in)  this)

Calculate the average distance to all predators in the current perception object that are below the actor agent.

Returns
The average distance to predators within the perception object that are located below (under) the actor agent.

Implementation details

First, initialise the return average and the counter to zero.

Then, check if the agent has any predators in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one predator in the perception object. Calculation of the average distance to the predators below is done by concurrent looping through the predators within the perception object and calculating the distance from the agent.

This is done by checking the condition:

   if ( predator .below. this ) ...

The final average is calculated by dividing the total distance by the count. If the count is zero, the commondata::missing value is returned as the mean.

Definition at line 9014 of file m_neuro.f90.

◆ perception_predator_dist_above_calculate()

elemental real(srp) function the_neurobio::perception_predator_dist_above_calculate ( class(perception), intent(in)  this)

Calculate the average distance to all predators in the current perception object that are above the actor agent.

Returns
The average distance to predators within the perception object that are located above (over) the actor agent.

Implementation details

First, initialise the return average and the counter to zero.

Then, check if the agent has any predators in the perception; if not, return zero straight away.

From now on it is assumed that the agent has at least one predator in the perception object. Calculation of the average distance to the predators above is done by concurrent looping through the predators within the perception object and calculating the distance from the agent.

This is done by checking the condition:

   if ( predator .above. this ) ...

The final average is calculated by dividing the total distance by the count. If the count is zero, the commondata::missing value is returned as the mean.

Definition at line 9067 of file m_neuro.f90.

◆ predator_capture_probability_calculate_spatobj()

real(srp) function the_neurobio::predator_capture_probability_calculate_spatobj ( class(perception), intent(in)  this,
class(spatialobj_percept_comp), intent(in)  this_predator,
real(srp), intent(in)  attack_rate,
logical, intent(in), optional  is_freezing,
integer, intent(in), optional  time_step_model 
)

Calculate the probability of attack and capture of the this agent by the predator this_predator. This probability is a function of the distance between the predator and the agent and is calculated by the predator-class-bound procedure the_environment::predator::risk_fish(). Example call:

   risk=proto_parents%individual(ind)%risk_pred(                          &
     proto_parents%individual(ind)%perceive_predator%predators_seen(i),   &
     proto_parents%individual(ind)%perceive_predator%predators_attack_rates(i))
Note
Note that this version of the procedure accepts the this_predator parameter as class the_neurobio::spatialobj_percept_comp, which is used for keeping the predator representations in the perception object. This representation keeps two separate arrays for the_neurobio::spatialobj_percept_comp spatial objects and the attack rates.
Parameters
[in] this_predator the predator that is about to attack the agent.
Note
Note that the predator has the SPATIALOBJ_PERCEPT_COMP type that is used in the predator perception object
Parameters
[in] attack_rate the attack rate of the predator.
Note
Note that the predator perception object keeps a separate array of the attack rate.
Parameters
[in] is_freezing optional logical flag indicating that the fish prey agent is immobile (freezing), which results in reduced predation risk. Default value is FALSE.
[in] time_step_model optional time step of the model; if absent, it is set from the current time step commondata::global_time_step_model_current.

Checks

First, check if the agent has any predators and return zero and exit if there are no predators in the agent's perception object.

Note
This assumes that the predator is much larger than the agent, so the visual range the agent has for detecting the predator is longer than the visual range of the predator for detecting the prey agent.
Warning
The version working with the agent's perception component the_neurobio::predator_capture_probability_calculate_pred() returns a small non-zero probability of capture, in contrast to this version accepting the this_predator object as class SPATIALOBJ_PERCEPT_COMP. This is because the former normally calculates the objective predation risk, whereas this version calculates the subjective risk within the agent's perception. The agent cannot be aware of a predator that is outside of its perception.

Second, check the optional time step parameter. If unset, use the global variable commondata::global_time_step_model_current.

Third, create a temporary PREDATOR type object using the standard method make. The body size and the spatial position are obtained directly from the this_predator object. However, the attack rate is obtained from the second dummy argument attack_rate to this procedure.

Implementation

Calculate the distance between the agent and predator.

Set the debug plot file name that will be passed to the predator-class-bound function the_environment::predator::risk_fish().

Calculate the probability of capture of the this prey agent by the predator. See the_environment::predator::risk_fish() for the details of the calculation.
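The implementation steps above can be condensed into the following sketch. Only the overall sequence (build a temporary PREDATOR, then delegate the risk calculation to risk_fish) follows the documented logic; the exact keyword arguments of make() and risk_fish() shown here are assumptions for illustration, not the verbatim signatures from the source.

    ! Sketch only: build a temporary predator object from the perception
    ! data and the separately stored attack rate, then delegate the risk
    ! calculation. Argument names of make() and risk_fish() are assumed.
    type(PREDATOR) :: tmp_predator

    call tmp_predator%make( location    = this_predator%location(),      &
                            body_size   = this_predator%get_size(),      &
                            attack_rate = attack_rate )

    risk = tmp_predator%risk_fish( prey_spatial = this%location(),       &
                                   prey_length  = this%get_length(),     &
                                   is_freezing  = is_freezing,           &
                                   time_step_model = time_step_here )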

Definition at line 9134 of file m_neuro.f90.

◆ predator_capture_probability_calculate_pred()

real(srp) function the_neurobio::predator_capture_probability_calculate_pred ( class(perception), intent(in)  this,
class(predator), intent(in)  this_predator,
logical, intent(in), optional  is_freezing,
integer, intent(in), optional  time_step_model 
)

Calculate the probability of attack and capture of the this agent by the predator this_predator. This probability is a function of the distance between the predator and the agent and is calculated by the predator-class-bound procedure the_environment::predator::risk_fish().

Note
Note that this version of the procedure accepts this_predator parameter as class the_neurobio::predator, i.e. for the objective predator object.
Parameters
[in] this_predator the predator that is about to attack the agent.
[in] is_freezing optional logical flag indicating that the fish prey agent is immobile (freezing), which results in reduced predation risk. Default value is FALSE.
[in] time_step_model optional time step of the model; if absent, it is set from the current time step commondata::global_time_step_model_current.

Checks

First, check if the agent has any predators in the perception object. Return a near-zero value defined by the commondata::predator_attack_capture_probability_min parameter constant, and exit if there are no predators in the agent's perception object.

Note
This assumes that the predator is much larger than the agent, so the visual range the agent has for detecting the predator is longer than the visual range of the predator for detecting the prey agent.
Warning
The version working with the agent's perception component the_neurobio::predator_capture_probability_calculate_spatobj() returns zero probability, in contrast to this version accepting the this_predator object as type PREDATOR. This is because the former normally calculates the subjective assessment of the predation risk, whereas this version calculates the objective risk.

Second, check the optional time step parameter. If unset, use the global variable commondata::global_time_step_model_current.

Implementation

Calculate the distance between the agent and predator.

Set the debug plot file name that will be passed to the predator-class-bound function the_environment::predator::risk_fish().

Calculate the probability of capture of the this prey agent by the predator. See the_environment::predator::risk_fish() for the details of the calculation.

Definition at line 9253 of file m_neuro.f90.

◆ predation_capture_probability_risk_wrapper()

real(srp) function the_neurobio::predation_capture_probability_risk_wrapper ( class(perception), intent(in)  this,
logical, intent(in), optional  is_freezing 
)

Calculate the overall direct predation risk for the agent, i.e. the probability of attack and capture by the nearest predator.

Parameters
[in] is_freezing optional logical flag indicating that the fish prey agent is immobile (freezing), which results in reduced predation risk. Default value is FALSE.
Returns
Returns the probability of capture by the nearest predator.

Definition at line 9344 of file m_neuro.f90.

◆ get_prop_size()

elemental real(srp) function the_neurobio::get_prop_size ( class(spatial), intent(in)  this)

Get the body size property of a polymorphic object. The object can be one of the following extensions of the basic the_environment::spatial class:

Note
Other specific classes can be similarly implemented.
Warning
This is not a type-bound function because the base class the_environment::spatial is defined in a different down-level module. Usage: M = get_prop_size(object).
Returns
the body size of the input the_environment::spatial class object.

Implementation notes

Get the properties of the conspecific from the perception object or from the real physical conspecific data. This is done by determining the dynamic type of this with the "select type" construct.
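The dispatch can be illustrated with a short "select type" sketch. The extension types and getter names listed here are assumptions chosen for illustration; the actual list of supported classes is given in m_neuro.f90.

    ! Sketch: resolve the dynamic type of the polymorphic `this` object
    ! and obtain the body size accordingly. Type and getter names assumed.
    select type (this)
      class is (CONSPEC_PERCEPT_COMP)   ! conspecific within the perception
        size_get = this%get_size()
      class is (CONDITION)              ! real physical agent data
        size_get = this%get_length()
      class default
        size_get = MISSING              ! unsupported class: missing value
    end select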

Definition at line 9384 of file m_neuro.f90.

◆ get_prop_mass()

elemental real(srp) function the_neurobio::get_prop_mass ( class(spatial), intent(in)  this)

Get the body mass property of a polymorphic object. The object can be one of the following extensions of the basic the_environment::spatial class:

Note
Other specific classes can be similarly implemented.
Warning
This is not a type-bound function because the base class the_environment::spatial is defined in a different down-level module. Usage: M = get_prop_mass(object).
Returns
the body mass of the input the_environment::spatial class object.

Implementation notes

Get the properties of the conspecific from the perception object or from the real physical conspecific data. This is done by determining the dynamic type of this with the "select type" construct.

Definition at line 9425 of file m_neuro.f90.

Variable Documentation

◆ modname

character (len=*), parameter, private the_neurobio::modname = "(THE_NEUROBIO)"

Definition at line 25 of file m_neuro.f90.