# hazelnutcloud/solar-10.7b-instruct-uncensored:4a52238e

## Input schema

The fields you can use to run this model with an API. If you don't give a value for a field, its default value will be used.
| Field | Type | Default value | Description |
|---|---|---|---|
| prompt | string | `Tell me a random fact about the universe. Did you know that ` | The prompt to generate text from. |
| max_tokens | integer | 128 | The maximum number of tokens to generate. If max_tokens <= 0 or None, the number of tokens to generate is unlimited and depends on n_ctx. |
| temperature | number | 0.8 | The temperature to use for sampling. |
| top_p | number | 0.95 | The top-p value to use for nucleus sampling. |
| top_k | number | 40 | The top-k value to use for top-k sampling. |
| presence_penalty | number | 0 | The penalty to apply to tokens based on their presence in the prompt. |
## Output schema

The shape of the response you'll get when you run this model with an API.
### Schema

```json
{
  "items": {"type": "string"},
  "title": "Output",
  "type": "array",
  "x-cog-array-display": "concatenate",
  "x-cog-array-type": "iterator"
}
```
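Per the schema, the output is an array of strings delivered as an iterator (`x-cog-array-type: iterator`) whose chunks should be joined in order (`x-cog-array-display: concatenate`). The sketch below simulates that with a plain Python iterator; the `chunks` data is made up for illustration, and a real run would stream from the API instead.

```python
# Sketch: consume the model's streamed output. The schema declares an
# iterator of string chunks meant to be concatenated, so joining them
# in arrival order reconstructs the full completion.

def collect_text(chunks):
    """Concatenate an iterator of string chunks into the full completion."""
    return "".join(chunks)

# Simulated stream standing in for the API's chunk iterator.
chunks = iter(["Did you know ", "that the universe ", "is expanding?"])
text = collect_text(chunks)
print(text)  # Did you know that the universe is expanding?
```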