How to describe query parameters in DRF Docs?

I'm using Django REST Framework v3.9's built-in interactive documentation.
I have a URL that needs query parameters for GET, such as:
../jobs/?order_choice=0&city=1&region=0
But I don't know how to document it in the interactive documentation.
I tried to add the parameters via the view's docstring, like this:
class JobListView(APIView):
    """
    get:
    - order_choices
    - city
    - region
    - job_type
    """
but it prints everything on one line:
- order_choices - city - region - job_type
These are my parameters:
params_data = {
    'city': request.query_params.get('city', None),
    'region': request.query_params.get('region', None),
    'job_type': request.query_params.get('job_type', None),
    'status': 1,
}
I want to know how to document this correctly.

Try something like below ...
class PackageViewSet(viewsets.ModelViewSet):
    """
    ** Query Parameters **

    `page` - get data of a particular page.
    `page_size` - change total objects in a page (default=20).

    ** Filter Parameters **

    `status` - `1/2`

    ** Search Parameters **

    `name` - `search by package name`

    ** Ordering Parameters **

    `name`
    `created_at`
    `updated_at`

    ** Default Ordering **

    `-created_at`

    `Method Allowed`

    `GET -` `Lists all the Packages of a facility/company.`
    `POST -` `Creates Package for a facility.`
    `PUT -` `Updates a Package.`
    `DELETE -` `deletes a Package.`

    `POST/Create, PUT/Update`

        {
            "name": "package one",
            "description": "package one",
            "status": 1  // 1- Active, 2-Inactive
        }
    """


Cannot import technical library - freqtrade

It seems to be a problem with where Python is searching for the library and not finding it.
Still, I'm very new at this, so it may be something else.
This is the error (I separated out the middle part, where I think the problem shows):
ftuser@a5a1d3ed08d3:/freqtrade$ freqtrade backtesting --strategy canal
2022-08-26 03:51:37,394 - freqtrade.configuration.load_config - INFO - Using config: user_data/config.json ...
2022-08-26 03:51:37,483 - freqtrade.loggers - INFO - Verbosity set to 0
2022-08-26 03:51:37,484 - freqtrade.configuration.configuration - INFO - Using max_open_trades: 1 ...
2022-08-26 03:51:37,716 - freqtrade.configuration.configuration - INFO - Using user-data directory: /freqtrade/user_data ...
2022-08-26 03:51:37,718 - freqtrade.configuration.configuration - INFO - Using data directory: /freqtrade/user_data/data/binance ...
2022-08-26 03:51:37,719 - freqtrade.configuration.configuration - INFO - Parameter --cache=day detected ...
2022-08-26 03:51:37,719 - freqtrade.configuration.check_exchange - INFO - Checking exchange...
2022-08-26 03:51:37,741 - freqtrade.configuration.check_exchange - INFO - Exchange "binance" is officially supported by the Freqtrade development team.
2022-08-26 03:51:37,741 - freqtrade.configuration.configuration - INFO - Using pairlist from configuration.
2022-08-26 03:51:37,741 - freqtrade.configuration.config_validation - INFO - Validating configuration ...
2022-08-26 03:51:37,746 - freqtrade.commands.optimize_commands - INFO - Starting freqtrade in Backtesting mode
2022-08-26 03:51:37,746 - freqtrade.exchange.exchange - INFO - Instance is running with dry_run enabled
2022-08-26 03:51:37,746 - freqtrade.exchange.exchange - INFO - Using CCXT 1.92.20
2022-08-26 03:51:37,746 - freqtrade.exchange.exchange - INFO - Applying additional ccxt config: {'options': {'defaultType': 'future'}}
2022-08-26 03:51:37,766 - freqtrade.exchange.exchange - INFO - Applying additional ccxt config: {'options': {'defaultType': 'future'}}
2022-08-26 03:51:37,782 - freqtrade.exchange.exchange - INFO - Using Exchange "Binance"
2022-08-26 03:51:39,052 - freqtrade.resolvers.exchange_resolver - INFO - Using resolved exchange 'Binance'...
2022-08-26 03:51:39,097 - freqtrade.resolvers.iresolver - WARNING - Could not import /freqtrade/user_data/strategies/canal.py due to 'cannot import name 'SSLchannels' from 'technical.indicators' (/home/ftuser/.local/lib/python3.10/site-packages/technical/indicators/__init__.py)'
2022-08-26 03:51:39,182 - freqtrade - ERROR - Impossible to load Strategy 'canal'. This class does not exist or contains Python code errors.
2022-08-26 03:51:39,182 - freqtrade.exchange.exchange - INFO - Closing async ccxt session.
This is the code in VS Code:
import numpy as np  # noqa
import pandas as pd  # noqa
from pandas import DataFrame

from freqtrade.strategy import (BooleanParameter, CategoricalParameter, DecimalParameter,
                                IStrategy, IntParameter)

# --------------------------------
# Add your lib to import here
import talib.abstract as ta
import freqtrade.vendor.qtpylib.indicators as qtpylib
from technical.indicators import SSLchannels
# This class is a sample. Feel free to customize it.
class canal(IStrategy):
    INTERFACE_VERSION = 3

    # Can this strategy go short?
    can_short: bool = False

    # Minimal ROI designed for the strategy.
    # This attribute will be overridden if the config file contains "minimal_roi".
    minimal_roi = {
        "60": 0.01,
        "30": 0.02,
        "0": 0.04
    }

    # Optimal stoploss designed for the strategy.
    # This attribute will be overridden if the config file contains "stoploss".
    stoploss = -0.10

    # Trailing stoploss
    trailing_stop = False
    # trailing_only_offset_is_reached = False
    # trailing_stop_positive = 0.01
    # trailing_stop_positive_offset = 0.0  # Disabled / not configured

    # Optimal timeframe for the strategy.
    timeframe = '5m'

    # Run "populate_indicators()" only for new candle.
    process_only_new_candles = True

    # These values can be overridden in the config.
    use_exit_signal = True
    exit_profit_only = False
    ignore_roi_if_entry_signal = False

    buy_rsi = IntParameter(low=1, high=50, default=30, space='buy', optimize=True, load=True)
    sell_rsi = IntParameter(low=50, high=100, default=70, space='sell', optimize=True, load=True)
    short_rsi = IntParameter(low=51, high=100, default=70, space='sell', optimize=True, load=True)
    exit_short_rsi = IntParameter(low=1, high=50, default=30, space='buy', optimize=True, load=True)

    # Number of candles the strategy requires before producing valid signals
    startup_candle_count: int = 30

    # Optional order type mapping.
    order_types = {
        'entry': 'limit',
        'exit': 'limit',
        'stoploss': 'market',
        'stoploss_on_exchange': False
    }

    # Optional order time in force.
    order_time_in_force = {
        'entry': 'gtc',
        'exit': 'gtc'
    }

    plot_config = {
        'main_plot': {
            'tema': {},
            'sar': {'color': 'white'},
        },
        'subplots': {
            "MACD": {
                'macd': {'color': 'blue'},
                'macdsignal': {'color': 'orange'},
            },
            "RSI": {
                'rsi': {'color': 'red'},
            }
        }
    }

    def informative_pairs(self):
        """
        Define additional, informative pair/interval combinations to be cached from the exchange.
        These pair/interval combinations are non-tradeable, unless they are part
        of the whitelist as well.
        For more information, please consult the documentation
        :return: List of tuples in the format (pair, interval)
            Sample: return [("ETH/USDT", "5m"),
                            ("BTC/USDT", "15m"),
                            ]
        """
        return []

    def populate_indicators(self, dataframe: DataFrame, metadata: dict) -> DataFrame:
        # RSI
        dataframe['rsi'] = ta.RSI(dataframe)
        return dataframe

    def populate_entry_trend(self, dataframe: DataFrame, metadata: dict) -> DataFrame:
        dataframe.loc[
            (
                # Signal: RSI crosses above 30
                (qtpylib.crossed_above(dataframe['rsi'], self.buy_rsi.value)) &
                (dataframe['volume'] > 0)  # Make sure Volume is not 0
            ),
            'enter_long'] = 1
        dataframe.loc[
            (
                # Signal: RSI crosses above 70
                (qtpylib.crossed_above(dataframe['rsi'], self.short_rsi.value)) &
                (dataframe['volume'] > 0)  # Make sure Volume is not 0
            ),
            'enter_short'] = 1
        return dataframe

    def populate_exit_trend(self, dataframe: DataFrame, metadata: dict) -> DataFrame:
        dataframe.loc[
            (
                # Signal: RSI crosses above 70
                (qtpylib.crossed_above(dataframe['rsi'], self.sell_rsi.value)) &
                (dataframe['volume'] > 0)  # Make sure Volume is not 0
            ),
            'exit_long'] = 1
        dataframe.loc[
            (
                # Signal: RSI crosses above 30
                (qtpylib.crossed_above(dataframe['rsi'], self.exit_short_rsi.value)) &
                # Guard: tema below BB middle
                (dataframe['volume'] > 0)  # Make sure Volume is not 0
            ),
            'exit_short'] = 1
        return dataframe
I left the RSI indicator in so that I could comment out:
#from technical.indicators import SSLchannels
and test that the rest of the code is OK, and it works: the backtest runs fine.
Here's how I have the folders on my PC.
I also tried selecting Python 3.8 and 3.10 in VS Code just to try; both work well if I take out the technical library, and both show the error if I put it back.
Any help would be appreciated.
Thanks!
I would think that you either need to pip install the technical library or use Docker. I prefer using Docker to execute freqtrade commands, as the images already have all the dependencies installed.
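When an import works in one environment but fails in another, it helps to confirm which interpreter is resolving imports and from where. A small diagnostic sketch (not from the answers above; `json` is used below only so the example runs anywhere, while `technical` is the module from the question's error):

```python
import importlib.util
import sys

def where_is(module_name):
    # Report the running interpreter and the path a module would load from
    # (None if the module cannot be found on sys.path).
    spec = importlib.util.find_spec(module_name)
    return sys.executable, (spec.origin if spec else None)

exe, path = where_is("json")  # replace "json" with "technical" to debug locally
print(exe)
print(path)
```

Run this with the same interpreter VS Code selects; if the reported path is None, or points at a different site-packages directory than the one pip installed into, that mismatch is the problem.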

Airflow Hash "#" in day-of-week field not running appropriately

Hi, we are having trouble scheduling a task with a `#` in the cron config which, according to the standard, selects the first, second, third, fourth, or fifth given weekday of the month.
Hash (#)
'#' is allowed for the day-of-week field, and must be followed by a number between one and five. It allows specifying constructs such as "the second Friday" of a given month.[19] For example, entering "5#3" in the day-of-week field corresponds to the third Friday of every month.
As is mention here https://en.wikipedia.org/wiki/Cron
Here is my DAG config. As you can see, this should have run today, the second Monday of the month, and it didn't run. I have a variant like this for every day of the month, and only on a few occasions has the task run successfully. What am I doing wrong?
with DAG(dag_id='dag_126_73618_3012xx1the2', default_args=args, concurrency=3,
         catchup=True, tags=['account_126', 'form'], schedule_interval='30 12 * * 1#2',
         start_date=datetime(2021, 8, 1, 0, 0)) as dag:
Here is my complete config
from airflow import DAG
from airflow.utils.dates import days_ago, datetime, timedelta
from lkf_operator import LKFLogin, CreateAndAssignTask

args = {
    'owner': 'airflow',
    'email': ['xxx@linkaform.com', 'xxx@linkaform.com'],
    'email_on_failure': True,
    'retries': 3,
    'retry_delay': timedelta(seconds=30),
}

params = {
    'username': 'xxx@linkaform.com',
    'api_key': 'xxxxxx',
}

with DAG(dag_id='dag_126_73618_3012xx1the2', default_args=args, concurrency=3,
         catchup=True, tags=['account_126', 'form'],
         schedule_interval='30 12 * * 1#2',
         start_date=datetime(2021, 8, 1, 0, 0)) as dag:

    do_run_lkf_login_73618_1 = LKFLogin(
        name='LKF Login',
        task_id='run_lkf_login_73618_1',
        params=params
    )

    do_run_segundo_lunes_del_mes_73618_2 = CreateAndAssignTask(
        name='Segundo Lunes del Mes',
        task_id='run_segundo_lunes_del_mes_73618_2',
        params={'form_id': 73618, 'assinge_user_id': 126,
                'answers': {'61148df86f960aaaa3e4e445': 71925,
                            '611492cc5a6316b13ee4e3f2': 'Segundo Lunes del Mes',
                            '61148df86f960aaaa3e4e446': '{% $today %}',
                            '61148df86f960aaaa3e4e447': 'pendiente'}}
    )

    do_run_lkf_login_73618_1.set_downstream([do_run_segundo_lunes_del_mes_73618_2])
What is stranger is that on some days it works and on others it doesn't, and it is the same code except for one or two strings.
I am afraid Airflow does not implement the complete crontab specification with all its flavours. You can check https://crontab.guru/ , which nicely summarizes the crontab subset implemented by Airflow.
However, this problem will largely be solved very soon, in Airflow 2.2. See https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-39+Richer+scheduler_interval - in about a month, when Airflow 2.2 is out, you will be able to use a much more flexible and easier-to-define schedule.
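Until then, one workaround (a sketch, not part of the original answer) is to schedule the DAG weekly on Mondays (`30 12 * * 1`) and skip runs that do not fall on the second Monday. The check itself is plain date arithmetic: the nth occurrence of a weekday always lands on days 7*(n-1)+1 through 7*n of the month.

```python
from datetime import date

def is_nth_weekday(d: date, weekday: int, n: int) -> bool:
    # weekday: Monday=0 ... Sunday=6; n: 1-based occurrence within the month.
    # The nth occurrence of a weekday falls on day 7*(n-1)+1 .. 7*n.
    return d.weekday() == weekday and 7 * (n - 1) < d.day <= 7 * n

# 2021-08-09 was the second Monday of August 2021
print(is_nth_weekday(date(2021, 8, 9), weekday=0, n=2))  # True
```

Inside the DAG, this check could feed something like a ShortCircuitOperator so downstream tasks only run on the matching Monday.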

Optional job parameter in AWS Glue?

How can I implement an optional parameter to an AWS Glue Job?
I have created a job that currently has a string parameter (an ISO 8601 date string) as an input that is used in the ETL job. I would like to make this parameter optional, so that the job uses a default value if it is not provided (e.g. using datetime.now and datetime.isoformat in my case). I have tried using getResolvedOptions:
import sys
from awsglue.utils import getResolvedOptions
args = getResolvedOptions(sys.argv, ['ISO_8601_STRING'])
However, when I am not passing an --ISO_8601_STRING job parameter I see the following error:
awsglue.utils.GlueArgumentError: argument --ISO_8601_STRING is required
matsev's and Yuriy's solutions are fine if you have only one optional field.
I wrote a wrapper function for Python that is more generic and handles different corner cases (mandatory fields and/or optional fields with values).
import sys
from awsglue.utils import getResolvedOptions

def get_glue_args(mandatory_fields, default_optional_args):
    """
    This is a wrapper around the Glue function getResolvedOptions to take care of the following cases:
    * Handling optional and/or mandatory arguments
    * Optional arguments with a default value
    NOTE:
    * DO NOT USE '-' when defining args, as getResolvedOptions will replace it with '_'
    * All fields are returned as strings by getResolvedOptions
    Arguments:
        mandatory_fields {list} -- list of mandatory fields for the job
        default_optional_args {dict} -- dict of optional fields with their default values
    Returns:
        dict -- the given args, with default values for optional args that were not passed
    """
    # The Glue args are available in sys.argv with an extra '--'
    given_optional_fields_key = list(set([i[2:] for i in sys.argv]).intersection([i for i in default_optional_args]))
    args = getResolvedOptions(sys.argv,
                              mandatory_fields + given_optional_fields_key)
    # Overwrite default values if optional args are provided
    default_optional_args.update(args)
    return default_optional_args
Usage:
# Defining mandatory/optional args
mandatory_fields = ['my_mandatory_field_1', 'my_mandatory_field_2']
default_optional_args = {'optional_field_1': 'myvalue1', 'optional_field_2': 'myvalue2'}

# Retrieve args
args = get_glue_args(mandatory_fields, default_optional_args)

# Access elements as dict entries, e.g. args['key']
Porting Yuriy's answer to Python solved my problem:
if '--{}'.format('ISO_8601_STRING') in sys.argv:
    args = getResolvedOptions(sys.argv, ['ISO_8601_STRING'])
else:
    args = {'ISO_8601_STRING': datetime.datetime.now().isoformat()}
There is a workaround to have optional parameters. The idea is to examine the arguments before resolving them (Scala):
val argName = "ISO_8601_STRING"
var argValue: String = null
if (sysArgs.contains(s"--$argName"))
  argValue = GlueArgParser.getResolvedOptions(sysArgs, Array(argName))(argName)
I don't see a way to have optional parameters, but you can specify default parameters on the job itself, and then if you don't pass that parameter when you run the job, your job will receive the default value (note that the default value can't be blank).
Wrapping matsev's answer in a function:
def get_glue_env_var(key, default="none"):
    if f'--{key}' in sys.argv:
        return getResolvedOptions(sys.argv, [key])[key]
    else:
        return default
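The presence check that the snippets above rely on can be exercised without a Glue runtime. Below is a self-contained sketch of the same pattern (`resolve_optional` and the fake `argv` lists are illustrative names, not part of the Glue API): only read the argument when its `--KEY` flag is actually present, otherwise fall back to a default.

```python
import datetime

def resolve_optional(argv, key, default):
    # Same idea as the Glue answers above: getResolvedOptions raises
    # GlueArgumentError for a missing argument, so check argv for '--KEY' first.
    flag = '--' + key
    for i, token in enumerate(argv):
        if token == flag and i + 1 < len(argv):
            return argv[i + 1]             # '--KEY value' style
        if token.startswith(flag + '='):
            return token.split('=', 1)[1]  # '--KEY=value' style
    return default

# Flag absent: fall back to the current timestamp
default = datetime.datetime.now().isoformat()
print(resolve_optional(['job.py'], 'ISO_8601_STRING', default) == default)  # True

# Flag present: the passed value wins
print(resolve_optional(['job.py', '--ISO_8601_STRING', '2022-01-01T00:00:00'],
                       'ISO_8601_STRING', default))  # 2022-01-01T00:00:00
```

In a real job you would pass `sys.argv` instead of a literal list, and still call getResolvedOptions for the mandatory arguments.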
It's possible to create a Step Function that starts the same Glue job with different parameters. The state machine starts with a Choice state and passes a different set of arguments depending on which inputs are present.
stepFunctions:
  stateMachines:
    taskMachine:
      role:
        Fn::GetAtt: [TaskExecutor, Arn]
      name: ${self:service}-${opt:stage}
      definition:
        StartAt: DefaultOrNot
        States:
          DefaultOrNot:
            Type: Choice
            Choices:
              - Variable: "$.optional_input"
                IsPresent: false
                Next: DefaultTask
              - Variable: "$.optional_input"
                IsPresent: true
                Next: OptionalTask
          OptionalTask:
            Type: Task
            Resource: "arn:aws:states:::glue:startJobRun.sync"
            Parameters:
              JobName: ${self:service}-${opt:stage}
              Arguments:
                '--log_group.$': "$.specs.log_group"
                '--log_stream.$': "$.specs.log_stream"
                '--optional_input.$': "$.optional_input"
            Catch:
              - ErrorEquals: ['States.TaskFailed']
                ResultPath: "$.errorInfo"
                Next: TaskFailed
            Next: ExitExecution
          DefaultTask:
            Type: Task
            Resource: "arn:aws:states:::glue:startJobRun.sync"
            Parameters:
              JobName: ${self:service}-${opt:stage}
              Arguments:
                '--log_group.$': "$.specs.log_group"
                '--log_stream.$': "$.specs.log_stream"
            Catch:
              - ErrorEquals: ['States.TaskFailed']
                ResultPath: "$.errorInfo"
                Next: TaskFailed
            Next: ExitExecution
          TaskFailed:
            Type: Fail
            Error: "Failure"
          ExitExecution:
            Type: Pass
            End: True
If you're using the interface, you must provide your parameter names starting with "--", like "--TABLE_NAME" rather than "TABLE_NAME"; you can then use them as in the following (Python) code:
args = getResolvedOptions(sys.argv, ['JOB_NAME', 'TABLE_NAME'])
table_name = args['TABLE_NAME']

Fixtures in Golang testing

Coming from the Python world, fixtures are very useful (fixtures define a Python contract for reusable state and support logic, primarily for unit testing). I was wondering if there is similar support in Golang which would allow me to run my tests with some predefined fixtures, like setting up a server, tearing it down, and doing some repeated tasks each time a test is run. Can someone point me to some examples of doing the same in Golang?
If you want to use the standard Go testing tools, you can define a function with the signature TestMain(m *testing.M) and put your fixture code in there.
From the testing package wiki:
It is sometimes necessary for a test program to do extra setup or teardown before or after testing. It is also sometimes necessary for a test to control which code runs on the main thread. To support these and other cases, if a test file contains a function:
func TestMain(m *testing.M)
then the generated test will call TestMain(m) instead of running the tests directly. TestMain runs in the main goroutine and can do whatever setup and teardown is necessary around a call to m.Run. It should then call os.Exit with the result of m.Run. When TestMain is called, flag.Parse has not been run. If TestMain depends on command-line flags, including those of the testing package, it should call flag.Parse explicitly.
A simple implementation of TestMain is:
func TestMain(m *testing.M) {
    flag.Parse()
    os.Exit(m.Run())
}
I know this is an old question, but this still came up in a search result so I thought I'd give a possible answer.
You can isolate code out to helper functions that return a "teardown" function to clean up after itself. Here's one possible way to go about starting a server and have it close at the end of the test case.
func setUpServer() (string, func()) {
    h := func(w http.ResponseWriter, r *http.Request) {
        code := http.StatusTeapot
        http.Error(w, http.StatusText(code), code)
    }
    ts := httptest.NewServer(http.HandlerFunc(h))
    return ts.URL, ts.Close
}

func TestWithServer(t *testing.T) {
    u, close := setUpServer()
    defer close()

    rsp, err := http.Get(u)
    assert.Nil(t, err)
    assert.Equal(t, http.StatusTeapot, rsp.StatusCode)
}
This starts up a server with net/http/httptest and returns its URL along with a function that acts as the "teardown." This function is added to the defer stack so it's always called regardless of how the test case exits.
Optionally, you can pass in the *testing.T if you have a more complicated set-up going on in there and need to handle errors. This example shows the set-up function returning a *url.URL instead of a URL-formatted string, since the parse can possibly return an error.
func setUpServer(t *testing.T) (*url.URL, func()) {
    h := func(w http.ResponseWriter, r *http.Request) {
        code := http.StatusTeapot
        http.Error(w, http.StatusText(code), code)
    }
    ts := httptest.NewServer(http.HandlerFunc(h))
    u, err := url.Parse(ts.URL)
    assert.Nil(t, err)
    return u, ts.Close
}

func TestWithServer(t *testing.T) {
    u, close := setUpServer(t)
    defer close()

    u.Path = "/a/b/c/d"
    rsp, err := http.Get(u.String())
    assert.Nil(t, err)
    assert.Equal(t, http.StatusTeapot, rsp.StatusCode)
}
I wrote a Go engine for using fixtures, similar to pytest:
https://github.com/rekby/fixenv
Usage example:
package example

// db creates the database and db struct, cached per package: it is called
// once, and the same db is shared by all tests
func db(e Env) *DB { ... }

// DbCustomer creates a customer with random personal data
// but a fixed name. The fixture result is shared by a test and its subtests,
// which means many calls to DbCustomer with the same name will return the
// same customer object.
// Calling DbCustomer with another name will create a new customer
// and return another object.
func DbCustomer(e Env, name string) Customer {
    // ... create customer
    db(e).CustomerStore(cust)
    // ...
    return cust
}

// DbAccount creates a bank account for the customer with the given name.
func DbAccount(e Env, customerName, accountName string) Account {
    cust := DbCustomer(e, customerName)
    // ... create account
    db(e).AccountStore(acc)
    // ...
    return acc
}

func TestFirstOwnAccounts(t *testing.T) {
    e := NewEnv(t)
    // background:
    // create database
    // create customer bob
    // create account from
    accFrom := DbAccount(e, "bob", "from")

    // get existing db, get existing bob, create account to
    accTo := DbAccount(e, "bob", "to")

    PutMoney(accFrom, 100)
    SendMoney(accFrom, accTo, 20)
    if accFrom != 80 {
        t.Error()
    }
    if accTo != 20 {
        t.Error()
    }

    // background:
    // delete account to
    // delete account from
    // delete customer bob
}

func TestSecondTransferBetweenCustomers(t *testing.T) {
    e := NewEnv(t)
    // background:
    // get db, existing from the previous test
    // create customer bob
    // create account main for bob
    accFrom := DbAccount(e, "bob", "main")

    // background:
    // get existing db
    // create customer alice
    // create account main for alice
    accTo := DbAccount(e, "alice", "main")

    PutMoney(accFrom, 100)
    SendMoney(accFrom, accTo, 20)
    if accFrom != 80 {
        t.Error()
    }
    if accTo != 20 {
        t.Error()
    }

    // background:
    // remove account of alice
    // remove customer alice
    // remove account of bob
    // remove customer bob
}

// background:
// after all tests finish, drop the database

Django - How to display Scientific Notation on Admin Page Field?

I have a field in my admin page that I'd like to display in Scientific Notation.
Right now it displays something ugly like this. How can I display this as 4.08E+13?
Right now I'm using a standard Decimal field in the model.
Any advice is greatly appreciated.
I'm on Django 1.2.
You have to use %e to get the scientific notation format:
Basic Example:
x = 374.534
print("%e" % x)
# 3.745340e+02
Precision of 2
x = 374.534
print("{0:.2E}".format(x))
# 3.75E+02
x = 12345678901234567890.534
print("{0:.2E}".format(x))
# 1.23E+19
Precision of 3
print("{0:.3E}".format(x))
# 1.235E+19
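The same format string can be wrapped in a small helper and reused anywhere the value is rendered, matching the 4.08E+13 example from the question (the helper name `to_sci` and any admin wiring around it are illustrative, not Django API):

```python
def to_sci(value, precision=2):
    # Render a number in scientific notation, e.g. 40800000000000 -> '4.08E+13'
    return "{0:.{1}E}".format(float(value), precision)

print(to_sci(40800000000000))  # 4.08E+13
```

In a ModelAdmin, this could back a computed column: a method on the admin class that returns to_sci(obj.my_field) and is listed in list_display (field name hypothetical), which formats the change list without any JavaScript.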
Well, here's a workaround, since I can't figure out how to do this within the Django Python code. I have the admin pages run some custom JavaScript to do the conversion after the page is loaded.
Details:
Create this javascript file called "decimal_to_sci_not.js" and place it in your media directory:
/*
Finds and converts decimal fields > N into scientific notation.
*/
THRESHOLD = 100000;
PRECISION = 3;

function convert(val) {
    // ex. 100000 -> 1.00e+5
    return parseFloat(val).toPrecision(PRECISION);
}

function convert_input_fields() {
    var f_inputs = django.jQuery('input');
    f_inputs.each(function (index, domEl) {
        var jEl = django.jQuery(this);
        var old_val = parseFloat(jEl.val());
        if (old_val >= THRESHOLD) {
            jEl.val(convert(old_val));
        }
    });
}

function convert_table_cells() {
    // Look through all td elements and replace the first n.n number found inside
    // if greater than or equal to THRESHOLD
    var cells = django.jQuery('td');
    var re_num = /\d+\.\d+/m;  // match any numbers with a decimal point
    cells.each(function (index, domEl) {
        var jEl = django.jQuery(this);
        var old_val_str = jEl.html().match(re_num);
        var old_val = parseFloat(old_val_str);
        if (old_val >= THRESHOLD) {
            jEl.html(jEl.html().replace(old_val_str, convert(old_val)));
        }
    });
}

django.jQuery(document).ready(function () {
    convert_input_fields();
    convert_table_cells();
});
Then update your admin.py classes to include the JavaScript file:
class MyModel1Admin(admin.ModelAdmin):
    class Media:
        js = ['/media/decimal_to_sci_not.js']

admin.site.register(MyModel1, MyModel1Admin)
