first commit

This commit is contained in:
yumoqing 2025-07-16 14:18:40 +08:00
commit 0b2584d6aa
38 changed files with 3684 additions and 0 deletions

426
README.md Executable file

@ -0,0 +1,426 @@
# ahserver
ahserver is an http(s) server based on the aiohttp asynchronous framework.
ahserver capabilities:
* user authentication and authorization support
* https support
* processors for registered file types
* pre-defined variables and functions that can be called by processors
* multiple database connections and connection pooling
* an easy way to wrap SQL
* configuration data read from a json file stored at ./conf/config.json
* uploaded files automatically saved under the config.filesroot folder
* i18n support
* processors include:
+ 'dspy': files with the '.dspy' suffix are processed as python scripts (a minimal example follows this list)
+ 'tmpl': files with the '.tmpl' suffix are processed as templates
+ 'md': files with the '.md' suffix are processed as markdown files
+ 'xlsxds': files with the '.xlsxds' suffix are processed as data sources backed by an xlsx file
+ 'sqlds': files with the '.sqlds' suffix are processed as data sources backed by a database via a sql command
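For example, a '.dspy' file is just the body of an async python function: ahserver wraps it, runs it with the pre-defined environment (params_kw, get_user(), DBPools(), ...) in scope, and JSON-encodes a returned dict or list. A minimal sketch, assuming a file called hello.dspy and an optional 'name' query parameter (both names are only illustrative):
```
# hello.dspy -- the whole file becomes the body of an async function
name = params_kw.get('name', 'world')
return {
    "greeting": f"hello, {name}"
}
```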
## python3.12 bug fix
We use aioredis, which imports distutils; from python 3.12 on distutils no longer exists, so aioredis needs a small patch.
### new module
```
pip install packaging
```
Two files need to be modified:
* aioredis/exceptions.py
* aioredis/connection.py

Replace line 11 of aioredis/connection.py with:
```
from packaging.version import Version as StrictVersion
```
Replace line 14 of aioredis/exceptions.py with:
```
class TimeoutError(asyncio.TimeoutError, RedisError):
```
## Requirements
see requirements.txt
* [pyutils](https://github.com/yumoqing/pyutils)
* [sqlor](https://github.com/yumoqing/sqlor)
## How to use
see ah.py
```
from ahserver.configuredServer import ConfiguredServer
if __name__ == '__main__':
    server = ConfiguredServer()
    server.run()
```
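ConfiguredServer also accepts an authentication class and a working directory (see ah.py and ahserver/configuredServer.py in this commit). A sketch with a custom permission check; MyAuthAPI, the /admin rule and the workdir path are only illustrative:
```
from ahserver.configuredServer import ConfiguredServer
from ahserver.auth_api import AuthAPI

class MyAuthAPI(AuthAPI):
    # example rule: anything under /admin requires a logged-in user
    async def checkUserPermission(self, request, user, path):
        if path.startswith('/admin'):
            return user is not None
        return True

if __name__ == '__main__':
    server = ConfiguredServer(MyAuthAPI, workdir='/path/to/app')
    server.run()
```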
## Folder structure
```
app
|- ah.py
|- ahserver/
|- conf/
|  |- config.json
|- i18n/
```
## Configuration file content
ahserver uses json for its configuration file; the following is a sample:
```
{
    "databases":{
        "aiocfae":{
            "driver":"aiomysql",
            "async_mode":true,
            "coding":"utf8",
            "dbname":"cfae",
            "kwargs":{
                "user":"test",
                "db":"cfae",
                "password":"test123",
                "host":"localhost"
            }
        },
        "cfae":{
            "driver":"mysql.connector",
            "coding":"utf8",
            "dbname":"cfae",
            "kwargs":{
                "user":"test",
                "db":"cfae",
                "password":"test123",
                "host":"localhost"
            }
        }
    },
    "website":{
        "paths":[
            ["$[workdir]$/../usedpkgs/antd","/antd"],
            ["$[workdir]$/../wolon",""]
        ],
        "host":"0.0.0.0",
        "port":8080,
        "coding":"utf-8",
        "ssl":{
            "crtfile":"$[workdir]$/conf/www.xxx.com.pem",
            "keyfile":"$[workdir]$/conf/www.xxx.com.key"
        },
        "indexes":[
            "index.html",
            "index.tmpl",
            "index.dspy",
            "index.md"
        ],
        "visualcoding":{
            "default_root":"/samples/vc/test",
            "userroot":{
                "ymq":"/samples/vc/ymq",
                "root":"/samples/vc/root"
            },
            "jrjpath":"/samples/vc/default"
        },
        "processors":[
            [".xlsxds","xlsxds"],
            [".sqlds","sqlds"],
            [".tmpl.js","tmpl"],
            [".tmpl.css","tmpl"],
            [".html.tmpl","tmpl"],
            [".tmpl","tmpl"],
            [".dspy","dspy"],
            [".md","md"]
        ]
    },
    "langMapping":{
        "zh-Hans-CN":"zh-cn",
        "zh-CN":"zh-cn",
        "en-us":"en",
        "en-US":"en"
    }
}
```
### database configuration
The packages ahserver uses for each database engine are:
* oracle: cx_Oracle
* mysql: mysql-connector
* postgresql: psycopg2
* sql server: pymssql

You can change them, but then you must set the "driver" value in the database connection definition to the new package name.
In the "databases" section of config.json you can define one or more database connections, and multiple database engines (ORACLE, mysql, postgreSQL, ...) can be used side by side.
To define a database connection, follow the json formats below.
* mysql or mariadb
```
"metadb":{
"driver":"mysql.connector",
"coding":"utf8",
"dbname":"sampledb",
"kwargs":{
"user":"user1",
"db":"sampledb",
"password":"user123",
"host":"localhost"
}
}
```
the "dbname" and "db" values should be the same: the database name in the mysql server
* Oracle
```
"db_ora":{
"driver":"cx_Oracle",
"coding":"utf8",
"dbname":sampledb",
"kwargs":{
"user":"user1",
"host":"localhost",
"dsn":"10.0.185.137:1521/SAMPLEDB"
}
}
```
* SQL Server
```
"db_mssql":{
"driver":"pymssql",
"coding":"utf8",
"dbname":"sampledb",
"kwargs":{
"user":"user1",
"database":"sampledb",
"password":"user123",
"server":"localhost",
"port":1433,
"charset":"utf8"
}
}
```
* PostgreSQL
```
"db_pg":{
"driver":"psycopg2",
"dbname":"testdb",
"coding":"utf8",
"kwargs":{
"database":"testdb",
"user":"postgres",
"password":"pass123",
"host":"127.0.0.1",
"port":"5432"
}
}
```
### https support
To enable https, config.website.ssl must be set in config.json (see the sample above).
### website configuration
#### paths
ahserver can serve content (static files and dynamic content rendered by its processors) that resides in different folders on the server file system.
For a given http url, ahserver searches those folders in the order of the entries in the "paths" list inside the "website" section of config.json. Each entry is a [folder, url-prefix] pair; with the sample above, urls under /antd are looked up in $[workdir]$/../usedpkgs/antd and all other urls in $[workdir]$/../wolon.
#### processors
All the processors ahserver uses must be listed here, as [suffix, processor-name] pairs.
#### host
by default, '0.0.0.0'
#### port
by default, 8080
#### coding
ahserver recommends using 'utf-8'
### langMapping
Browsers may send different 'Accept-Language' values even for the same language, so ahserver uses the "langMapping" definition to map multiple browser language tags to the same i18n file.
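The mapping itself is a plain dictionary lookup on the first entry of the Accept-Language header (ahserver does this in i18nDICT() in processorResource.py); a self-contained sketch of the idea, with an example mapping taken from the sample configuration:
```
def map_language(accept_language, lang_mapping):
    # first entry of the Accept-Language header, e.g. 'zh-Hans-CN,zh;q=0.9'
    lang = accept_language.split(',')[0]
    # unmapped tags fall through unchanged
    return lang_mapping.get(lang, lang)

print(map_language('zh-Hans-CN,zh;q=0.9', {"zh-Hans-CN": "zh-cn", "zh-CN": "zh-cn"}))  # -> zh-cn
```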
## internationalization
ahserver uses MiniI18N from the appPublic modules in the pyutils package to implement i18n support.
It looks up translated text in the ms* txt files inside a per-language folder under the i18n folder in workdir; workdir is the folder where the ahserver program resides, or the one given on the command line.
## performance
To be listed here.
## Behind nginx
When ahserver runs behind nginx, nginx should forward the following headers to ahserver:
* X-Forwarded-For: the client's real ip
* X-Forwarded-Scheme: the scheme used in the client browser
* X-Forwarded-Host: the host seen in the client browser
* X-Forwarded-Url: the url seen in the client browser
* X-Forwarded-Prepath: the subfolder name, if ahserver is proxied by nginx under a subfolder
## environment for processors
When coding processors, ahserver provides an environment for building applications: modules, functions, classes and variables.
### session environment
* async get_user()
a coroutine that returns the userid; if the user is not logged in, it returns None
* async remember_user(userid, username='', userorgid='')
a coroutine that sets the session user info: userid, username, orgid
* async forget_user()
a coroutine that forgets the session user information; afterwards get_user() returns None
* async redirect(url)
a coroutine that redirects the request to a new url
* entire_url(url)
a function that converts a url to the http(s)://servername:port/repath/.... format; an external url is returned unchanged
* aiohttp_client
the aiohttp client, used to make a new request to another server
* gethost()
a function that returns the client ip
* async path_call(path, **kw)
a coroutine that calls another resource on this server by its path
* params_kw
a dictionary holding the data transferred from the client. If files are uploaded, they are saved under the folder defined by config.filesroot, and params_kw only holds the subpath under that folder. A short .dspy sketch using these helpers follows.
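A sketch of a whoami.dspy using these helpers; the file name, the login path and the response fields are only illustrative:
```
# whoami.dspy -- uses the session helpers described above
userid = await get_user()
if userid is None:
    # not logged in; '/login.html' is only a placeholder path
    return await redirect('/login.html')
return {
    "userid": userid,
    "client_ip": gethost(),
    "params": params_kw
}
```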
### global environment
### modules:
* time
* datetime
* random
* json
### functions:
* configValue(k):
a function that returns the configuration value at key k; k starts with '.', for example configValue('.website') returns the website section of the configuration file, and configValue('.website.port') returns the port under website.
* isNone(v)
a function that checks whether v is None: it returns True if so, otherwise False
* int(v)
a function to convert v to integer
* str(v)
a function to convert v to string
* float(v)
a function to convert v to float
* type(v)
a function to get v's type
* str2date(dstr)
a function to convert string with "YYYY-MM-DD" format to datetime.datetime instance
* str2datetime(dstr)
a function to convert a date-time string to a datetime.datetime instance
* curDatetime()
a function to get the current date and time as a datetime.datetime instance
* uuid()
a function to get a uuid value
* DBPools()
a function that returns the sqlor database connection pool; for further information see [sqlor](https://git.kaiyuancloud.cn/yumoqing/sqlor)
All the databases it can connect to must be defined in the 'databases' section of the configuration file.
CRUD use cases:
To use CRUD, the table must have an "id" field as its primary key.
CRUD use case 1 (insert data into a table, in an insert.dspy file):
```
db = DBPools()
async with db.sqlorContext('dbname1') as sor:
    ns = {
        'id':uuid(),
        'field1':1
    }
    recs = await sor.C('tbl1', ns)
```
CRUD use case 2 (update data in a table, in an update.dspy file):
```
ns = params_kw.copy() # get data from client
db = DBPools()
async with db.sqlorContext('dbname1') as sor:
    await sor.U('tbl1', ns)
```
CRUD use case 3 (delete data from a table, in a delete.dspy file):
```
ns = {
    'id':params_kw.id
}
db = DBPools()
async with db.sqlorContext('dbname1') as sor:
    await sor.D('tbl1', ns)
```
CRUD use case 4 (query data from a table, in a search.dspy file):
```
ns = params_kw.copy()
db = DBPools()
async with db.sqlorContext('dbname1') as sor:
    recs = await sor.R('tbl1', ns)
# recs is a list whose elements are DictObject instances holding all the table's fields
return recs
```
CRUD use case 5 (paging query on a table, in a search_paging.dspy file):
```
ns = params_kw.copy()
if ns.get('page') is None:
    ns['page'] = 1
if ns.get('sort') is None:
    ns['sort'] = 'id desc'
db = DBPools()
async with db.sqlorContext('dbname1') as sor:
    recs = await sor.RP('tbl1', ns)
# recs is a DictObject instance with two keys: "total", the number of matching records,
# and "rows", the returned data list, e.g.
# {
#     "total":423123,
#     "rows":[ ..... ]  (at most "pagerows" records, taken from ns; default is 80)
# }
return recs
```
SQL EXECUTE use case 1:
```
sql = "..... where id=${id}$ and field1 = ${var1}$ ..."
db = DBPools()
async with db.sqlorContext('dbname') as sor:
    r = await sor.sqlExe(sql, {'id':'iejkuiew', 'var1':1111})
    # if sql is a select command, r is a list of DictObject instances with the returned data
    ....
```
SQL EXECUTE use case 2:
```
sql = "..... where id=${id}$ and field1 = ${var1}$ ..."
db = DBPools()
async with db.sqlorContext('dbname') as sor:
    r = await sor.sqlPaging(sql, {'id':'iejkuiew',
                                  'page':1,
                                  'pagerows':60,
                                  'sort':'field1',
                                  'var1':1111})
    # r is a DictObject instance with two keys: "total", the number of matching records,
    # and "rows", the returned data list, e.g.
    # {
    #     "total":423123,
    #     "rows":[ ..... ]  (at most "pagerows" records, taken from ns; default is 80)
    # }
    ....
```
### variables
The names below are also available in the processor environment; a short streaming example follows this list.
* resource
* terminalType
* ArgsConvert
* curDateString
* curTimeString
* monthfirstday
* strdate_add
* webpath
* stream_response
* rfexe
* basic_auth_headers
* format_exc
* realpath
* save_file
* async_sleep
* DictObject
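As an illustration of stream_response and async_sleep from the list above, a sketch of a stream.dspy that streams a few text chunks; the generator name and chunk contents are arbitrary:
```
# stream.dspy -- send the response to the client in chunks
async def gen():
    for i in range(3):
        yield f"chunk {i}\n"
        await async_sleep(0.1)

return await stream_response(request, gen, content_type='text/plain')
```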
### classes
* ArgsConvert

22
ah.py Executable file

@ -0,0 +1,22 @@
from ahserver.configuredServer import ConfiguredServer
from ahserver.auth_api import AuthAPI
"""
need to implement your AuthAPI
class MyAuthAPI:
def needAuth(self,path):
return False # do not need authentication
return True # need authentication
async def getPermissionNeed(self,path):
return 'admin'
async def checkUserPassword(self,user_id,password):
return True
async def getUserPermissions(self,user):
return ['admin','view']
"""
if __name__ == '__main__':
server = ConfiguredServer(AuthAPI)
server.run()

0
ahserver/__init__.py Normal file

179
ahserver/auth_api.py Normal file

@ -0,0 +1,179 @@
import time
import uuid
from traceback import format_exc
from aiohttp_auth import auth
from aiohttp_auth.auth.ticket_auth import TktAuthentication
from os import urandom
from aiohttp import web
import aiohttp_session
# import aioredis
import redis.asyncio as redis
import base64
import binascii
from aiohttp_session import get_session, session_middleware, Session
from aiohttp_session.cookie_storage import EncryptedCookieStorage
from aiohttp_session.redis_storage import RedisStorage
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from appPublic.rsawrap import RSA
from appPublic.log import info, debug, warning, error, critical, exception
def get_client_ip(obj, request):
ip = request.headers.get('X-Forwarded-For')
if not ip:
ip = request.remote
request['client_ip'] = ip
return ip
async def get_session_userinfo(request):
d = await auth.get_auth(request)
if d is None:
return DictObject()
ui = d.split(':')
return DictObject(**{
'userid':ui[0],
'username':ui[1],
'userorgid':ui[2]
})
async def get_session_user(request):
userinfo = await get_session_userinfo(request)
return userinfo.userid
async def user_login(request, userid, username='', userorgid=''):
ui = f'{userid}:{username}:{userorgid}'
await auth.remember(request, ui)
async def user_logout(request):
await auth.forget(request)
class MyRedisStorage(RedisStorage):
def key_gen(self, request):
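# prefer the client-supplied 'client_uuid' header as the session key, otherwise fall back to a random uuid; the code after the first return below is never reached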
key = request.headers.get('client_uuid')
if not key:
key = uuid.uuid4().hex
return key
if isinstance(key, str):
key = key.encode('utf-8')
key = binascii.hexlify(key)
key = key.decode('utf-8')
return key
async def save_session(self, request: web.Request,
response: web.StreamResponse,
session: Session) -> None:
key = session.identity
if key is None:
key = self.key_gen(request)
self.save_cookie(response, key, max_age=session.max_age)
else:
if session.empty:
self.save_cookie(response, "", max_age=session.max_age)
else:
key = str(key)
self.save_cookie(response, key, max_age=session.max_age)
data_str = self._encoder(self._get_session_data(session))
await self._redis.set(
self.cookie_name + "_" + key,
data_str,
ex=session.max_age,
)
class AuthAPI:
def __init__(self):
self.conf = getConfig()
async def checkUserPermission(self, request, user, path):
# print('************* checkUserPermission() use default one ****************')
return True
def getPrivateKey(self):
if not hasattr(self,'rsaEngine'):
self.rsaEngine = RSA()
fname = self.conf.website.rsakey.privatekey
self.privatekey = self.rsaEngine.read_privatekey(fname)
return self.privatekey
def rsaDecode(self,cdata):
self.getPrivateKey()
return self.rsaEngine.decode(self.privatekey,cdata)
async def setupAuth(self,app):
# setup session middleware in aiohttp fashion
b = str(self.conf.website.port).encode('utf-8')
cnt = 32 - len(b)
secret = b + b'iqwertyuiopasdfghjklzxcvbnm12345'[:cnt]
storage = EncryptedCookieStorage(secret)
if self.conf.website.session_redis:
url = self.conf.website.session_redis.url
# redis = await aioredis.from_url("redis://127.0.0.1:6379")
redisdb = await redis.Redis.from_url(url)
storage = MyRedisStorage(redisdb)
aiohttp_session.setup(app, storage)
# Create an auth ticket mechanism that expires after 1 minute (60
# seconds), and has a randomly generated secret. Also includes the
# optional inclusion of the users IP address in the hash
session_max_time = 120
session_reissue_time = 30
if self.conf.website.session_max_time:
session_max_time = self.conf.website.session_max_time
if self.conf.website.session_reissue_time:
session_reissue_time = self.conf.website.session_reissue_time
def _new_ticket(self, request, user_id):
client_uuid = request.headers.get('client_uuid')
ip = self._get_ip(request)
valid_until = int(time.time()) + self._max_age
# print(f'hack: my _new_ticket() called ... remote {ip=}, {client_uuid=}')
return self._ticket.new(user_id,
valid_until=valid_until,
client_ip=ip,
user_data=client_uuid)
TktAuthentication._get_ip = get_client_ip
TktAuthentication._new_ticket = _new_ticket
policy = auth.SessionTktAuthentication(secret,
session_max_time,
reissue_time=session_reissue_time,
include_ip=True)
# setup aiohttp_auth.auth middleware in aiohttp fashion
# print('policy = ', policy)
auth.setup(app, policy)
app.middlewares.append(self.checkAuth)
@web.middleware
async def checkAuth(self,request,handler):
info(f'checkAuth() called ... {request.path=}')
t1 = time.time()
path = request.path
userinfo = await get_session_userinfo(request)
user = userinfo.userid
is_ok = await self.checkUserPermission(request, user, path)
t2 = time.time()
ip = get_client_ip(None, request)
if is_ok:
try:
ret = await handler(request)
t3 = time.time()
info(f'timecost=client({ip}) {user} access {path} cost {t3-t1}, ({t2-t1})')
return ret
except Exception as e:
t3 = time.time()
tb = format_exc()
exception(f'Exception=client({ip}) {user} access {path} cost {t3-t1}, ({t2-t1}), except={e}\n{tb}')
raise e
if user is None:
info(f'timecost=client({ip}) {user} need login to access {path} ({t2-t1})')
raise web.HTTPUnauthorized
info(f'timecost=client({ip}) {user} access {path} forbidden ({t2-t1})')
raise web.HTTPForbidden()
async def needAuth(self,path):
return False

262
ahserver/baseProcessor.py Normal file

@ -0,0 +1,262 @@
import os
import re
import json
import codecs
import aiofiles
from aiohttp.web_request import Request
from aiohttp.web_response import Response, StreamResponse
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from appPublic.folderUtils import listFile
from appPublic.argsConvert import ArgsConvert
from appPublic.log import info, debug, warning, error, critical, exception
from .utils import unicode_escape
from .serverenv import ServerEnv
from .filetest import current_fileno
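# ObjectCache caches a loaded object per file path; an entry is invalidated when the file's mtime is newer than the cached one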
class ObjectCache:
def __init__(self):
self.cache = {}
def store(self,path,obj):
o = self.cache.get(path,None)
if o is not None:
try:
del o.cached_obj
except:
pass
o = DictObject()
o.cached_obj = obj
o.mtime = os.path.getmtime(path)
self.cache[path] = o
def get(self,path):
o = self.cache.get(path)
if o:
if os.path.getmtime(path) > o.mtime:
return None
return o.cached_obj
return None
class BaseProcessor:
@classmethod
def isMe(self,name):
return name=='base'
def __init__(self,path,resource):
self.env_set = False
self.path = path
self.resource = resource
self.retResponse = None
# self.last_modified = os.path.getmtime(path)
# self.content_length = os.path.getsize(path)
self.headers = {
'Content-Type': 'text/html; utf-8',
'Accept-Ranges': 'bytes'
}
self.content = ''
async def be_call(self, request, params={}):
return await self.path_call(request, params=params)
async def set_run_env(self, request, params={}):
if self.env_set:
return
self.real_path = self.resource.url2file(request.path)
g = ServerEnv()
self.run_ns = DictObject()
self.run_ns.update(g)
self.run_ns.update(self.resource.y_env)
self.run_ns['request'] = request
self.run_ns['app'] = request.app
kw = await self.run_ns['request2ns']()
kw.update(params)
self.run_ns['params_kw'] = kw
# self.run_ns.update(kw)
self.run_ns['ref_real_path'] = self.real_path
self.run_ns['processor'] = self
self.env_set = True
async def execute(self,request):
await self.set_run_env(request)
await self.datahandle(request)
return self.content
def set_response_headers(self, response):
response.headers['Access-Control-Expose-Headers'] = 'Set-Cookie'
# response.headers['Access-Control-Allow-Credentials'] = 'true'
# response.headers['Access-Control-Allow-Origin'] = '47.93.12.75'
async def handle(self,request):
await self.execute(request)
jsonflg = False
if self.retResponse is not None:
self.set_response_headers(self.retResponse)
return self.retResponse
elif isinstance(self.content, Response):
return self.content
elif isinstance(self.content, StreamResponse):
return self.content
elif isinstance(self.content, DictObject):
self.content = json.dumps(self.content, indent=4, ensure_ascii=False)
jsonflg = True
elif isinstance(self.content, dict):
self.content = json.dumps(self.content, indent=4, ensure_ascii=False)
jsonflg = True
elif isinstance(self.content, list):
self.content = json.dumps(self.content, indent=4, ensure_ascii=False)
jsonflg = True
elif isinstance(self.content, tuple):
self.content = json.dumps(self.content, indent=4, ensure_ascii=False)
jsonflg = True
elif isinstance(self.content, bytes):
self.headers['Access-Control-Expose-Headers'] = 'Set-Cookie'
self.headers['Content-Length'] = str(len(self.content))
resp = Response(body=self.content,headers=self.headers)
self.set_response_headers(resp)
return resp
else:
try:
json.loads(self.content)
jsonflg = True
except:
pass
if jsonflg:
self.headers['Content-Type'] = "application/json; utf-8"
self.headers['Access-Control-Expose-Headers'] = 'Set-Cookie'
resp = Response(text=self.content,headers=self.headers)
self.set_response_headers(resp)
return resp
async def datahandle(self,request):
debug('*******Error*************')
self.content=''
def setheaders(self):
pass
# self.headers['Content-Length'] = str(len(self.content))
class TemplateProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='tmpl'
async def path_call(self, request, params={}):
await self.set_run_env(request, params=params)
path = params.get('path', request.path)
ns = self.run_ns
te = self.run_ns['tmpl_engine']
return await te.render(path,**ns)
async def datahandle(self,request):
self.content = await self.path_call(request)
def setheaders(self):
super(TemplateProcessor,self).setheaders()
if self.path.endswith('.tmpl.css'):
self.headers['Content-Type'] = 'text/css; utf-8'
elif self.path.endswith('.tmpl.js'):
self.headers['Content-Type'] = 'application/javascript ; utf-8'
else:
self.headers['Content-Type'] = 'text/html; utf-8'
class BricksAppProcessor(TemplateProcessor):
@classmethod
def isMe(self,name):
return name=='app'
async def datahandle(self, request):
params = await self.resource.y_env['request2ns']()
await super().datahandle(request)
if params.get('_webbricks_',None):
return
txt = self.content
entire_url = self.run_ns.get('entire_url')
content0 = await self.resource.path_call(request,entire_url('/bricks/bricksapp.tmpl'))
ac = ArgsConvert("${", "}$")
self.content = ac.convert(content0, {'appdic':txt})
class BricksUIProcessor(TemplateProcessor):
@classmethod
def isMe(self,name):
# print(f'{name=} is a bui')
return name=='bui'
async def datahandle(self, request):
params = await self.resource.y_env['request2ns']()
await super().datahandle(request)
if params.get('_webbricks_',None):
return
txt = self.content
entire_url = self.run_ns.get('entire_url')
content0 = await self.resource.path_call(request,entire_url('/bricks/header.tmpl'))
content2 = await self.resource.path_call(request,entire_url('/bricks/footer.tmpl'))
self.content = f'{content0}{txt}{content2}'
debug(f'{len(txt)=}, {len(content0)=}, {len(content2)=}, {self.content=}')
class PythonScriptProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='dspy'
async def loadScript(self, path):
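# read the .dspy source and wrap it into 'async def myfunc(request, **ns):', prefixing every line with a tab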
data = ''
async with aiofiles.open(path,'r', encoding='utf-8') as f:
data = await f.read()
b= ''.join(data.split('\r'))
lines = b.split('\n')
lines = ['\t' + l for l in lines ]
txt = "async def myfunc(request,**ns):\n" + '\n'.join(lines)
return txt
async def path_call(self, request,params={}):
await self.set_run_env(request, params=params)
lenv = self.run_ns
del lenv['request']
fpath = params.get('fpath', self.real_path)
txt = await self.loadScript(fpath)
# print(self.real_path, "#########", txt)
exec(txt,lenv,lenv)
func = lenv['myfunc']
return await func(request,**lenv)
async def datahandle(self,request):
self.content = await self.path_call(request)
class MarkdownProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='md'
async def datahandle(self,request:Request):
data = ''
async with aiofiles.open(self.real_path,'r',encoding='utf-8') as f:
data = await f.read()
self.content = self.urlreplace(data, request)
def urlreplace(self,mdtxt,request):
p = '\[(.*)\]\((.*)\)'
return re.sub(p,
lambda x:'['+x.group(1)+'](' + self.resource.entireUrl(request, x.group(2)) + ')',
mdtxt)
def getProcessor(name):
# print(f'getProcessor({name})')
return _getProcessor(BaseProcessor, name)
def _getProcessor(kclass,name):
for k in kclass.__subclasses__():
if not hasattr(k,'isMe'):
continue
if k.isMe(name):
return k
a = _getProcessor(k,name)
if a is not None:
return a
return None

94
ahserver/configuredServer.py Normal file

@ -0,0 +1,94 @@
import os,sys
from sys import platform
import time
import ssl
from socket import *
from aiohttp import web
from appPublic.folderUtils import ProgramPath
from appPublic.dictObject import DictObject
from appPublic.jsonConfig import getConfig
from appPublic.log import info, debug, warning, error, critical, exception
from appPublic.registerfunction import RegisterCoroutine
from sqlor.dbpools import DBPools
from .processorResource import ProcessorResource
from .auth_api import AuthAPI
from .myTE import setupTemplateEngine
from .globalEnv import initEnv
from .serverenv import ServerEnv
from .filestorage import TmpFileRecord
from .loadplugins import load_plugins
class AHApp(web.Application):
def __init__(self, *args, **kw):
kw['client_max_size'] = 1024000000
super().__init__(*args, **kw)
self.user_data = DictObject()
def set_data(self, k, v):
self.user_data[k] = v
def get_data(self, k):
return self.user_data.get(k)
class ConfiguredServer:
def __init__(self, auth_klass=AuthAPI, workdir=None):
self.auth_klass = auth_klass
self.workdir = workdir
if self.workdir is not None:
pp = ProgramPath()
config = getConfig(self.workdir,
{'workdir':self.workdir,'ProgramPath':pp})
else:
config = getConfig()
if config.databases:
DBPools(config.databases)
self.config = config
initEnv()
setupTemplateEngine()
client_max_size = 1024 * 10240
if config.website.client_max_size:
client_max_size = config.website.client_max_size
self.app = AHApp(client_max_size=client_max_size)
load_plugins(self.workdir)
g = ServerEnv()
g.workdir = workdir
async def build_app(self):
rf = RegisterCoroutine()
await rf.exe('ahapp_built', self.app)
auth = self.auth_klass()
await auth.setupAuth(self.app)
return self.app
def run(self, port=None):
config = getConfig()
self.configPath(config)
a = TmpFileRecord()
ssl_context = None
if port is None:
port = config.website.port or 8080
if config.website.ssl:
ssl_context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ssl_context.load_cert_chain(config.website.ssl.crtfile,
config.website.ssl.keyfile)
reuse_port = None
if platform != 'win32':
reuse_port = True
print('reuse_port=', reuse_port)
web.run_app(self.build_app(),host=config.website.host or '0.0.0.0',
port=port,
reuse_port=reuse_port,
ssl_context=ssl_context)
def configPath(self,config):
for p,prefix in config.website.paths:
res = ProcessorResource(prefix,p,show_index=True,
follow_symlinks=True,
indexes=config.website.indexes,
processors=config.website.processors)
self.app.router.register_resource(res)

57
ahserver/dbadmin.py Normal file

@ -0,0 +1,57 @@
import os
import re
import traceback
from aiohttp.web_response import Response
from aiohttp.web_exceptions import (
HTTPException,
HTTPExpectationFailed,
HTTPForbidden,
HTTPMethodNotAllowed,
HTTPNotFound,
)
from aiohttp import web
from aiohttp.web_request import Request
from aiohttp.web_routedef import AbstractRouteDef
from aiohttp.web import json_response
from sqlor.crud import CRUD
from appPublic.dictObject import multiDict2Dict
from appPublic.jsonConfig import getConfig
from appPublic.log import info, debug, warning, error, critical, exception
from .error import Error,Success
actions = [
"browse",
"add",
"update",
"filter"
]
class DBAdmin:
def __init__(self, request,dbname,tablename, action):
self.dbname = dbname
self.tablename = tablename
self.request = request
self.action = action
if action not in actions:
debug('action not defined:%s' % action)
raise HTTPNotFound
try:
self.crud = CRUD(dbname,tablename)
except Exception as e:
exception('e= %s' % e)
traceback.print_exc()
raise HTTPNotFound
async def render(self) -> Response:
try:
d = await self.crud.I()
return json_response(Success(d))
except Exception as e:
exception('except=%s' % e)
traceback.print_exc()
return json_response(Error(errno='metaerror',msg='get metadata error'))

67
ahserver/dsProcessor.py Normal file

@ -0,0 +1,67 @@
import codecs
import json
import aiofiles
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from .baseProcessor import BaseProcessor
from .serverenv import ServerEnv
class DataSourceProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='ds'
def __init__(self,filename,k):
super(DataSourceProcessor,self).__init__(filename,k)
self.actions = {
'getdata':self.getData,
'pagingdata':self.getPagingData,
'arguments':self.getArgumentsDesc,
'resultFields':self.getDataDesc,
'gridlist':self.getGridlist,
}
self.g = ServerEnv()
async def getData(self,dict_data,ns,request):pass
async def getPagingData(self,dict_data,ns,request):pass
async def getArgumentsDesc(self,dict_data,ns,request):pass
async def getDataDesc(self,dict_data,ns,request):pass
async def getGridlist(self,dict_data,ns,request):
ret = await self.getDataDesc(dict_data,ns,request)
ffs = [ f for f in ret if f.get('frozen',False) ]
fs = [ f for f in ret if not f['frozen'] ]
[ f.update({'hide':True}) for f in ffs if f.get('listhide',False) ]
[ f.update({'hide':True}) for f in fs if f.get('listhide') ]
d = {
"iconCls":"icon-search",
"url":self.resource.absUrl(request,request.path + '?action=pagingdata'),
"view":"bufferview",
"options":{
"pageSize":50,
"pagination":False
}
}
d.update({'fields':fs})
if len(ffs)>0:
d.update({'ffields':ffs})
ret = {
"__ctmpl__":"datagrid",
"data":d
}
return ret
async def path_call(self, request, path, params={}):
dict_data = {}
config = getConfig()
async with aiofiles.open(path,'r',encoding=config.website.coding) as f:
b = await f.read()
dict_data = json.loads(b)
ns = self.run_ns
act = ns.get('action','getdata')
action = self.actions.get(act)
return await action(dict_data,ns,request)
async def datahandle(self,request):
self.content = await self.path_call(request, self.path)

27
ahserver/error.py Normal file

@ -0,0 +1,27 @@
def Error(errno='undefined error',msg='Error'):
return {
"status":"Error",
"data":{
"message":msg,
"errno":errno
}
}
def Success(data):
return {
"status":"OK",
"data":data
}
def NeedLogin(path):
return {
"status":"need_login",
"data":path
}
def NoPermission(path):
return {
"status":"no_permission",
"data":path
}

53
ahserver/filedownload.py Normal file

@ -0,0 +1,53 @@
import os
import asyncio
import aiofiles
import mimetypes
from aiohttp.web_exceptions import HTTPNotFound
from aiohttp.web import StreamResponse
from aiohttp import web
from appPublic.rc4 import RC4
from appPublic.registerfunction import RegisterFunction
from appPublic.log import debug
from .filestorage import FileStorage
crypto_aim = 'God bless USA and others'
def path_encode(path):
rc4 = RC4()
return rc4.encode(path,crypto_aim)
def path_decode(dpath):
rc4 = RC4()
return rc4.decode(dpath,crypto_aim)
async def file_upload(request):
pass
async def file_handle(request, filepath, download=False):
filename = os.path.basename(filepath)
debug(f'{filepath=}, {filename=}, {download=}')
headers = {}
if download:
headers = {
'Content-Disposition': f'attachment; filename="{filename}"'
}
r = web.FileResponse(filepath, chunk_size=8096, headers=headers)
r.enable_compression()
return r
async def file_download(request, filepath):
return await file_handle(request, filepath, download=True)
async def path_download(request, params_kw, *params, **kw):
path = params_kw.get('path')
download = False
if params_kw.get('download'):
download = True
fs = FileStorage()
fp = fs.realPath(path)
debug(f'path_download():download filename={fp}')
return await file_handle(request, fp, download)
rf = RegisterFunction()
rf.register('idfile', path_download)
rf.register('download', path_download)

151
ahserver/filestorage.py Normal file

@ -0,0 +1,151 @@
# fileUpload.py
import asyncio
import os
import time
import tempfile
import aiofiles
import json
import time
from appPublic.folderUtils import _mkdir
from appPublic.jsonConfig import getConfig
from appPublic.Singleton import SingletonDecorator
from appPublic.log import info, debug, warning, exception, critical
@SingletonDecorator
class TmpFileRecord:
def __init__(self, timeout=3600):
self.filetime = {}
self.changed_flg = False
self.timeout = timeout
self.time_period = 10
self.filename = self.savefilename()
self.loop = asyncio.get_event_loop()
self.loop.call_later(0.01, self.load)
def newtmpfile(self, path:str):
self.filetime[path] = time.time()
self.change_flg = True
def savefilename(self):
config = getConfig()
root = config.filesroot or tempfile.gettempdir()
pid = os.getpid()
return root + f'/tmpfile_rec_{pid}.json'
async def save(self):
if not self.change_flg:
return
async with aiofiles.open(self.filename, 'bw') as f:
s = json.dumps(self.filetime, indent=4, ensure_ascii=False)
b = s.encode('utf-8')
await f.write(b)
await f.flush()
self.change_flg = False
async def load(self):
fn = self.filename
if not os.path.isfile(fn):
return
async with aiofiles.open(fn, 'br') as f:
b = await f.read()
s = b.decode('utf-8')
self.filetime = json.loads(s)
self.remove()
def file_useful(self, fpath):
try:
del self.filetime[fpath]
except Exception as e:
exception(f'Exception:{str(e)}')
pass
async def remove(self):
tim = time.time()
ft = {k:v for k,v in self.filetime.items()}
for k,v in ft.items():
if tim - v > self.timeout:
self.rmfile(k)
del self.filetime[k]
await self.save()
self.loop.call_later(self.time_period, self.remove)
def rmfile(self, name:str):
config = getConfig()
os.remove(config.fileroot + name)
class FileStorage:
def __init__(self):
config = getConfig()
self.root = os.path.abspath(config.filesroot or tempfile.gettempdir())
self.tfr = TmpFileRecord()
def realPath(self,path):
if path[0] == '/':
path = path[1:]
p = os.path.abspath(os.path.join(self.root,path))
return p
def webpath(self, path):
if path.startswith(self.root):
return path[len(self.root):]
def _name2path(self,name, userid=None):
name = os.path.basename(name)
paths=[191,193,197,97]
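# the primes above and the microsecond timestamp below shard files into nested sub-folders, so no single directory grows too large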
v = int(time.time()*1000000)
# b = name.encode('utf8') if not isinstance(name,bytes) else name
# v = int.from_bytes(b,byteorder='big',signed=False)
root = self.root
if userid:
root += f'/{userid}'
path = os.path.abspath(os.path.join(root,
str(v % paths[0]),
str(v % paths[1]),
str(v % paths[2]),
str(v % paths[3]),
name))
return path
def remove(self, path):
try:
if path[0] == '/':
path = path[1:]
p = os.path.join(self.root, path)
os.remove(p)
except Exception as e:
exception(f'{path=}, {p=} remove error')
async def save(self,name,read_data, userid=None):
p = self._name2path(name, userid=userid)
fpath = p[len(self.root):]
info(f'{p=}, {fpath=},{self.root} ')
_mkdir(os.path.dirname(p))
if isinstance(read_data, str) or isinstance(read_data, bytes):
b = read_data
if isinstance(read_data, str):
b = read_data.encode('utf-8')
async with aiofiles.open(p, 'wb') as f:
await f.write(b)
await f.flush()
self.tfr.newtmpfile(fpath)
return fpath
async with aiofiles.open(p,'wb') as f:
siz = 0
while 1:
d = await read_data()
if not d:
break
siz += len(d);
await f.write(d)
await f.flush()
self.tfr.newtmpfile(fpath)
return fpath
def file_realpath(path):
fs = FileStorage()
return fs.realPath(path)

14
ahserver/filetest.py Normal file

@ -0,0 +1,14 @@
import os
def current_fileno():
fn = './t.txt'
f = open(fn, 'w')
ret = f.fileno()
f.close()
os.remove(fn)
return ret
if __name__ == '__main__':
for i in range(1000):
print(current_fileno())

54
ahserver/functionProcessor.py Normal file

@ -0,0 +1,54 @@
import inspect
from appPublic.dictObject import DictObject
from appPublic.registerfunction import RegisterFunction
from appPublic.log import info, debug, warning, error, exception, critical
from aiohttp import web
from aiohttp.web_response import Response, StreamResponse
from .baseProcessor import BaseProcessor
class FunctionProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return False
def __init__(self,path,resource, opts):
self.config_opts = opts
BaseProcessor.__init__(self,path,resource)
async def path_call(self, request, params={}):
await self.set_run_env(request)
params_kw = self.run_ns.get('params_kw')
path = params.get('path', request.path)
path1 = path[len(self.config_opts['leading']):]
args = []
if len(path1) > 0:
if path1[0] == '/':
path1 = path1[1:]
args += path1.split('/')
rfname = self.config_opts['registerfunction']
ns = DictObject(**self.run_ns)
rf = RegisterFunction()
f = rf.get(rfname)
if f is None:
error(f'{rfname=} is not registered, {rf.registKW=}')
return None
# self.run_ns['request'] = request
# globals().update(self.run_ns)
env = {k:v for k,v in self.run_ns.items() if k not in ['params_kw', 'request'] }
if inspect.iscoroutinefunction(f):
return await f(request, params_kw, *args, **env)
return f(request, params_kw, *args, **env)
async def datahandle(self,request):
x = await self.path_call(request)
if isinstance(x,web.FileResponse):
self.retResponse = x
elif isinstance(x,Response):
self.retResponse = x
else:
self.content = x

284
ahserver/globalEnv.py Normal file

@ -0,0 +1,284 @@
# -*- coding:utf8 -*-
import os
import builtins
import sys
import codecs
from urllib.parse import quote
import json
import asyncio
from aiohttp import BasicAuth
from traceback import format_exc
from functools import partial
import random
import time
import datetime
from openpyxl import Workbook
from tempfile import mktemp
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from appPublic.Singleton import GlobalEnv
from appPublic.argsConvert import ArgsConvert
from appPublic.timeUtils import str2Date,str2Datetime,curDatetime, \
getCurrentTimeStamp,curDateString, curTimeString, \
monthfirstday, strdate_add, timestampstr
from appPublic.dataencoder import quotedstr
from appPublic.folderUtils import folderInfo
from appPublic.uniqueID import setNode,getID
from appPublic.unicoding import unicoding,uDict,uObject
from appPublic.Singleton import SingletonDecorator
from appPublic.rc4 import password, unpassword
from appPublic.registerfunction import RegisterFunction
from appPublic.httpclient import HttpClient
from appPublic.log import debug, exception
from appPublic.streamhttpclient import StreamHttpClient
from sqlor.dbpools import DBPools,runSQL,runSQLPaging
from sqlor.filter import DBFilter, default_filterjson
from aiohttp.web import StreamResponse
from .xlsxData import XLSXData
from .uriop import URIOp
from .error import Success, Error, NeedLogin, NoPermission
from .filetest import current_fileno
from .filedownload import path_download, file_download
from .filestorage import FileStorage
from .serverenv import ServerEnv
def basic_auth_headers(user, passwd):
ba = BasicAuth(login=user, password=passwd)
return {
"Authorization":ba.encode()
}
async def stream_response(request, async_data_generator, content_type='text/html'):
res = StreamResponse()
if content_type:
res.content_type = content_type
await res.prepare(request)
async for d in async_data_generator():
try:
if isinstance(d, bytes):
await res.write(d)
elif isinstance(d, str):
await res.write(d.encode('utf-8'))
else:
d = json.dumps(d, ensure_ascii=False)
await res.write(d.encode('utf-8'))
except Exception as e:
e = Exception(f'write error{e=}, {d=}')
exception(f'{e}\n{format_exc()}')
raise e
await res.drain()
await res.write_eof()
return res
def data2xlsx(rows,headers=None):
wb = Workbook()
ws = wb.active
i = 1
if headers is not None:
for j in range(len(headers)):
v = headers[j].title if headers[j].get('title',False) else headers[j].name
ws.cell(column=j+1,row=i,value=v)
i += 1
for r in rows:
for j in range(len(r)):
v = r[headers[j].name]
ws.cell(column=j+1,row=i,value=v)
i += 1
name = mktemp(suffix='.xlsx')
wb.save(filename = name)
wb.close()
return name
async def save_file(str_or_bytes, filename):
fs = FileStorage()
r = await fs.save(filename, str_or_bytes)
return r
def webpath(path):
fs = FileStorage()
return fs.webpath(path)
def realpath(path):
fs = FileStorage()
return fs.realPath(path)
class FileOutZone(Exception):
def __init__(self,fp,*args,**kwargs):
super(FileOutZone,self).__init__(*args,**kwargs)
self.openfilename = fp
def __str__(self):
return self.openfilename + ': not allowed to open'
def get_config_value(kstr):
keys = kstr.split('.')
config = getConfig()
if config is None:
raise Exception('getConfig() error')
for k in keys:
config = config.get(k)
if not config:
return None
return config
def get_definition(k):
k = f'definitions.{k}'
return get_config_value(k)
def openfile(url,m):
fp = abspath(url)
if fp is None:
print(f'openfile({url},{m}),url is not match a file')
raise Exception('url can not mathc a file')
config = getConfig()
paths = [ os.path.abspath(p) for p in config.website.paths ]
fs = config.get('allow_folders',[])
fs = [ os.path.abspath(i) for i in fs + paths ]
r = False
for f in fs:
if fp.startswith(f):
r = True
break
if not r:
raise FileOutZone(fp)
return open(fp,m)
def isNone(a):
return a is None
def abspath(path):
config = getConfig()
paths = [ os.path.abspath(p) for p in config.website.paths ]
for root in paths:
p = root + path
if os.path.exists(root+path):
return p
return None
def appname():
config = getConfig()
try:
return config.license.app
except:
return "test app"
def configValue(ks):
config = getConfig()
try:
a = eval('config' + ks)
return a
except:
return None
def visualcoding():
return configValue('.website.visualcoding');
def file_download(request,path,name,coding='utf8'):
f = openfile(path,'rb')
b = f.read()
f.close()
fname = quote(name).encode(coding)
hah = b"attachment; filename=" + fname
# print('file head=',hah.decode(coding))
request.setHeader(b'Content-Disposition',hah)
request.setHeader(b'Expires',0)
request.setHeader(b'Cache-Control',b'must-revalidate, post-check=0, pre-check=0')
request.setHeader(b'Content-Transfer-Encoding',b'binary')
request.setHeader(b'Pragma',b'public')
request.setHeader(b'Content-Length',len(b))
request.write(b)
request.finish()
def paramify(data, ns):
ac = ArgsConvert('${', '}$')
return ac.convert(data, ns)
def get_password_key():
config = getConfig()
return config.password_key or 'QRIVSRHrthhwyjy176556332'
def password_encode(s):
k = get_password_key()
return password(s, key=k)
def password_decode(c):
k = get_password_key()
return unpassword(c, key=k)
def initEnv():
pool = DBPools()
g = ServerEnv()
set_builtins()
g.paramify = paramify
g.configValue = configValue
g.visualcoding = visualcoding
g.uriop = URIOp
g.isNone = isNone
g.json = json
g.ArgsConvert = ArgsConvert
g.time = time
g.curDateString = curDateString
g.curTimeString = curTimeString
g.datetime = datetime
g.random = random
g.str2date = str2Date
g.str2datetime = str2Datetime
g.timestampstr = timestampstr
g.monthfirstday = monthfirstday
g.curDatetime = curDatetime
g.strdate_add = strdate_add
g.uObject = uObject
g.uuid = getID
g.runSQL = runSQL
g.runSQLPaging = runSQLPaging
g.runSQLIterator = pool.runSQL
g.runSQLResultFields = pool.runSQLResultFields
g.getTables = pool.getTables
g.getTableFields = pool.getTableFields
g.getTablePrimaryKey = pool.getTablePrimaryKey
g.getTableForignKeys = pool.getTableForignKeys
g.folderInfo = folderInfo
g.abspath = abspath
g.data2xlsx = data2xlsx
g.xlsxdata = XLSXData
g.openfile = openfile
g.DBPools = DBPools
g.DBFilter = DBFilter
g.default_filterjson = default_filterjson
g.Error = Error
g.Success = Success
g.NeedLogin = NeedLogin
g.NoPermission = NoPermission
g.password_encode = password_encode
g.password_decode = password_decode
g.current_fileno = current_fileno
g.get_config_value = get_config_value
g.get_definition = get_definition
g.DictObject = DictObject
g.async_sleep = asyncio.sleep
g.quotedstr = quotedstr
g.save_file = save_file
g.realpath = realpath
g.format_exc = format_exc
g.basic_auth_headers = basic_auth_headers
g.HttpClient = HttpClient
g.rfexe = RegisterFunction().exe
g.stream_response = stream_response
g.webpath = webpath
g.file_download = file_download
g.path_download = path_download
g.partial = partial
g.StreamHttpClient = StreamHttpClient
def set_builtins():
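# expose every python builtin (abs, len, print, ...) as a name in ServerEnv so processor scripts can use them directly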
all_builtins = [ i for i in dir(builtins) if not i.startswith('_')]
g = ServerEnv()
gg = globals()
for l in all_builtins:
exec(f'g["{l}"] = {l}',{'g':g})

81
ahserver/llmProcessor.py Normal file

@ -0,0 +1,81 @@
import aiohttp
from aiohttp import web, BasicAuth
from aiohttp import client
from appPublic.dictObject import DictObject
from .llm_client import StreamLlmProxy, AsyncLlmProxy, SyncLlmProxy
from .baseProcessor import *
class LlmProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='llm'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
return DictObject(**data)
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
llm = StreamLlmProxy(self, d)
self.retResponse = await llm(request, self.run_ns.params_kw)
def setheaders(self):
pass
class LlmSProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='llms'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
return DictObject(**data)
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
llm = SyncLlmProxy(self, d)
self.content = await llm(request, self.run_ns.params_kw)
def setheaders(self):
pass
class LlmAProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='llma'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = self.path
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
return DictObject(**data)
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
llm = AsyncLlmProxy(self, d)
self.retResponse = await llm(request, self.run_ns.params_kw)
def setheaders(self):
pass

277
ahserver/llm_client.py Normal file

@ -0,0 +1,277 @@
import re
import base64
import json
from traceback import format_exc
from aiohttp import web
from appPublic.dictObject import DictObject
from appPublic.log import debug, info, exception, error
from appPublic.httpclient import HttpClient, RESPONSE_TEXT, RESPONSE_JSON, RESPONSE_BIN,RESPONSE_FILE, RESPONSE_STREAM
from appPublic.registerfunction import RegisterFunction
from appPublic.argsConvert import ArgsConvert
def encode_imagefile(fn):
with open(fn, 'rb') as f:
return base64.b64encode(f.read()).decode('utf-8')
class StreamLlmProxy:
def __init__(self, processor, desc):
assert desc.name
self.name = desc.name
self.processor = processor
self.auth_api = desc.auth
self.desc = desc
self.api_name = desc.name
self.data = DictObject()
self.ac = ArgsConvert('${', '}')
def line_chunk_match(self, l):
if self.api.chunk_match:
match = re.search(self.api.chunk_match, l)
if match:
return match.group(1)
return l
async def write_chunk(self, ll):
def eq(a, b):
return a == b
def ne(a, b):
return a != b
opfuncs = {
'==':eq,
'!=':ne
}
if '[DONE]' in ll:
return
try:
# print('write_chunk(),l=', ll)
l = self.line_chunk_match(ll)
d = DictObject(** json.loads(l))
j = {}
for r in self.api.resp or []:
j[r.name] = d.get_data_by_keys(r.value);
if self.api.chunk_filter:
v = d.get_data_by_keys(self.api.chunk_filter.name)
v1 = self.api.chunk_filter.value
op = self.api.chunk_filter.op
f = opfuncs.get(op)
if f and f(v,v1):
j[self.api.chunk_filter.field] = ''
jstr = json.dumps(j, indent=4, ensure_ascii=False) + '\n'
bin = jstr.encode('utf-8')
await self.resp.write(bin)
await self.resp.drain()
except Exception as e:
tb = format_exc()
exception(f'Error:Write_chunk(),{l=} error:{e=}{tb}')
async def stream_handle(self, chunk):
print('chunk=', chunk)
chunk = chunk.decode('utf-8')
chunk = self.remain_str + chunk
lines = chunk.split('\n')
self.remain_str = lines[-1]
ls = lines[:-1]
for l in ls:
if l == '':
continue
await self.write_chunk(l)
async def get_apikey(self, apiname):
f = self.processor.run_ns.get_llm_user_apikey
if f:
# return a DictObject instance
return await f(apiname, self.user)
raise Exception('get_llm_user_apikey() function not found in ServerEnv')
async def get_apidata(self, parts, params={}):
ret = {}
for d in parts or []:
v = d['value']
if params != {}:
v = self.datalize(v, params)
if d.get('convertor'):
rf = RegisterFunction()
v = await rf.exe(d['convertor'], v)
ret[d['name']] = v
return ret
async def do_auth(self, request):
d = self.desc.auth
self.data = self.get_data(self.name)
if self.data.authed:
return
self.data = await self.get_apikey(self.name)
if self.data is None:
raise Exception(f'user({self.user}) do not has a apikey for {self.name}')
params = self.data
method = d.get('method', 'POST')
_headers = self.get_apidata(d.get('headers', []), params)
_data = self.get_apidata(d.get('data', []), params)
_params = self.get_apidata(d.get('params',[]), params)
url = d.get('url')
hc = HttpClient()
resp_data = await hc.request(url, method, response_type=RESPONSE_JSON,
params=_params,
data=None if _data == {} else json.dumps(_data, indent=4, ensure_ascii=False),
headers=_headers)
resp_data = DictObject(**resp_data)
for sd in d.set_data:
self.data[sd.name] = resp_data.get_data_by_keys(sd.field)
self.data.authed = True
self.set_data(self.name, self.data)
def data_key(self, apiname):
if self.user is None:
self.user = 'anonymous'
return apiname + '_a_' + self.user
def set_data(self, apiname, data):
request = self.processor.run_ns.request
app = request.app
app.set_data(self.data_key(apiname), data)
def get_data(self, apiname):
request = self.processor.run_ns.request
app = request.app
return app.get_data(self.data_key(apiname))
async def __call__(self, request, params):
self.user = await self.processor.run_ns.get_user()
mapi = params.mapi
stream = params.stream
self.resp = web.StreamResponse()
await self.resp.prepare(request)
if stream is None:
stream = True
self.remain_str = ''
if not self.desc[mapi]:
raise Exception(f'{mapi} not defined')
d = self.desc[mapi]
self.api = d
self.chunk_match = d.chunk_match
if self.api.need_auth and self.auth_api:
await self.do_auth(request)
else:
self.data = await self.get_apikey(self.name)
assert d.get('url')
url = d.get('url')
method = d.get('method', 'POST')
params1 = self.data
params1.update(params)
params = params1
method = d.get('method', 'POST')
_headers = self.get_apidata(d.get('headers', []), params)
_data = self.get_apidata(d.get('data', []), params)
_params = self.get_apidata(d.get('params',[]), params)
response_type = RESPONSE_STREAM
hc = HttpClient()
debug(f'{url=},{method=},{_params=},{_data=},{_headers=}')
resp_data = await hc.request(url, method, response_type=response_type,
params=_params,
data=None if _data == {} else json.dumps(_data, indent=4, ensure_ascii=False),
stream_func=self.stream_handle,
headers=_headers)
if self.remain_str != '':
await self.write_chunk(self.remain_str)
return self.resp
def datalize(self, dic, data={}):
mydata = self.data.copy()
mydata.update(data)
s1 = self.ac.convert(dic, mydata)
return s1
class SyncLlmProxy(StreamLlmProxy):
async def __call__(self, request, params):
self.user = await self.processor.run_ns.get_user()
mapi = params.mapi
if not self.desc[mapi]:
return {
"status":"Error",
"message":f'{mapi} not defined'
}
d = self.desc[mapi]
self.api = d
if self.api.need_auth and self.auth_api:
await self.do_auth(request)
else:
self.data = await self.get_apikey(self.name)
assert d.get('url')
method = d.get('method', 'POST')
url = d.get('url')
params1 = self.data
params1.update(params)
params = params1
method = d.get('method', 'POST')
_headers = self.get_apidata(d.get('headers', []), params)
_data = self.get_apidata(d.get('data', []), params)
_params = self.get_apidata(d.get('params',[]), params)
response_type = RESPONSE_JSON
hc = HttpClient()
debug(f'{url=},{method=},{_params=},{_data=},{_headers=}')
resp_data = await hc.request(url, method, response_type=response_type,
params=_params,
data=None if _data == {} else json.dumps(_data, indent=4, ensure_ascii=False),
headers=_headers)
debug(f'{resp_data=}')
if resp_data is None:
return {
"status":"Error",
"message":f'{mapi} not defined'
}
resp_data = DictObject(resp_data)
return self.convert_resp(resp_data)
def convert_resp(self, resp):
if self.api.resp is None:
return resp
j = {}
for r in self.api.resp or []:
j[r.name] = resp.get_data_by_keys(r.value);
return j
class AsyncLlmProxy(StreamLlmProxy):
async def __call__(self, request, params):
self.user = await self.processor.run_ns.get_user()
mapi = params.mapi
stream = params.stream
self.resp = web.StreamResponse()
await self.resp.prepare(request)
if stream is None:
stream = True
self.remain_str = ''
if not self.desc[mapi]:
raise Exception(f'{mapi} not defined')
d = self.desc[mapi]
self.api = d
self.chunk_match = d.chunk_match
if self.api.need_auth and self.auth_api:
await self.do_auth(request)
else:
self.data = await self.get_apikey(self.name)
assert d.get('url')
url = d.get('url')
method = d.get('method', 'POST')
params1 = self.data
params1.update(params)
params = params1
method = d.get('method', 'POST')
_headers = self.get_apidata(d.get('headers', []), params)
_data = self.get_apidata(d.get('data', []), params)
_params = self.get_apidata(d.get('params',[]), params)
response_type = RESPONSE_JSON
hc = HttpClient()
debug(f'{url=},{method=},{_params=},{_data=},{_headers=}')
resp_data = await hc.request(url, method, response_type=response_type,
params=_params,
data=None if _data == {} else json.dumps(_data, indent=4, ensure_ascii=False),
headers=_headers)
if self.remain_str != '':
await self.write_chunk(self.remain_str)
return self.resp

29
ahserver/loadplugins.py Normal file

@ -0,0 +1,29 @@
import os
import sys
from appPublic.folderUtils import listFile
from appPublic.ExecFile import ExecFile
from ahserver.serverenv import ServerEnv
import appPublic
import sqlor
import ahserver
def load_plugins(p_dir):
ef = ExecFile()
pdir = os.path.join(p_dir, 'plugins')
if not os.path.isdir(pdir):
# print('load_plugins:%s not exists' % pdir)
return
sys.path.append(pdir)
ef.set('sys',sys)
ef.set('ServerEnv', ServerEnv)
for m in listFile(pdir, suffixs='.py'):
if m == '__init__.py':
continue
if not m.endswith('.py'):
continue
# print(f'{m=}')
module = os.path.basename(m[:-3])
# print('module=', module)
__import__(module, locals(), globals())

59
ahserver/myTE.py Normal file

@ -0,0 +1,59 @@
import os
import codecs
from appPublic.Singleton import SingletonDecorator
from appPublic.jsonConfig import getConfig
from jinja2 import Template, Environment, BaseLoader, TemplateNotFound
from .serverenv import ServerEnv
from .url2file import Url2File, TmplUrl2File
class TmplLoader(BaseLoader, TmplUrl2File):
def __init__(self, paths, indexes, subffixes=['.tmpl'], inherit=False):
BaseLoader.__init__(self)
TmplUrl2File.__init__(self,paths,indexes=indexes,subffixes=subffixes, inherit=inherit)
def get_source(self,env: Environment,template: str):
config = getConfig()
coding = config.website.coding
fp = self.url2file(template)
# print(f'{template=} can not transfer to filename')
if not os.path.isfile(fp):
raise TemplateNotFound(template)
mtime = os.path.getmtime(fp)
with codecs.open(fp,'r',coding) as f:
source = f.read()
return source,fp,lambda:mtime == os.path.getmtime(fp)
def join_path(self,name, parent):
return self.relatedurl(parent,name)
def list_templates(self):
return []
class TemplateEngine(Environment):
def __init__(self,loader=None):
Environment.__init__(self,loader=loader, enable_async=True)
self.urlpaths = {}
self.loader = loader
def join_path(self,template: str, parent: str):
return self.loader.join_path(template, parent)
async def render(self,___name: str, **globals):
t = self.get_template(___name,globals=globals)
return await t.render_async(globals)
def setupTemplateEngine():
config = getConfig()
subffixes = [ i[0] for i in config.website.processors if i[1] == 'tmpl' ]
loader = TmplLoader(config.website.paths,
config.website.indexes,
subffixes,
inherit=True)
engine = TemplateEngine(loader)
g = ServerEnv()
g.tmpl_engine = engine


@ -0,0 +1,39 @@
from aiohttp import web
from p2psc.pubkey_handler import PubkeyHandler
from p2psc.p2psc import P2psc
from appPublic.jsonConfig import getConfig
class P2pLayer:
def __init__(self):
self.p2pcrypt = False
config = getConfig()
if config.website.p2pcrypt:
self.p2pcrypt = True
if not self.p2pcrypt:
return
self.handler = PubkeyHandler()
self.p2p = P2psc(self.handler, self.handler.get_myid())
@web.middleware
async def p2p_middle(self, request, handler):
if not self.p2pcrypt:
return await handler(request)
if request.headers.get('P2pHandShake', None):
return await self.p2p_handshake(request)
if request.headers.get('P2pdata', None):
request = await self.p2p_decode_request(request)
resp = await handler(request)
return await self.p2p_encode_response(resp)
return await handler(request)
async def p2p_handshake(self, request):
pass
async def p2p_decode_request(self, request):
pass
async def p2p_encode_response(self, response):
return response

476
ahserver/processorResource.py Normal file

@ -0,0 +1,476 @@
import os
import re
import codecs
import aiofiles
from traceback import print_exc
# from showcallstack import showcallstack
import asyncio
import json
from yarl import URL
import ssl
from aiohttp import client
from aiohttp_auth import auth
from appPublic.http_client import Http_Client
from functools import partial
from aiohttp_auth import auth
from aiohttp.web_urldispatcher import StaticResource, PathLike
from aiohttp.web_urldispatcher import Optional, _ExpectHandler
from aiohttp.web_urldispatcher import Path
from aiohttp.web_response import Response, StreamResponse
from aiohttp.web_exceptions import (
HTTPException,
HTTPExpectationFailed,
HTTPForbidden,
HTTPMethodNotAllowed,
HTTPNotFound,
HTTPFound,
)
from aiohttp.web_fileresponse import FileResponse
from aiohttp.web_request import Request
from aiohttp.web_response import Response, StreamResponse
from aiohttp.web_routedef import AbstractRouteDef
from aiohttp_session import get_session
from appPublic.jsonConfig import getConfig
from appPublic.dictObject import DictObject
from appPublic.i18n import getI18N
from appPublic.dictObject import DictObject, multiDict2Dict
from appPublic.timecost import TimeCost
from appPublic.timeUtils import timestampstr
from appPublic.log import clientinfo, info, debug, warning, error, critical, exception
from .baseProcessor import getProcessor, BricksUIProcessor, TemplateProcessor
from .baseProcessor import PythonScriptProcessor, MarkdownProcessor
from .xlsxdsProcessor import XLSXDataSourceProcessor
from .llmProcessor import LlmProcessor, LlmSProcessor, LlmAProcessor
from .websocketProcessor import WebsocketProcessor
from .xtermProcessor import XtermProcessor
from .sqldsProcessor import SQLDataSourceProcessor
from .functionProcessor import FunctionProcessor
from .proxyProcessor import ProxyProcessor
from .serverenv import ServerEnv
from .url2file import Url2File
from .filestorage import FileStorage, file_realpath
from .restful import DBCrud
from .dbadmin import DBAdmin
from .filedownload import file_download, path_decode
from .utils import unicode_escape
from .filetest import current_fileno
from .auth_api import user_login, user_logout, get_session_user, get_session_userinfo
def getHeaderLang(request):
al = request.headers.get('Accept-Language')
if al is None:
return 'en'
return al.split(',')[0]
def i18nDICT(request):
c = getConfig()
i18n = getI18N()
lang = getHeaderLang(request)
l = c.langMapping.get(lang,lang)
return json.dumps(i18n.getLangDict(l)).encode(c.website.coding)
class ProcessorResource(StaticResource,Url2File):
def __init__(self, prefix: str, directory: PathLike,
*, name: Optional[str]=None,
expect_handler: Optional[_ExpectHandler]=None,
chunk_size: int=256 * 1024,
show_index: bool=False, follow_symlinks: bool=False,
append_version: bool=False,
indexes:list=[],
processors:dict={}) -> None:
StaticResource.__init__(self,prefix, directory,
name=name,
expect_handler=expect_handler,
chunk_size=chunk_size,
show_index=show_index,
follow_symlinks=follow_symlinks,
append_version=append_version)
Url2File.__init__(self,directory,prefix,indexes,inherit=True)
gr = self._routes.get('GET')
self._routes.update({'POST':gr})
self._routes.update({'PUT':gr})
self._routes.update({'OPTIONS':gr})
self._routes.update({'DELETE':gr})
self._routes.update({'TRACE':gr})
self.y_processors = processors
self.y_prefix = prefix
self.y_directory = directory
self.y_indexes = indexes
self.y_env = DictObject()
def setProcessors(self, processors):
self.y_processors = processors
def setIndexes(self, indexes):
self.y_indexes = indexes
def abspath(self, request, path:str):
url = self.entireUrl(request, path)
path = self.url2path(url)
fname = self.url2file(path)
return fname
async def getPostData(self,request: Request) -> DictObject:
qd = {}
if request.query:
qd = multiDict2Dict(request.query)
reader = None
try:
reader = await request.multipart()
except:
# print('reader is None')
pass
if reader is None:
pd = await request.post()
pd = multiDict2Dict(pd)
if pd == {}:
if request.can_read_body:
x = await request.read()
try:
pd = json.loads(x)
except:
# print('body is not a json')
pass
qd.update(pd)
return DictObject(**qd)
ns = qd
while 1:
try:
field = await reader.next()
if not field:
break
value = ''
if hasattr(field,'filename') and field.filename is not None:
saver = FileStorage()
userid = await get_session_user(request)
value = await saver.save(field.filename,field.read_chunk, userid=userid)
else:
value = await field.read(decode=True)
value = value.decode('utf-8')
ov = ns.get(field.name)
if ov:
if type(ov) == type([]):
ov.append(value)
else:
ov = [ov,value]
else:
ov = value
ns.update({field.name:ov})
# print(f'getPostData():{ns=}')
except Exception as e:
print(e)
print_exc()
print('-----------except out ------------')
break;
return DictObject(ns)
def parse_request(self, request):
"""
get real schema, host, port, prepath
and save it to self._{attr}
"""
self._scheme = request.scheme
self._scheme = request.headers.get('X-Forwarded-Scheme',request.scheme)
k = request.host.split(':')
host = k[0]
port = 80
if len(k) == 2:
port = int(k[1])
elif self._scheme.lower() == 'https':
port = 443
self._host = request.headers.get('X-Forwarded-Host', host)
self._port = request.headers.get('X-Forwarded-Port', port)
self._prepath = request.headers.get('X-Forwarded-Prepath', '')
if self._prepath != '':
self._prepath = '/' + self._prepath
self._preurl = f'{self._scheme}://{self._host}:{self._port}{self._prepath}'
# print(f'{request.path=}, {self._preurl=}')
async def _handle(self,request:Request) -> StreamResponse:
clientkeys = {
"iPhone":"iphone",
"iPad":"ipad",
"Android":"androidpad",
"Windows Phone":"winphone",
"Windows NT[.]*Win64; x64":"pc",
}
def i18nDICT():
c = getConfig()
g = ServerEnv()
if not g.get('myi18n',False):
g.myi18n = getI18N()
lang = getHeaderLang(request)
l = c.langMapping.get(lang,lang)
return json.dumps(g.myi18n.getLangDict(l))
def getClientType(request):
agent = request.headers.get('user-agent')
if type(agent)!=type('') and type(agent)!=type(b''):
return 'pc'
for k in clientkeys.keys():
m = re.findall(k,agent)
if len(m)>0:
return clientkeys[k]
return 'pc'
def serveri18n(s):
lang = getHeaderLang(request)
c = getConfig()
g = ServerEnv()
if not g.get('myi18n',False):
g.myi18n = getI18N()
l = c.langMapping.get(lang,lang)
return g.myi18n(s,l)
async def getArgs() -> DictObject:
if request.method == 'POST':
return await self.getPostData(request)
ns = multiDict2Dict(request.query)
return DictObject(**ns)
async def redirect(url):
url = self.entireUrl(request, url)
raise HTTPFound(url)
async def remember_user(userid,
username='',
userorgid=''):
await user_login(request, userid,
username=username,
userorgid=userorgid)
async def getsession():
return await get_session(request)
async def remember_ticket(ticket):
await auth.remember_ticket(request, ticket)
async def get_ticket():
return await auth.get_ticket(request)
async def forget_user():
await user_logout(request)
async def get_username():
info = await get_session_userinfo(request)
return info.username
async def get_userinfo():
info = await get_session_userinfo(request)
return info
async def get_userorgid():
info = await get_session_userinfo(request)
return info.userorgid
async def get_user():
return await get_session_user(request)
self.parse_request(request)
self.y_env.i18n = serveri18n
self.y_env.get_session = getsession
self.y_env.file_realpath = file_realpath
self.y_env.redirect = redirect
self.y_env.info = info
self.y_env.error = error
self.y_env.debug = debug
self.y_env.clientinfo = clientinfo
self.y_env.warning = warning
self.y_env.critical = critical
self.y_env.exception = exception
self.y_env.remember_user = remember_user
self.y_env.forget_user = forget_user
self.y_env.get_user = get_user
self.y_env.get_username = get_username
self.y_env.get_userorgid = get_userorgid
self.y_env.get_userinfo = get_userinfo
self.y_env.i18nDict = i18nDICT
self.y_env.terminalType = getClientType(request)
self.y_env.entire_url = partial(self.entireUrl,request)
self.y_env.websocket_url = partial(self.websocketUrl,request)
self.y_env.abspath = self.abspath
self.y_env.request2ns = getArgs
self.y_env.aiohttp_client = client
self.y_env.resource = self
self.y_env.gethost = partial(self.gethost, request)
self.y_env.path_call = partial(self.path_call,request)
self.user = await auth.get_auth(request)
self.y_env.user = self.user
self.request_filename = self.url2file(str(request.path))
request['request_filename'] = self.request_filename
path = request.path
config = getConfig()
request['port'] = config.website.port
if config.website.dbadm and path.startswith(config.website.dbadm):
pp = path.split('/')[2:]
if len(pp)<3:
error('%s:not found' % str(request.url))
raise HTTPNotFound
dbname = pp[0]
tablename = pp[1]
action = pp[2]
adm = DBAdmin(request,dbname,tablename,action)
return await adm.render()
if config.website.dbrest and path.startswith(config.website.dbrest):
pp = path.split('/')[2:]
if len(pp)<2:
error('%s:not found' % str(request.url))
raise HTTPNotFound
dbname = pp[0]
tablename = pp[1]
id = None
if len(pp) > 2:
id = pp[2]
crud = DBCrud(request,dbname,tablename,id=id)
return await crud.dispatch()
if config.website.download and path.startswith(config.website.download):
pp = path.split('/')[2:]
if len(pp)<1:
error('%s:not found' % str(request.url))
raise HTTPNotFound
dp = '/'.join(pp)
path = path_decode(dp)
return await file_download(request, path)
processor = self.url2processor(request, str(request.url), self.request_filename)
if processor:
ret = await processor.handle(request)
return ret
if self.request_filename and await self.isHtml(self.request_filename):
return await self.html_handle(request, self.request_filename)
if self.request_filename and os.path.isdir(self.request_filename):
config = getConfig()
if not config.website.allowListFolder:
error('%s:not found' % str(request.url))
raise HTTPNotFound
# print(f'{self.request_filename=}, {str(request.url)=} handle as a normal file')
return await super()._handle(request)
def gethost(self, request):
host = request.headers.get('X-Forwarded-Host')
if host:
return host
host = request.headers.get('Host')
if host:
return host
return '/'.join(str(request.url).split('/')[:3])
async def html_handle(self,request,filepath):
async with aiofiles.open(filepath,'r', encoding='utf-8') as f:
txt = await f.read()
utxt = txt.encode('utf-8')
headers = {
'Content-Type': 'text/html; charset=utf-8',
'Accept-Ranges': 'bytes',
'Content-Length': str(len(utxt))
}
resp = Response(text=txt,headers=headers)
return resp
async def isHtml(self,fn):
try:
async with aiofiles.open(fn,'r',encoding='utf-8') as f:
b = await f.read()
while b[0] in ['\n',' ','\t']:
b = b[1:]
if b.lower().startswith('<html>'):
return True
if b.lower().startswith('<!doctype html>'):
return True
return False
except Exception as e:
return False
def url2processor(self, request, url, fpath):
config = getConfig()
url1 = url
url = self.entireUrl(request, url)
host = '/'.join(url.split('/')[:3])
path = '/' + '/'.join(url.split('/')[3:])
if config.website.startswiths:
for a in config.website.startswiths:
leading = self.entireUrl(request, a.leading)
if path.startswith(a.leading):
processor = FunctionProcessor(path,self,a)
return processor
if fpath is None:
print(f'fpath is None ..., {url=}, {url1=}')
return None
for word, handlername in self.y_processors:
if fpath.endswith(word):
Klass = getProcessor(handlername)
try:
processor = Klass(path,self)
# print(f'{f_cnt1=}, {f_cnt2=}, {f_cnt3=}, {f_cnt4=}, {f_cnt5=}')
return processor
except Exception as e:
print('Exception:',e, 'handlername=', handlername)
return None
return None
def websocketUrl(self, request, url):
url = self.entireUrl(request, url)
if url.startswith('https'):
return 'wss' + url[5:]
return 'ws' + url[4:]
def urlWebsocketify(self, url):
if url.endswith('.ws') or url.endswith('.wss'):
if url.startswith('https'):
return 'wss' + url[5:]
return 'ws' + url[4:]
return url
def entireUrl(self, request, url):
url = '/'.join(url.split('\\/'))
ret_url = ''
if url.startswith('http://') or \
url.startswith('https://') or \
url.startswith('ws://') or \
url.startswith('wss://'):
ret_url = url
elif url.startswith('/'):
u = f'{self._preurl}{url}'
# print(f'entireUrl(), {u=}, {url=}, {self._preurl=}')
ret_url = u
else:
path = request.path
p = self.relatedurl(path,url)
u = f'{self._preurl}{p}'
ret_url = u
return self.urlWebsocketify(ret_url)
def url2path(self, url):
if url.startswith(self._preurl):
return url[len(self._preurl):]
return url
async def path_call(self, request, path, params={}):
url = self.entireUrl(request, path)
debug(f'{path=}, after entireUrl(), {url=}')
path = self.url2path(url)
fpath = self.url2file(path)
processor = self.url2processor(request, path, fpath)
debug(f'path_call(), {path=}, {url=}, {fpath=}, {processor=}, {self._prepath}')
params['path'] = path
params['fpath'] = fpath
return await processor.be_call(request, params=params)
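_handle() above checks the dbadm, dbrest and download prefixes before falling back to the suffix processors and plain static files. A hedged client-side sketch of those URL shapes; the /_dbrest and /_download prefixes and the cfae/users names are placeholders for whatever config.website.* and your database actually contain:
```
# hypothetical client calls against a running ahserver instance
import asyncio
from aiohttp import ClientSession

async def demo(base='http://localhost:8080'):
    async with ClientSession() as sess:
        # dbrest:   /<dbrest-prefix>/<dbname>/<tablename>[/<id>]  -> DBCrud
        async with sess.get(f'{base}/_dbrest/cfae/users') as resp:
            print(await resp.json())
        # dbadm:    /<dbadm-prefix>/<dbname>/<tablename>/<action> -> DBAdmin
        # download: /<download-prefix>/<path>                     -> file_download
        async with sess.get(f'{base}/_download/tmp/report.xlsx') as resp:
            print(resp.status)

# asyncio.run(demo())
```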

View File

@ -0,0 +1,61 @@
import json
import aiohttp
from appPublic.log import info, debug, warning, error, critical, exception
# from appPublic.streamhttpclient import StreamHttpClient
from aiohttp import web, BasicAuth
from aiohttp import client
from .baseProcessor import *
class ProxyProcessor(BaseProcessor):
@classmethod
def isMe(self,name):
return name=='proxy'
async def path_call(self, request, params={}):
await self.set_run_env(request)
path = params.get('path', request.path)
url = self.resource.entireUrl(request, path)
ns = self.run_ns
ns.update(params)
te = self.run_ns['tmpl_engine']
txt = await te.render(url,**ns)
data = json.loads(txt)
debug('proxyProcessor: data=%s' % data)
return data
async def datahandle(self,request):
chunk_size = 40960
d = await self.path_call(request)
reqH = request.headers.copy()
auth = None
if d.get('user') and d.get('password'):
auth = BasicAuth(d['user'], d['password'])
if d.get('headers'):
reqH.update(d['headers'])
params = None
if d.get('params'):
params = d['params']
async with client.request(
d.get('method', request.method),
d['url'],
auth=auth,
headers = reqH,
allow_redirects=False,
params=params,
data=await request.read()) as res:
headers = res.headers.copy()
# body = await res.read()
self.retResponse = web.StreamResponse(
headers = headers,
status = res.status
# ,body=body
)
await self.retResponse.prepare(request)
async for chunk in res.content.iter_chunked(chunk_size):
await self.retResponse.write(chunk)
debug('proxy: datahandle() finish')
def setheaders(self):
pass

121
ahserver/restful.py Normal file
View File

@ -0,0 +1,121 @@
import os
import re
import traceback
from aiohttp.web_response import Response
from aiohttp.web_exceptions import (
HTTPException,
HTTPExpectationFailed,
HTTPForbidden,
HTTPMethodNotAllowed,
HTTPNotFound,
)
from aiohttp import web
from aiohttp.web_request import Request
from aiohttp.web import json_response
from sqlor.dbpools import DBPools
from appPublic.dictObject import multiDict2Dict
from appPublic.jsonConfig import getConfig
from .error import Error,Success
DEFAULT_METHODS = ('GET', 'POST', 'PUT', 'DELETE', 'HEAD', 'OPTIONS', 'TRACE')
class RestEndpoint:
def __init__(self):
self.methods = {}
for method_name in DEFAULT_METHODS:
method = getattr(self, method_name.lower(), None)
if method:
self.register_method(method_name, method)
def register_method(self, method_name, method):
self.methods[method_name.upper()] = method
async def dispatch(self):
method = self.methods.get(self.request.method.upper())
if not method:
raise HTTPMethodNotAllowed('', DEFAULT_METHODS)
return await method()
class DBCrud(RestEndpoint):
def __init__(self, request,dbname,tablename, id=None):
super().__init__()
self.dbname = dbname
self.tablename = tablename
self.request = request
self.db = DBPools()
self.id = id
async def options(self) -> Response:
try:
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.I(self.tablename)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='metaerror',msg='get metadata error'))
async def get(self) -> Response:
"""
query data
"""
try:
ns = multiDict2Dict(self.request.query)
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.R(self.tablename, ns)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='search error',msg='search error'))
async def post(self):
"""
insert data
"""
try:
ns = multiDict2Dict(await self.request.post())
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.C(self.tablename, ns)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='add error',msg='add error'))
async def put(self):
"""
update data
"""
try:
ns = multiDict2Dict(await self.request.post())
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.U(self.tablename, ns)
return json_response(Success(' '))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='update error',msg='update error'))
async def delete(self) -> Response:
"""
delete data
"""
try:
ns = multiDict2Dict(self.request.query)
with self.db.sqlorContext(self.dbname) as sor:
d = await sor.D(self.tablename, ns)
return json_response(Success(d))
except Exception as e:
print(e)
traceback.print_exc()
return json_response(Error(errno='delete error',msg='delete error'))

32
ahserver/serverenv.py Normal file
View File

@ -0,0 +1,32 @@
import re
from appPublic.Singleton import SingletonDecorator
from appPublic.dictObject import DictObject
@SingletonDecorator
class ServerEnv(DictObject):
pass
def get_serverenv(name):
g = ServerEnv()
return g.get(name)
def set_serverenv(name, value):
g = ServerEnv()
g[name] = value
clientkeys = {
"iPhone":"iphone",
"iPad":"ipad",
"Android":"androidpad",
"Windows Phone":"winphone",
"Windows NT[.]*Win64; x64":"pc",
}
def getClientType(request):
agent = request.headers.get('user-agent')
if not agent:
return 'pc'
for k in clientkeys.keys():
m = re.findall(k,agent)
if len(m)>0:
return clientkeys[k]
return 'pc'
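ServerEnv is a process-wide singleton DictObject, so every module sees the same object; a tiny sketch (the app_name key is made up):
```
from ahserver.serverenv import ServerEnv, set_serverenv, get_serverenv

set_serverenv('app_name', 'demo')
g = ServerEnv()          # every call returns the same instance
assert g.app_name == get_serverenv('app_name') == 'demo'
```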

View File

@ -0,0 +1,75 @@
import codecs
from .dsProcessor import DataSourceProcessor
from appPublic.jsonConfig import getConfig
from sqlor.dbpools import DBPools
import json
"""
sqlds file format:
{
"sqldesc":{
"sql_string":"select * from dbo.stock_daily_hist where stock_num=${stock_num}$ order by trade_date desc",
"db":"mydb",
"sortfield":"stock_date"
},
"arguments":[
{
"name":"stock_num",
"type":"str",
"iotype":"text",
"default":"600804"
}
],
"datadesc":[
{
}
]
}
"""
class SQLDataSourceProcessor(DataSourceProcessor):
@classmethod
def isMe(self,name):
return name=='sqlds'
def getArgumentsDesc(self,dict_data,ns,request):
desc = dict_data.get('arguments',None)
return desc
async def getDataDesc(self,dict_data,ns,request):
pool = DBPools()
@pool.runSQLResultFields
def sql(dbname,NS):
sqldesc = dict_data.get('sqldesc')
# print('sql(),sqldesc=',sqldesc)
return sqldesc
rec = dict_data.get('datadesc',None)
if rec is None:
sqldesc = dict_data.get('sqldesc')
ns = dict_data.get('arguments',{})
rec = [ r for r in sql(sqldesc['db'],ns) if r['name']!='_row_id' ]
dict_data['datadesc'] = rec
f = codecs.open(self.src_file,'w',self.config.website.coding)
b = json.dumps(dict_data, indent=4, ensure_ascii=False)
f.write(b)
f.close()
return rec
async def getData(self,dict_data,ns,request):
pool = DBPools()
@pool.runSQL
def sql(dbname,NS):
sqldesc = dict_data.get('sqldesc')
return sqldesc
db = dict_data['sqldesc']['db']
ret = [ i for i in await sql(db,ns) ]
return ret
async def getPagingData(self,dict_data,ns,request):
pool = DBPools()
@pool.runSQLPaging
def sql(dbname,NS):
sqldesc = dict_data.get('sqldesc')
return sqldesc
db = dict_data['sqldesc']['db']
ret = await sql(db,ns)
return ret

83
ahserver/uriop.py Normal file
View File

@ -0,0 +1,83 @@
#
import os
import codecs
from appPublic.jsonConfig import getConfig
from appPublic.folderUtils import folderInfo
class URIopException(Exception):
def __init__(self,errtype,errmsg):
self.errtype = errtype
self.errmsg = errmsg
super(URIopException,self).__init__('errtype=%s,errmsg=%s' % (errtype,errmsg))
def __str__(self):
return 'errtype=%s,errmsg=%s' % (self.errtype,self.errmsg)
class URIOp(object):
def __init__(self):
self.conf = getConfig()
self.realPath = os.path.abspath(self.conf.website.root)
def abspath(self,uri=None):
p = self.conf.website.root
if uri is not None and len(uri)>0:
x = uri
if x[0] == '/':
x = x[1:]
p = os.path.join(p,*x.split('/'))
d = os.path.abspath(p)
if len(d) < len(self.realPath):
raise URIopException('url scope error',uri);
if d[:len(self.realPath)] != self.realPath:
raise URIopException('url scope error',uri);
return d
def fileList(self,uri=''):
r = [ i for i in folderInfo(self.realPath,uri) ]
for i in r:
if i['type']=='dir':
i['state'] = 'closed'
i['id'] = '_#_'.join(i['id'].split('/'))
ret={
'total':len(r),
'rows':r
}
return ret
def mkdir(self,at_uri,name):
p = self.abspath(at_uri)
p = os.path.join(p,name)
os.mkdir(p)
def rename(self,uri,newname):
p = self.abspath(uri)
dir = os.path.dirname(p)
np = os.path.join(dir,newname)
os.rename(p,np)
def delete(self,uri):
p = self.abspath(uri)
os.remove(p)
def save(self,uri,data):
p = self.abspath(uri)
f = codecs.open(p,"w",self.conf.website.coding)
f.write(data)
f.close()
def read(self,uri):
p = self.abspath(uri)
f = codecs.open(p,"r",self.conf.website.coding)
b = f.read()
f.close()
return b
def write(self,uri,data):
p = self.abspath(uri)
f = codecs.open(p,"w",self.conf.website.coding)
f.write(data)
f.close()

114
ahserver/url2file.py Normal file
View File

@ -0,0 +1,114 @@
import os
from appPublic.folderUtils import listFile
class Url2File:
def __init__(self,path:str,prefix: str,
indexes: list, inherit: bool=False):
self.rootpath = path
self.starts = prefix
self.indexes = indexes
self.inherit = inherit
def realurl(self,url:str) -> str :
items = url.split('/')
items = [ i for i in items if i != '.' ]
while '..' in items:
for i,v in enumerate(items):
if v=='..' and i > 0:
del items[i]
del items[i-1]
break
return '/'.join(items)
def url2ospath(self, url: str) -> str:
url = url.split('?')[0]
if len(url) > 0 and url[-1] == '/':
url = url[:-1]
paths = url.split('/')
if url.startswith('http://') or \
url.startswith('https://') or \
url.startswith('ws://') or \
url.startswith('wss://'):
paths = paths[3:]
f = os.path.join(self.rootpath,*paths)
real_path = os.path.abspath(f)
# print(f'{real_path=}, {url=}, {f=}')
return real_path
def url2file(self, url: str) -> str:
ourl = url
url = url.split('?')[0]
real_path = self.url2ospath(url)
if os.path.isdir(real_path):
for idx in self.indexes:
p = os.path.join(real_path,idx)
if os.path.isfile(p):
# print(f'{url=}, {real_path=}, {idx=}, {p=}')
return p
if os.path.isfile(real_path):
return real_path
if not os.path.isdir(os.path.dirname(real_path)):
# print(f'url2file() return None, {real_path=}, {url=},{ourl=}, {self.rootpath=}')
return None
if not self.inherit:
# print(f'url2file() return None, self.inherit is false, {url:}, {self.rootpath=}')
return None
items = url.split('/')
if len(items) > 2:
del items[-2]
oldurl = url
url = '/'.join(items)
# print(f'{oldurl=}, {url=}')
return self.url2file(url)
# print(f'url2file() return None finally, {items:}, {url=}, {ourl=}, {self.rootpath=}')
return None
def relatedurl(self,url: str, name: str) -> str:
if len(url) > 0 and url[-1] == '/':
url = url[:-1]
fp = self.url2ospath(url)
if os.path.isfile(fp):
items = url.split('/')
del items[-1]
url = '/'.join(items)
url = url + '/' + name
return self.realurl(url)
def relatedurl2file(self,url: str, name: str):
url = self.relatedurl(url,name)
return self.url2file(url)
class TmplUrl2File:
def __init__(self,paths,indexes, subffixes=['.tmpl','.ui' ],inherit=False):
self.paths = paths
self.u2fs = [ Url2File(p,prefix,indexes,inherit=inherit) \
for p,prefix in paths ]
self.subffixes = subffixes
def url2file(self,url):
for u2f in self.u2fs:
fp = u2f.url2file(url)
if fp:
return fp
return None
def relatedurl(self,url: str, name: str) -> str:
for u2f in self.u2fs:
fp = u2f.relatedurl(url, name)
if fp:
return fp
return None
def list_tmpl(self):
ret = []
for rp,_ in self.paths:
p = os.path.abspath(rp)
[ ret.append(i) for i in listFile(p,suffixs=self.subffixes,rescursive=True) ]
return sorted(ret)
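A sketch of how Url2File resolves URLs, assuming a document root /srv/www containing index.html and js/app.js (all paths are made up):
```
from ahserver.url2file import Url2File

u2f = Url2File('/srv/www', '/', indexes=['index.html'], inherit=True)
print(u2f.url2file('/'))             # directory URL -> /srv/www/index.html
print(u2f.url2file('/js/app.js'))    # plain file    -> /srv/www/js/app.js
# with inherit=True, if /srv/www/sub/js exists but app.js is not in it,
# the lookup drops the innermost folder and retries:
# /sub/js/app.js -> /sub/app.js -> /app.js
print(u2f.url2file('/sub/js/app.js'))
```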

4
ahserver/utils.py Normal file
View File

@ -0,0 +1,4 @@
def unicode_escape(s):
x = [ch if ord(ch) < 256 else ch.encode('unicode_escape').decode('utf-8') for ch in s]
return ''.join(x)
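unicode_escape() keeps ASCII characters and escapes everything else, for example:
```
from ahserver.utils import unicode_escape

print(unicode_escape('abc中文'))   # prints: abc\u4e2d\u6587
```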

1
ahserver/version.py Normal file
View File

@ -0,0 +1 @@
__version__ = '0.3.4'

40
ahserver/webapp.py Normal file
View File

@ -0,0 +1,40 @@
import os, sys
import argparse
from appPublic.log import MyLogger, info, debug, warning
from appPublic.folderUtils import ProgramPath
from appPublic.jsonConfig import getConfig
from ahserver.configuredServer import ConfiguredServer
from ahserver.serverenv import ServerEnv
def webapp(init_func):
parser = argparse.ArgumentParser(prog="Sage")
parser.add_argument('-w', '--workdir')
parser.add_argument('-p', '--port')
args = parser.parse_args()
workdir = args.workdir or os.getcwd()
port = args.port
webserver(init_func, workdir, port)
def webserver(init_func, workdir, port=None):
p = ProgramPath()
config = getConfig(workdir, NS={'workdir':workdir, 'ProgramPath':p})
if config.logger:
logger = MyLogger(config.logger.name or 'webapp',
levelname=config.logger.levelname or 'info',
logfile=config.logger.logfile or None)
else:
logger = MyLogger('webapp', levelname='info')
init_func()
se = ServerEnv()
se.workdir = workdir
se.port = port
server = ConfiguredServer(workdir=workdir)
port = port or config.website.port or 8080
port = int(port)
server.run(port=port)
if __name__ == '__main__':
from main import main
webapp(main)
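webapp() parses -w/--workdir and -p/--port and then boots ConfiguredServer; a minimal entry module sketch matching the `from main import main` fallback above (the module name and the init body are hypothetical):
```
# main.py -- hypothetical entry module
from ahserver.webapp import webapp

def main():
    # one-time initialisation before the server starts:
    # load plugins, register ServerEnv helpers, set up database pools, ...
    pass

if __name__ == '__main__':
    # python main.py -w /path/to/app -p 8080
    webapp(main)
```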

View File

@ -0,0 +1,186 @@
import asyncio
import aiohttp
import aiofiles
import json
import codecs
from aiohttp import web
import aiohttp_cors
from traceback import print_exc
from appPublic.sshx import SSHNode
from appPublic.dictObject import DictObject
from appPublic.log import info, debug, warning, error, exception, critical
from .baseProcessor import BaseProcessor, PythonScriptProcessor
from .auth_api import get_session_user
async def ws_send(ws:web.WebSocketResponse, data):
info(f'data={data} {ws=}')
d = {
"type":1,
"data":data
}
d = json.dumps(d, indent=4, ensure_ascii=False)
try:
return await ws.send_str(d)
except Exception as e:
exception(f'ws.send_str() error: {e=}')
print_exc()
return False
class WsSession:
def __init__(self, session):
self.session = session
self.nodes = {}
def join(self, node):
self.nodes[node.id] = node
def leave(self, node):
self.nodes = {k:v for k,v in self.nodes.items() if k != node.id}
class WsData:
def __init__(self):
self.nodes = {}
self.sessions = {}
def add_node(self, node):
self.nodes[node.id] = node
def del_node(self, node):
self.nodes = {k:v for k,v in self.nodes.items() if k!=node.id}
def get_nodes(self):
return self.nodes
def get_node(self, id):
return self.nodes.get(id)
def add_session(self, session):
self.sessions[session.sessionid] = session
def del_session(self, session):
self.sessions = {k:v for k,v in self.sessions.items() if k != session.sessionid}
def get_session(self, id):
return self.sessions.get(id)
class WsPool:
def __init__(self, ws, ip, ws_path, app):
self.app = app
self.ip = ip
self.id = None
self.ws = ws
self.ws_path = ws_path
def get_data(self):
r = self.app.get_data(self.ws_path)
if r is None:
r = WsData()
self.set_data(r)
return r
def set_data(self, data):
self.app.set_data(self.ws_path, data)
def is_online(self, userid):
data = self.get_data()
node = data.get_node(userid)
if node is None:
return False
return True
def register(self, id):
iddata = DictObject()
iddata.id = id
self.add_me(iddata)
def add_me(self, iddata):
data = self.get_data()
iddata.ws = self.ws
iddata.ip = self.ip
self.id = iddata.id
data.add_node(iddata)
self.set_data(data)
def delete_id(self, id):
data = self.get_data()
node = data.get_node(id)
if node:
data.del_node(node)
self.set_data(data)
def delete_me(self):
self.delete_id(self.id)
def add_session(self, session):
data = self.get_data()
data.add_session(session)
self.set_data(data)
def del_session(self, session):
data = self.get_data()
data.del_session(session)
self.set_data(data)
def get_session(self, sessionid):
data = self.get_data()
return data.get_session(sessionid)
async def sendto(self, data, id=None):
if id is None:
return await ws_send(self.ws, data)
d = self.get_data()
iddata = d.get_node(id)
ws = iddata.ws
try:
return await ws_send(ws, data)
except:
self.delete_id(id)
class WebsocketProcessor(PythonScriptProcessor):
@classmethod
def isMe(self,name):
return name=='ws'
async def path_call(self, request,params={}):
cookie = request.headers.get('Sec-WebSocket-Protocol', None)
if cookie:
request.headers['Cookies'] = cookie
userid = await get_session_user(request)
debug(f'{cookie=}, {userid=}')
await self.set_run_env(request)
lenv = self.run_ns.copy()
lenv.update(params)
params_kw = lenv.params_kw
userid = lenv.params_kw.userid or await lenv.get_user()
del lenv['request']
txt = await self.loadScript(self.real_path)
ws = web.WebSocketResponse()
try:
await ws.prepare(request)
except Exception as e:
exception(f'--------except: {e}')
print_exc()
raise e
ws_pool = WsPool(ws, request['client_ip'], request.path, request.app)
debug(f'========== debug ===========')
async for msg in ws:
if msg.type == aiohttp.WSMsgType.TEXT:
if msg.data == '_#_heartbeat_#_':
await ws_send(ws, '_#_heartbeat_#_')
else:
lenv['ws_data'] = msg.data
lenv['ws_pool'] = ws_pool
exec(txt,lenv,lenv)
func = lenv['myfunc']
resp = await func(request,**lenv)
elif msg.type == aiohttp.WSMsgType.ERROR:
error('ws connection closed with exception %s' % ws.exception())
break
else:
info('datatype error', msg.type)
debug(f'========== ws connection end ===========')
ws_pool.delete_me()
self.retResponse = ws
await ws.close()
return ws

128
ahserver/xlsxData.py Normal file
View File

@ -0,0 +1,128 @@
from openpyxl import load_workbook
import json
"""
xlsxds file format:
{
"xlsxfile":"./data.xlsx",
"data_from":7,
"data_sheet":"Sheet1",
"label_at",1,
"name_at":null,
"datatype_at":2,
"ioattrs_at":3,
"listhide_at":4,
"inputhide_at":5,
"frozen_at":6
}
"""
class XLSXData:
def __init__(self,path,desc):
self.desc = desc
self.xlsxfile = path
self.workbook = load_workbook(self.xlsxfile)
self.ws = self.workbook[self.desc['data_sheet']]
def getBaseFieldsInfo(self):
ws = self.workbook[self.desc['data_sheet']]
ret = []
for y in range(1,ws.max_column+1):
r = {
'name':self._fieldName(ws,y),
'label':self._fieldLabel(ws,y),
'type':self._fieldType(ws,y),
'listhide':self._isListHide(ws,y),
'inputhide':self._isInputHide(ws,y),
'frozen':self._isFrozen(ws,y)
}
r.update(self._fieldIOattrs(ws,y))
ret.append(r)
return ret
def _fieldName(self,ws,i):
x = self.desc.get('name_at')
if x is not None:
return ws.cell(x,i).value
return 'f' + str(i)
def _fieldLabel(self,ws,i):
x = self.desc.get('label_at',1)
if x is not None:
return ws.cell(x,i).value
return 'f' + str(i)
def _fieldType(self,ws,i):
x = self.desc.get('datatype_at')
if x is not None:
return ws.cell(x,i).value
return 'str'
def _fieldIOattrs(self,ws,i):
x = self.desc.get('ioattrs_at')
if x is not None:
t = ws.cell(x,i).value
if t is not None:
try:
return json.loads(t)
except Exception as e:
print('xlsxData.py:field=',i,'t=',t,'error')
return {}
def _isFrozen(self,ws,i):
x = self.desc.get('frozen_at')
if x is not None:
t = ws.cell(x,i).value
if t == 'Y' or t == 'y':
return True
return False
def _isListHide(self,ws,i):
x = self.desc.get('listhide_at')
if x is not None:
t = ws.cell(x,i).value
if t == 'Y' or t == 'y':
return True
return False
def _isInputHide(self,ws,i):
x = self.desc.get('inputhide_at')
if x is not None:
t = ws.cell(x,i).value
if t == 'Y' or t == 'y':
return True
return False
def getPeriodData(self,min_r,max_r):
ws = self.ws
rows = []
assert(min_r >= self.desc.get('data_from',2))
if max_r > ws.max_row:
max_r = ws.max_row + 1;
if min_r <= max_r:
x = min_r;
while x < max_r:
d = {}
for y in range(1,ws.max_column+1):
name = self._fieldName(ws,y)
d.update({name:ws.cell(column=y,row=x).value})
rows.append(d)
x = x + 1
return rows
def getArgumentsDesc(self,ns,request):
return None
def getData(self,ns):
ws = self.ws
min_r = self.desc.get('data_from',2)
return self.getPeriodData(min_r,ws.max_row + 1)
def getPagingData(self,ns):
rows = int(ns.get('rows',50))
page = int(ns.get('page',1))
d1 = self.desc.get('data_from',2)
min_r = (page - 1) * rows + d1
max_r = page * rows + d1 + 1
rows = self.getPeriodData(min_r,max_r)
ret = {
'total':self.ws.max_row - d1,
'rows':rows
}
return ret
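A usage sketch for XLSXData, assuming ./data.xlsx has column labels in row 1 and data starting at row 2 (the file name and descriptor values are illustrative):
```
from ahserver.xlsxData import XLSXData

desc = {
    "data_sheet": "Sheet1",
    "label_at": 1,     # row holding the column labels
    "data_from": 2     # first data row
}
xd = XLSXData('./data.xlsx', desc)
print(xd.getBaseFieldsInfo())                      # name/label/type per column
print(xd.getPagingData({'rows': 10, 'page': 1}))   # {'total': ..., 'rows': [...]}
```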

View File

@ -0,0 +1,51 @@
import codecs
from openpyxl import load_workbook
from appPublic.jsonConfig import getConfig
from .dsProcessor import DataSourceProcessor
from .xlsxData import XLSXData
"""
xlsxds file format:
{
"xlsxfile":"./data.xlsx",
"data_from":7,
"data_sheet":"Sheet1",
"label_at",1,
"name_at":null,
"datatype_at":2,
"ioattrs":3,
"listhide_at":4,
"inputhide_at":5,
"frozen_at":6
}
"""
class XLSXDataSourceProcessor(DataSourceProcessor):
@classmethod
def isMe(self,name):
return name=='xlsxds'
def getArgumentsDesc(self,dict_data,ns,request):
return None
async def getDataDesc(self,dict_data,ns,request):
path = dict_data.get('xlsxfile',None)
self.xlsxdata = XLSXData(self.g.abspath(self.g.absurl(request,path)),dict_data)
ret = self.xlsxdata.getBaseFieldsInfo()
return ret
async def getData(self,dict_data,ns,request):
path = dict_data.get('xlsxfile',None)
self.xlsxdata = XLSXData(self.g.abspath(self.g.absurl(request,path)),dict_data)
ret = self.xlsxdata.getData(ns)
return ret
async def getPagingData(self,dict_data,ns,request):
path = dict_data.get('xlsxfile',None)
self.xlsxdata = XLSXData(self.g.abspath(self.g.absurl(request,path)),dict_data)
ret = self.xlsxdata.getPagingData(ns)
return ret

View File

@ -0,0 +1,90 @@
import asyncio
import aiohttp
import aiofiles
import json
import codecs
from aiohttp import web
import aiohttp_cors
from traceback import print_exc
from appPublic.sshx import SSHServer
from appPublic.dictObject import DictObject
from appPublic.log import info, debug, warning, error, exception, critical
from .baseProcessor import BaseProcessor, PythonScriptProcessor
class XtermProcessor(PythonScriptProcessor):
@classmethod
def isMe(self,name):
return name=='xterm'
async def ws_2_process(self, ws):
async for msg in ws:
if msg.type == aiohttp.WSMsgType.TEXT:
debug(f'recv from ws:{msg}')
resize_pattern = '_#_resize_#_'
heartbeat_pattern = '_#_heartbeat_#_'
if msg.data.startswith(resize_pattern):
row, col = [ int(i) for i in msg.data[len(resize_pattern):].split(',')]
await self.p_obj.set_terminal_size(row, col)
continue
if msg.data == heartbeat_pattern:
await self.ws_sendstr(ws, heartbeat_pattern)
continue
self.p_obj.stdin.write(msg.data)
elif msg.type == aiohttp.WSMsgType.ERROR:
# print('ws connection closed with exception %s' % ws.exception())
return
async def process_2_ws(self, ws):
while self.running:
x = await self.p_obj.stdout.read(1024)
await self.ws_sendstr(ws, x)
await asyncio.sleep(0)
async def datahandle(self,request):
await self.path_call(request)
async def path_call(self, request, params={}):
#
# an xterm file is a python script, like a dspy file;
# it must return a DictObject with ssh node login information
# parameters: nodeid
#
await self.set_run_env(request, params=params)
login_info = await super().path_call(request, params=params)
if login_info is None:
raise Exception('data error')
debug(f'{login_info=}')
ws = web.WebSocketResponse()
await ws.prepare(request)
await self.run_xterm(ws, login_info)
self.retResponse = ws
return ws
async def run_xterm(self, ws, login_info):
# id = lenv['params_kw'].get('termid')
self.sshnode = SSHServer(login_info)
async with self.sshnode.get_connector() as conn:
self.running = True
self.p_obj = await conn.create_process(term_type='xterm', term_size=(24, 80))
stdin_task = asyncio.create_task(self.ws_2_process(ws))
try:
while self.running:
x = await self.p_obj.stdout.read(1024)
await self.ws_sendstr(ws, x)
except (asyncio.CancelledError, EOFError):
pass
finally:
self.p_obj.close()
stdin_task.cancel()
async def ws_sendstr(self, ws:web.WebSocketResponse, s:str):
data = {
"type":1,
"data":s
}
debug(f'{data=}')
await ws.send_str(json.dumps(data, indent=4, ensure_ascii=False))
debug(f'{data=} sended')

4
build.sh Executable file
View File

@ -0,0 +1,4 @@
rm dist/*.whl
python setup.py install
python setup.py bdist_wheel
python -m twine upload --repository-url https://upload.pypi.org/legacy/ dist/*.whl

5
change.log Executable file
View File

@ -0,0 +1,5 @@
# 2023-06-18
changed the permission check into a single function, checkUserPermission(), in auth_api.py
# 2023-06-22
fixed a getPostData() bug so POST requests also pick up data passed in request.query

3
changelog.txt Executable file
View File

@ -0,0 +1,3 @@
2023-06-12
modified auth_api.py to expose only the needAuth() and checkUserPermission(user, path) methods

4
pyproject.toml Normal file
View File

@ -0,0 +1,4 @@
[build-system]
requires = ["setuptools>=61", "wheel"]
build-backend = "setuptools.build_meta"

31
setup.cfg Normal file
View File

@ -0,0 +1,31 @@
# setup.cfg
[metadata]
name = ahserver
version = 1.0.5
description = An application server based on aiohttp
author = yu moqing
author_email = yumoqing@gmail.com
license = MIT
[options]
packages = find:
python_requires = >=3.8
install_requires =
redis
asyncio
aiofiles
aiodns
aiohttp==3.10.10
aiohttp_session
aiohttp_auth_autz
aiohttp-cors
aiomysql
aioredis
psycopg2-binary
aiopg
jinja2
ujson
openpyxl
pillow
py-natpmp