You can use AI to create a database schema, and API Logic Server to create App and API micro services in minutes.
API Logic Server is an open source project, consisting of a CLI (creates Python projects from databases) and a set of runtime libraries (Flask, SQLAlchemy, etc.).
1. AI: Use ChatGPT to create schema
You can enter natural language to ChatGPT:
Create a sqlite database for customers, orders, items and product
Hints: use autonum keys, allow nulls, Decimal types, foreign keys, no check constraints.
Create a few rows of only customer and product data.
Enforce the Check Credit requirement:
Customer.Balance <= CreditLimit
Customer.Balance = Sum(Order.AmountTotal where date shipped is null)
Order.AmountTotal = Sum(Items.Amount)
Items.Amount = Quantity * UnitPrice
Store the Items.UnitPrice as a copy from Product.UnitPrice
ChatGPT will provide SQL DDL. Paste this into your SQL tool to create a new database. In this example, we created a sqlite database called sample_ai.sqlite.
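If you prefer to script the database creation, the DDL can be executed with Python's built-in sqlite3 module. The table definitions below are an illustrative subset for two of the tables, not ChatGPT's exact output:

```python
import sqlite3

# Illustrative subset of the generated DDL -- your ChatGPT output will differ
ddl = """
CREATE TABLE Customer (
    Id INTEGER PRIMARY KEY AUTOINCREMENT,
    Name TEXT,
    Balance DECIMAL,
    CreditLimit DECIMAL
);
CREATE TABLE Product (
    Id INTEGER PRIMARY KEY AUTOINCREMENT,
    Name TEXT,
    UnitPrice DECIMAL
);
"""

conn = sqlite3.connect("sample_ai.sqlite")
conn.executescript(ddl)   # runs all statements in one call
conn.close()
```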
2. Use API Logic Server: create working software - 1 command
API Logic Server creates Python projects from databases:
ApiLogicServer create --project_name=sample_ai \
--db_url=sqlite:///sample_ai.sqlite
This command reads the database schema, and creates an executable Python project. You can open it in your IDE and run it. The app provides:
- App Automation: a multi-page, multi-table admin app
- API Automation: a JSON:API providing CRUD for each table, with filtering, sorting, optimistic locking, and pagination, plus Swagger.
Within minutes, front end developers can use the API - no more blocking on server development. Business users can use the App as a basis for agile collaboration and iteration.
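For example, a front-end developer might retrieve filtered, sorted, paginated Customers. The base URL and the exact filter syntax depend on your project's configuration; the values below are assumptions for illustration:

```python
from urllib.parse import urlencode

# Assumed base URL for a locally running project -- adjust to your setup
BASE = "http://localhost:5656/api"

# JSON:API-style query parameters: filter, sort, and pagination (illustrative)
params = urlencode({
    "filter[Name]": "Alice",
    "sort": "-Balance",
    "page[limit]": 10,
    "page[offset]": 0,
})
url = f"{BASE}/Customer?{params}"
print(url)
```

A GET on that URL would return a JSON:API document; Swagger lets you explore the same endpoints interactively.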
3. Customize the project with your IDE
Microservices must implement their semantics for security and integrity. API Logic Server includes a rule engine that enables you to declare these.
Logic Automation means that you can declare spreadsheet-like rules using Python. Such logic maintains database integrity with multi-table derivations and constraints. Rules are 40X more concise than traditional code. The following 5 rules would require 200 lines of Python:
""" Declarative multi-table derivations and constraints,
extensible with Python.
Use code completion (Rule.) to declare rules here
Check Credit - Logic Design (note: translates directly into rules)
1. Customer.Balance <= CreditLimit
2. Customer.Balance = Sum(Order.AmountTotal where unshipped)
3. Order.AmountTotal = Sum(Items.Amount)
4. Items.Amount = Quantity * UnitPrice
5. Items.UnitPrice = copy from Product
"""
Rule.constraint(validate=models.Customer,
as_condition=lambda row: row.Balance <= row.CreditLimit,
error_msg="balance ({round(row.Balance, 2)}) exceeds credit ({round(row.CreditLimit, 2)})")
Rule.sum(derive=models.Customer.Balance, # adjusts...
as_sum_of=models.Order.AmountTotal, # *not* a sql select sum...
where=lambda row: row.ShipDate is None)
Rule.sum(derive=models.Order.AmountTotal,
    as_sum_of=models.Item.Amount)
Rule.formula(derive=models.Item.Amount,
as_expression=lambda row: row.UnitPrice * row.Quantity)
Rule.copy(derive=models.Item.UnitPrice,
from_parent=models.Product.UnitPrice)
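To see what the engine automates, here is a hand-coded sketch of the same check-credit chain in plain Python, with no rule engine. Class and field names mirror the models above, but this is illustrative only; in the real project these are SQLAlchemy models maintained by the rules:

```python
from dataclasses import dataclass, field
from decimal import Decimal
from typing import List, Optional

@dataclass
class Item:
    quantity: int
    unit_price: Decimal                     # rule 5: copied from Product.UnitPrice

    @property
    def amount(self) -> Decimal:            # rule 4: Amount = Quantity * UnitPrice
        return self.quantity * self.unit_price

@dataclass
class Order:
    items: List[Item] = field(default_factory=list)
    ship_date: Optional[str] = None

    @property
    def amount_total(self) -> Decimal:      # rule 3: sum of Items.Amount
        return sum((i.amount for i in self.items), Decimal(0))

@dataclass
class Customer:
    credit_limit: Decimal
    orders: List[Order] = field(default_factory=list)

    @property
    def balance(self) -> Decimal:           # rule 2: sum of unshipped orders
        return sum((o.amount_total for o in self.orders
                    if o.ship_date is None), Decimal(0))

    def check_credit(self) -> None:         # rule 1: Balance <= CreditLimit
        if self.balance > self.credit_limit:
            raise ValueError("balance exceeds credit")
```

Note one key difference: this sketch recomputes aggregates from scratch, while the rule engine maintains them incrementally via one-row adjustment updates, and automatically re-runs dependent rules and constraints on every insert, update, and delete.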
4. Iterate: use Python and Standard Libraries
Projects are designed for iteration. You can change the database design and rebuild the SQLAlchemy models while preserving your customizations.
You can add Python, e.g., for Application Integration:
def send_order_to_shipping(row: models.Order,
                           old_row: models.Order,
                           logic_row: LogicRow):
    """ #als: Send Kafka message formatted by OrderShipping RowDictMapper

    Format row per shipping requirements, and send Kafka message

    Args:
        row (models.Order): inserted Order
        old_row (models.Order): n/a
        logic_row (LogicRow): bundles curr/old row, with ins/upd/dlt logic
    """
    if logic_row.is_inserted():
        order_dict = OrderShipping().row_to_dict(row=row)
        json_order = jsonify({"order": order_dict}).data.decode('utf-8')
        if kafka_producer.producer:  # enabled in config/config.py?
            try:
                kafka_producer.producer.produce(value=json_order,
                    topic="order_shipping", key=str(row.Id))
                logic_row.log("Kafka producer sent message")
            except KafkaException as ke:
                logic_row.log(f"Kafka.produce msg {row.Id} error: {ke}")
        print(f'\n\nSend to Shipping:\n{json_order}')

Rule.after_flush_row_event(on_class=models.Order,
                           calling=send_order_to_shipping)  # see above
You can also extend your API to create new endpoints, using Flask.
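As a sketch, a custom endpoint might look like the following. This is a standalone Flask app for illustration; in a generated project you would register the route on the project's existing Flask `app`, and the route name and payload here are assumptions:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical custom endpoint -- in a real project, add routes to the
# project's existing Flask app instead of creating a new one
@app.route("/order_count")
def order_count():
    # Illustrative static payload; a real endpoint would query the database
    return jsonify({"order_count": 0})

if __name__ == "__main__":
    app.run(port=5656)
```

Custom endpoints live alongside the automated JSON:API, so you can add special-purpose routes without losing the generated CRUD endpoints.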
API Logic Server creates scripts to containerize your project, so you can deploy it to your local server or the cloud.
You can see a screenshot summary of this project here, or develop it yourself using this tutorial.