Problem: I have a source schema (a set of tables) and a destination schema (a fixed number of tables with predefined columns) in PostgreSQL. I want to design a mapper/insert function that maps one or more source columns to a particular column of a table in the destination schema.

What I have tried:

Idea: I thought of constructing a JSON file with a {destination_column: source_column(s)} structure. Where a destination column is fed by multiple source columns, I would pass them as a list in the value of the dictionary. On top of that, I plan to create a mapper function that defines the operation on the source column(s) and an insert function that inserts the result after performing the operation. A rough sketch of what I mean follows.
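To make the idea concrete, here is a minimal sketch of what such a mapping file and mapper/insert pair could look like. The table name, column names, and the concatenation operation are made-up examples, and a psycopg2 connection is assumed:

```python
import json
import psycopg2  # assumed driver; any DB-API connection would work the same way

# Hypothetical mapping: {destination_column: source_column(s)}.
# A list value means several source columns feed one destination column.
MAPPING = json.loads("""
{
    "full_name": ["first_name", "last_name"],
    "email": "email_address"
}
""")

def map_row(src_row):
    """Build a destination-row dict from a source-row dict using MAPPING."""
    dest = {}
    for dest_col, src_cols in MAPPING.items():
        if isinstance(src_cols, list):
            # Example operation: concatenate multiple source columns.
            dest[dest_col] = " ".join(str(src_row[c]) for c in src_cols)
        else:
            dest[dest_col] = src_row[src_cols]
    return dest

def insert_rows(conn, src_rows, dest_table="customers"):
    """Map each source row and insert it into the destination table."""
    with conn.cursor() as cur:
        for src_row in src_rows:
            row = map_row(src_row)
            cols = ", ".join(row)
            placeholders = ", ".join("%s" for _ in row)
            cur.execute(
                f"INSERT INTO {dest_table} ({cols}) VALUES ({placeholders})",
                list(row.values()),
            )
    conn.commit()
```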


Question: Is this a reasonable, Pythonic way to solve the problem? Are there Python libraries (petl or Bonobo, for example) that already handle this kind of source-to-destination mapping? Thoughts?
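petl does cover this kind of work: it can read from and write to PostgreSQL through a DB-API connection and provides transform functions for renaming, dropping, and deriving columns. A rough sketch under those assumptions (the DSNs, table names, and column names below are invented for illustration):

```python
import petl as etl
import psycopg2

src_conn = psycopg2.connect("dbname=source_db")  # hypothetical source database
dst_conn = psycopg2.connect("dbname=dest_db")    # hypothetical destination database

# Read rows from the source schema.
table = etl.fromdb(
    src_conn,
    "SELECT first_name, last_name, email_address FROM people",
)

# Map/transform: combine two source columns into one, rename another.
table = etl.addfield(
    table, "full_name", lambda row: f"{row.first_name} {row.last_name}"
)
table = etl.cutout(table, "first_name", "last_name")
table = etl.rename(table, {"email_address": "email"})

# Append into an existing destination table whose columns match the result.
etl.appenddb(table, dst_conn, "customers")
```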
1 solution

Take a look at Orange: etl-tools-for-programmers~orange