licenses (sequence, length 1–3) | version (677 classes) | tree_hash (40 chars) | path (1 class) | type (2 classes) | size (2–8 chars) | text (25–67.1M chars) | package_name (2–41 chars) | repo (33–86 chars) |
---|---|---|---|---|---|---|---|---|
[
"MIT"
] | 1.1.0 | b87c6fd8d4cd7342ae4c31746b9e18a4da6b827e | docs | 2031 | # The Zero ring
A number type that has only one value (zero) and needs no storage: see
<https://en.wikipedia.org/wiki/Zero_ring>.
* [GitHub](https://github.com/eschnett/ZeroRing.jl): Source code
repository
* [GitHub Actions](https://github.com/eschnett/ZeroRing.jl/actions): CI status
## Why would I need this?
The *zero ring* is a very special type of number: there is only one
value (zero). Booleans, for example, have two values (`false` and
`true`), `Int8` has 256 values, etc. A `ZeroElem` has only one value
(`zeroelem`). This is similar to `Nothing`, except that `ZeroElem` is
a subtype of `Number` whereas `Nothing` isn't, and it doesn't indicate
that some information is missing (as `Missing` would).
At times, you might have a data structure holding numbers, but be
interested only in the "skeleton" of the data structure, not in the
numbers it holds. Examples are:
- a graph with weighted edges, but the weights are not relevant
- a sparse matrix, but only the sparsity structure is interesting
## Why can't I use `Nothing` or `Missing` instead?
`Nothing` is not a subtype of `Number`, which makes some linear
algebra operations fail. It would, of course, in principle be possible
to make `Nothing` a subtype of `Number`, but this is not a good
idea, as `Nothing` is a general concept that has nothing to do with
numbers, addition, multiplication, etc.
In a similar manner, `Missing` indicates that certain information is
missing. This is not the case here; all the necessary information is
there. It makes sense to define `missing + zeroelem`, and the result
should be `missing`.
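Since `ZeroElem` is an ordinary subtype of `Number`, its behaviour can be sketched in a few lines. This is an illustrative sketch with hypothetical names (`MyZeroElem`, `myzeroelem`), not the package's actual source:
```Julia
# A singleton Number subtype: one value, hence zero storage per element.
struct MyZeroElem <: Number end
const myzeroelem = MyZeroElem()
Base.:+(::MyZeroElem, ::MyZeroElem) = myzeroelem
Base.:*(::MyZeroElem, ::MyZeroElem) = myzeroelem
Base.zero(::Type{MyZeroElem}) = myzeroelem

# Because MyZeroElem <: Number, Base's generic propagation rules apply:
# missing + myzeroelem === missing
```
This is exactly why a `Number` subtype is needed: the generic `missing + ::Number` methods (and linear-algebra code) only accept numbers.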
## What is the advantage of `ZeroElem`?
A `ZeroElem` number takes no storage:
```Julia
julia> @allocated Array{ZeroElem}(undef, 1000)
80
julia> @allocated Array{Nothing}(undef, 1000)
80
julia> @allocated Array{Bool}(undef, 1000)
1088
```
The 80 bytes reported here are for the array metadata (its size and
shape etc.); there are no actual data allocated.
| ZeroRing | https://github.com/eschnett/ZeroRing.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 896 | module Mux
export mux, stack, branch
using Base64: stringmime
# This might be the smallest core ever.
mux(f) = f
mux(m, f) = x -> m(f, x)
mux(ms...) = foldr(mux, ms)
stack(m) = m
stack(m, n) = (f, x) -> m(mux(n, f), x)
stack(ms...) = foldl(stack, ms)
branch(p, t) = (f, x) -> (p(x) ? t : f)(x)
branch(p, t...) = branch(p, mux(t...))
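# Unfolding the definitions above (illustrative):
#   mux(m1, m2, f)(x) == m1(y -> m2(f, y), x)
# i.e. each middleware `m` receives the rest of the chain as its first
# argument and the request as its second, and `branch(p, t)` diverts to
# `t` whenever the predicate `p` holds for the request.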
# May as well provide a few conveniences, though.
using Hiccup
include("lazy.jl")
include("server.jl")
include("backtrace_rewriting.jl")
include("basics.jl")
include("routing.jl")
include("websockets_integration.jl")
include("examples/mimetypes.jl")
include("examples/basic.jl")
include("examples/files.jl")
const defaults = stack(todict, basiccatch, splitquery, toresponse, assetserver, pkgfiles)
const wdefaults = stack(todict, wcatch, splitquery)
const prod_defaults = stack(todict, stderrcatch, splitquery, toresponse, assetserver, pkgfiles)
end
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 2156 | """
mux_showerror(io, exc, bt)
`showerror(io, exc, bt)`, but simplifies the printing of all those Mux closures.
"""
function mux_showerror(io, e, bt)
buf = IOBuffer()
showerror(buf, e, bt)
str = String(take!(buf))
write(io, rename_mux_closures(str))
end
"""
find_closing_char(str, idx, closing_char)
Find the index in `str` of the matching `closing_char` for the opening character at `idx`, or `nothing` if there is no matching character.
If there is a matching character, `str[idx:find_closing_char(str, idx, closing_char)]` will contain:
- n opening characters, where 1 ≤ n
- m closing characters, where 1 ≤ m ≤ n
The interior opening and closing characters need not be balanced.
# Examples
```
julia> find_closing_char("((()))", 1, ')')
6
julia> find_closing_char("Vector{Union{Int64, Float64}}()", 7, '}')
29
```
"""
function find_closing_char(str, idx, closing_char)
opening_char = str[idx]
open = 1
while open != 0 && idx < lastindex(str)
idx = nextind(str, idx)
char = str[idx]
if char == opening_char
open += 1
elseif char == closing_char
open -= 1
end
end
return open == 0 ? idx : nothing
end
"""
rename_mux_closures(str)
Replace all anonymous "Mux.var" closures in `str` with "Mux.Closure" to make backtraces easier to read.
"""
function rename_mux_closures(str)
maybe_idx = findfirst(r"Mux\.var\"#\w+#\w+\"{", str)
if isnothing(maybe_idx)
return str
else
start_idx, brace_idx = extrema(maybe_idx)
end
maybe_idx = find_closing_char(str, brace_idx, '}')
if !isnothing(maybe_idx)
suffix = maybe_idx == lastindex(str) ? "" : str[nextind(str, maybe_idx):end]
str = str[1:prevind(str, start_idx)] * "Mux.Closure" * suffix
rename_mux_closures(str)
else
str
end
end
"""
Closure
Mux doesn't really use this type; we just print `Mux.Closure` instead of `Mux.var"#1#2"{Mux.var"#3#4"{...}}` in stacktraces to make them easier to read.
"""
struct Closure
Closure() = error("""Mux doesn't really use this type; we just print `Mux.Closure` instead of `Mux.var"#1#2"{Mux.var"#3#4"{...}}` in stacktraces to make them easier to read.""")
end
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 2794 | import HTTP
import HTTP.Request
export respond
# Utils
pre(f) = (app, req) -> app(f(req))
post(f) = (app, req) -> f(app(req))
# Request
using HTTP.URIs: URI
function todict(req::Request)
req′ = Dict()
req′[:method] = req.method
req′[:headers] = req.headers
req′[:data] = req.body
req′[:uri] = URI(req.target)
req′[:cookies] = HTTP.cookies(req)
return req′
end
todict(app, req) = app(todict(req))
function splitquery(app, req)
uri = req[:uri]
req[:path] = splitpath(uri.path)
req[:query] = uri.query
app(req)
end
params!(req) = get!(req, :params, d())
# Response
Response(d::AbstractDict) =
HTTP.Response(get(d, :status, 200),
get(d, :headers, HTTP.Headers());
body = get(d, :body, ""))
Response(o) = HTTP.Response(stringmime(MIME"text/html"(), o))
response(d) = d
response(s::AbstractString) = d(:body=>s)
toresponse(app, req) = Response(response(app(req)))
respond(res) = req -> response(res)
reskey(k, v) = post(res -> merge!(res, d(k=>v)))
status(s) = reskey(:status, s)
# Error handling
mux_css = """
body { font-family: sans-serif; padding:50px; }
.box { background: #fcfcff; padding:20px; border: 1px solid #ddd; border-radius:5px; white-space: pre-wrap; word-wrap: break-word; }
pre { line-height:1.5 }
a { text-decoration:none; color:#225; }
a:hover { color:#336; }
u { cursor: pointer }
"""
error_phrases = ["Looks like someone needs to pay their developers more."
"Someone order a thousand more monkeys! And a million more typewriters!"
"Maybe it's time for some sleep?"
"Don't bother debugging this one – it's almost definitely a quantum thingy."
"It probably won't happen again though, right?"
"F5! F5! F5!"
"F5! F5! FFS!"
"On the bright side, nothing has exploded. Yet."
"If this error has frustrated you, try clicking <u>here</u>."]
function basiccatch(app, req)
try
app(req)
catch e
io = IOBuffer()
println(io, "<style>", mux_css, "</style>")
println(io, "<h1>Internal Error</h1>")
println(io, "<p>$(error_phrases[rand(1:length(error_phrases))])</p>")
println(io, "<pre class=\"box\">")
mux_showerror(io, e, catch_backtrace())
println(io, "</pre>")
return d(:status => 500, :body => codeunits(String(take!(io))))
end
end
function stderrcatch(app, req)
try
app(req)
catch e
showerror(stderr, e, catch_backtrace())
return d(:status => 500, :body => codeunits("Internal server error"))
end
end
function prettierstderrcatch(app, req)
try
app(req)
catch e
mux_showerror(stderr, e, catch_backtrace())
return d(:status => 500, :body => codeunits("Internal server error"))
end
end
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 212 | # Just the bits of Lazy.jl that we actually use.
macro errs(ex)
:(try $(esc(ex))
catch e
showerror(stderr, e, catch_backtrace())
println(stderr)
end)
end
d(xs...) = Dict{Any, Any}(xs...)
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 1955 | import HTTP
export method, GET, route, page, probability, query
# Request type
method(m::AbstractString, app...) = branch(req -> req[:method] == m, app...)
method(ms, app...) = branch(req -> req[:method] in ms, app...)
GET(app...) = method("GET", app...)
# Path routing
splitpath(p::AbstractString) = split(p, "/", keepempty=false)
splitpath(p) = p
function matchpath(target, path)
length(target) > length(path) && return
params = d()
for i = 1:length(target)
if startswith(target[i], ":")
params[Symbol(target[i][2:end])] = path[i]
else
target[i] == path[i] || return
end
end
return params
end
function matchpath!(target, req)
ps = matchpath(target, req[:path])
ps === nothing && return false
merge!(params!(req), ps)
splice!(req[:path], 1:length(target))
return true
end
route(p::Vector, app...) = branch(req -> matchpath!(p, req), app...)
route(p::AbstractString, app...) = route(splitpath(p), app...)
route(app...) = route([], app...)
route(app::Base.Callable, p) = route(p, app)
route(app1::Base.Callable, app2::Base.Callable) = route([], app1, app2)
page(p::Vector, app...) = branch(req -> length(p) == length(req[:path]) && matchpath!(p, req), app...)
page(p::AbstractString, app...) = page(splitpath(p), app...)
page(app...) = page([], app...)
page(app::Base.Callable, p) = page(p, app)
page(app1::Base.Callable, app2::Base.Callable) = page([], app1, app2)
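# Illustrative usage (not part of this file):
#   route("api/:id", handler) matches any request whose path starts with
#   ["api", x], records req[:params][:id] == x, consumes those two path
#   segments, and calls `handler`; page("/about", handler) additionally
#   requires the *whole* remaining path to match.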
# Query routing
function matchquery(q, req)
qdict = HTTP.URIs.queryparams(req[:query])
length(q) != length(qdict) && return false
for (key, value) in q
if haskey(qdict, key) && (value == "" || value == qdict[key])
continue
else
return false
end
end
return true
end
query(q::Dict{<:AbstractString, <:AbstractString}, app...) =
branch(req -> matchquery(q, req), app...)
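# Illustrative usage:
#   query(Dict("action" => "show", "id" => ""), app) matches only requests
#   whose query string contains exactly the keys `action` and `id`, with
#   `action == "show"` and any value for `id` (an empty expected value
#   acts as a wildcard).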
# Misc
probability(x, app...) = branch(_->rand()<x, app...)
# Old typo
@deprecate probabilty(x, app...) probability(x, app...)
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 2465 | using Sockets
import Base.Meta.isexpr
import HTTP: WebSockets
export @app, serve
# `App` is just a box which allows the server to be
# redefined on the fly.
# In general these methods provide a simple way to
# get up and running, but aren't meant to be comprehensive.
mutable struct App
warez
end
macro app(def)
@assert isexpr(def, :(=))
name, warez = def.args
warez = isexpr(warez, :tuple) ? Expr(:call, :mux, map(esc, warez.args)...) : esc(warez)
quote
if $(Expr(:isdefined, esc(name)))
$(esc(name)).warez = $warez
else
$(esc(name)) = App($warez)
end
nothing
end
end
# conversion functions for known http_handler return objects
mk_response(d) = d
function mk_response(d::Dict)
r = HTTP.Response(get(d, :status, 200))
haskey(d, :body) && (r.body = d[:body])
haskey(d, :headers) && (r.headers = d[:headers])
return r
end
function http_handler(app::App)
handler = (req) -> mk_response(app.warez(req))
# handler.events["error"] = (client, error) -> println(error)
# handler.events["listen"] = (port) -> println("Listening on $port...")
return handler
end
function ws_handler(app::App)
handler = (sock) -> mk_response(app.warez(sock))
return handler
end
const default_port = 8000
const localhost = ip"0.0.0.0"
"""
serve(h::App, host=$localhost, port=$default_port; kws...)
serve(h::App, port::Int; kws...)
Serve the app `h` at the specified `host` and `port`. Keyword arguments are
passed to `HTTP.serve`.
Starts an async `Task`. Call `wait(serve(...))` in scripts where you want Julia
to wait until the server is terminated.
"""
function serve(h::App, host = localhost, port = default_port; kws...)
@errs HTTP.serve!(http_handler(h), host, port; kws...)
end
serve(h::App, port::Integer; kws...) = serve(h, localhost, port; kws...)
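# Illustrative usage (hypothetical app name `myapp`):
#   @app myapp = (Mux.defaults, page("/", respond("Hello")), Mux.notfound())
#   wait(serve(myapp, 8000))   # block until the server is terminated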
"""
serve(h::App, w::App, host=$localhost, port=$default_port; kwargs...)
serve(h::App, w::App, port::Integer; kwargs...)
Start a server that uses `h` to serve regular HTTP requests and `w` to serve
WebSocket requests.
"""
function serve(h::App, w::App, host = localhost, port = default_port; kws...)
server = HTTP.listen!(host, port; kws...) do http
if HTTP.WebSockets.isupgrade(http.message)
HTTP.WebSockets.upgrade(ws_handler(w), http)
else
HTTP.streamhandler(http_handler(h))(http)
end
end
return server
end
serve(h::App, w::App, port::Integer; kwargs...) = serve(h, w, localhost, port; kwargs...)
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 446 | using HTTP.WebSockets: WebSocket
function todict(sock::WebSocket)
req′ = todict(sock.request)
req′[:socket] = sock
return req′
end
function wcatch(app, req)
try
app(req)
catch e
println(stderr, "Error handling websocket connection:")
showerror(stderr, e, catch_backtrace())
end
end
function wclose(_, req)
close(req[:socket])
end
function echo(req)
sock = req[:socket]
for msg in sock
send(sock, msg)
end
end
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 75 | # Page not found
notfound(s = "Not found") = mux(status(404), respond(s))
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 3664 | using Hiccup, Pkg
import Hiccup.div
export files
Base.joinpath() = ""
function validpath(root, path; dirs = true)
full = normpath(root, path)
startswith(full, root) &&
(isfile(full) || (dirs && isdir(full)))
end
extension(f) = last(splitext(f))[2:end]
fileheaders(f) = d("Content-Type" => get(mimetypes, extension(f), "application/octet-stream"))
fileresponse(f) = d(:file => f,
:body => read(f),
:headers => fileheaders(f))
fresp(f) =
isfile(f) ? fileresponse(f) :
isdir(f) ? dirresponse(f) :
error("$f doesn't exist")
"""
files(root, dirs=true)
Middleware to serve files in the directory specified by the absolute path `root`.
`req[:path]` will be combined with `root` to yield a filepath.
If the filepath is contained within `root` (after normalisation) and refers to an existing file (or directory if `dirs=true`), then respond with the file (or a directory listing), otherwise call the next middleware.
If you'd like to specify a `root` relative to your current working directory
or to the directory containing the file that your server is defined in, then
you can use `pwd()` or `@__DIR__`, and (if you need them) `joinpath` or `normpath`.
# Examples
```
files(pwd()) # serve files from the current working directory
files(@__DIR__) # serve files from the directory the script is in
# serve files from the assets directory in the same directory the script is in:
files(joinpath(@__DIR__, "assets"))
# serve files from the assets directory in the directory above the directory the script is in:
files(normpath(@__DIR__, "../assets"))
```
"""
function files(root, dirs = true)
branch(req -> validpath(root, joinpath(req[:path]...), dirs=dirs),
req -> fresp(joinpath(root, req[:path]...)))
end
# Directories
files_css = """
table { width:100%; border-radius:5px; }
td { padding: 5px; }
tr:nth-child(odd) { background: #f4f4ff; }
.size { text-align: right; }
"""
function filelink(root, f)
isdir(joinpath(root, f)) && (f = "$f/")
a(d(:href=>f), f)
end
dirresponse(f) =
html(head(style([mux_css, files_css])),
body(h1("Files"),
div(".box", table([tr(td(".file", filelink(f, x)),
td(".size", string(filesize(joinpath(f, x)))))
for x in ["..", readdir(f)...]]))))
const ASSETS_DIR = "assets"
function packagefiles(dirs=true)
loadpaths = LOAD_PATH
function absdir(req)
pkg = req[:params][:pkg]
for p in loadpaths
dir = joinpath(p, pkg, ASSETS_DIR)
if isdir(dir)
return dir
end
end
Pkg.dir(String(pkg), ASSETS_DIR) # Pkg.dir doesn't take SubString
end
branch(req -> validpath(absdir(req), joinpath(req[:path]...), dirs=dirs),
req -> (Base.@warn("""
Relying on /pkg/ is now deprecated. Please use the package
`AssetRegistry.jl` instead to register asset directories.
""", maxlog=1);
fresp(joinpath(absdir(req), req[:path]...))))
end
const pkgfiles = route("pkg/:pkg", packagefiles(), Mux.notfound())
using AssetRegistry
function assetserve(dirs=true)
absdir(req) = AssetRegistry.registry["/assetserver/" * HTTP.unescapeuri(req[:params][:key])]
path(req) = HTTP.unescapeuri.(req[:path])
branch(req -> (isfile(absdir(req)) && isempty(req[:path])) ||
validpath(absdir(req), joinpath(path(req)...), dirs=dirs),
req -> fresp(joinpath(absdir(req), path(req)...)))
end
const assetserver = route("assetserver/:key", assetserve(), Mux.notfound())
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 37712 | # Note: this file was originally part of https://github.com/JuliaWeb/HttpServer.jl
# released under the following license:
#
# The MIT License (MIT)
#
# Copyright (c) 2013 Daniel Espeset, Zach Allaun, Leah Hanson
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
const mimetypes = Dict{AbstractString,AbstractString}([
("semd", "application/vnd.semd"),
("mrc", "application/marc"),
("asc", "application/pgp-signature"),
("fe_launch", "application/vnd.denovo.fcselayout-link"),
("esa", "application/vnd.osgi.subsystem"),
("sub", "text/vnd.dvb.subtitle"),
("clkx", "application/vnd.crick.clicker"),
("xdw", "application/vnd.fujixerox.docuworks"),
("nsc", "application/x-conference"),
("lostxml", "application/lost+xml"),
("rsd", "application/rsd+xml"),
("p8", "application/pkcs8"),
("dra", "audio/vnd.dra"),
("x3db", "model/x3d+binary"),
("text", "text/plain"),
("vox", "application/x-authorware-bin"),
("bmi", "application/vnd.bmi"),
("ma", "application/mathematica"),
("xbd", "application/vnd.fujixerox.docuworks.binder"),
("mgz", "application/vnd.proteus.magazine"),
("pgm", "image/x-portable-graymap"),
("xpi", "application/x-xpinstall"),
("htm", "text/html"),
("lasxml", "application/vnd.las.las+xml"),
("ait", "application/vnd.dvb.ait"),
("abw", "application/x-abiword"),
("wm", "video/x-ms-wm"),
("sdkd", "application/vnd.solent.sdkm+xml"),
("swi", "application/vnd.aristanetworks.swi"),
("iif", "application/vnd.shana.informed.interchange"),
("bpk", "application/octet-stream"),
("gxt", "application/vnd.geonext"),
("wpl", "application/vnd.ms-wpl"),
("sxd", "application/vnd.sun.xml.draw"),
("ktx", "image/ktx"),
("acc", "application/vnd.americandynamics.acc"),
("mxml", "application/xv+xml"),
("setpay", "application/set-payment-initiation"),
("aac", "audio/x-aac"),
("vsw", "application/vnd.visio"),
("uris", "text/uri-list"),
("mp4", "video/mp4"),
("st", "application/vnd.sailingtracker.track"),
("3ds", "image/x-3ds"),
("sgm", "text/sgml"),
("xwd", "image/x-xwindowdump"),
("slt", "application/vnd.epson.salt"),
("pya", "audio/vnd.ms-playready.media.pya"),
("dir", "application/x-director"),
("utz", "application/vnd.uiq.theme"),
("z6", "application/x-zmachine"),
("z4", "application/x-zmachine"),
("xvm", "application/xv+xml"),
("pki", "application/pkixcmp"),
("vor", "application/vnd.stardivision.writer"),
("vxml", "application/voicexml+xml"),
("plb", "application/vnd.3gpp.pic-bw-large"),
("tar", "application/x-tar"),
("dmp", "application/vnd.tcpdump.pcap"),
("gmx", "application/vnd.gmx"),
("spf", "application/vnd.yamaha.smaf-phrase"),
("susp", "application/vnd.sus-calendar"),
("xlam", "application/vnd.ms-excel.addin.macroenabled.12"),
("odf", "application/vnd.oasis.opendocument.formula"),
("wspolicy", "application/wspolicy+xml"),
("gml", "application/gml+xml"),
("msl", "application/vnd.mobius.msl"),
("mseq", "application/vnd.mseq"),
("rmvb", "application/vnd.rn-realmedia-vbr"),
("saf", "application/vnd.yamaha.smaf-audio"),
("fh7", "image/x-freehand"),
("uvvu", "video/vnd.uvvu.mp4"),
("mpkg", "application/vnd.apple.installer+xml"),
("shf", "application/shf+xml"),
("org", "application/vnd.lotus-organizer"),
("cpp", "text/x-c"),
("deploy", "application/octet-stream"),
("uvu", "video/vnd.uvvu.mp4"),
("z3", "application/x-zmachine"),
("application", "application/x-ms-application"),
("wps", "application/vnd.ms-works"),
("dtd", "application/xml-dtd"),
("cww", "application/prs.cww"),
("cmdf", "chemical/x-cmdf"),
("mpe", "video/mpeg"),
("sv4crc", "application/x-sv4crc"),
("appcache", "text/cache-manifest"),
("in", "text/plain"),
("png", "image/png"),
("wmv", "video/x-ms-wmv"),
("aep", "application/vnd.audiograph"),
("gim", "application/vnd.groove-identity-message"),
("sdd", "application/vnd.stardivision.impress"),
("ccxml", "application/ccxml+xml"),
("sh", "application/x-sh"),
("pptm", "application/vnd.ms-powerpoint.presentation.macroenabled.12"),
("sdkm", "application/vnd.solent.sdkm+xml"),
("ulx", "application/x-glulx"),
("xbap", "application/x-ms-xbap"),
("xlsm", "application/vnd.ms-excel.sheet.macroenabled.12"),
("n-gage", "application/vnd.nokia.n-gage.symbian.install"),
("stc", "application/vnd.sun.xml.calc.template"),
("icc", "application/vnd.iccprofile"),
("f4v", "video/x-f4v"),
("mmr", "image/vnd.fujixerox.edmics-mmr"),
("eml", "message/rfc822"),
("rgb", "image/x-rgb"),
("m3a", "audio/mpeg"),
("cdbcmsg", "application/vnd.contact.cmsg"),
("jlt", "application/vnd.hp-jlyt"),
("xar", "application/vnd.xara"),
("icm", "application/vnd.iccprofile"),
("etx", "text/x-setext"),
("iges", "model/iges"),
("clkw", "application/vnd.crick.clicker.wordbank"),
("uvva", "audio/vnd.dece.audio"),
("pcurl", "application/vnd.curl.pcurl"),
("rq", "application/sparql-query"),
("azf", "application/vnd.airzip.filesecure.azf"),
("gram", "application/srgs"),
("jad", "text/vnd.sun.j2me.app-descriptor"),
("mmf", "application/vnd.smaf"),
("c4u", "application/vnd.clonk.c4group"),
("mp4s", "application/mp4"),
("jsonml", "application/jsonml+json"),
("itp", "application/vnd.shana.informed.formtemplate"),
("nc", "application/x-netcdf"),
("qwd", "application/vnd.quark.quarkxpress"),
("spp", "application/scvp-vp-response"),
("pqa", "application/vnd.palm"),
("uvv", "video/vnd.dece.video"),
("pub", "application/x-mspublisher"),
("gph", "application/vnd.flographit"),
("rmp", "audio/x-pn-realaudio-plugin"),
("ott", "application/vnd.oasis.opendocument.text-template"),
("cdkey", "application/vnd.mediastation.cdkey"),
("wqd", "application/vnd.wqd"),
("gqf", "application/vnd.grafeq"),
("sv4cpio", "application/x-sv4cpio"),
("skm", "application/vnd.koan"),
("swa", "application/x-director"),
("html", "text/html"),
("uvz", "application/vnd.dece.zip"),
("p7s", "application/pkcs7-signature"),
("uvvs", "video/vnd.dece.sd"),
("p7r", "application/x-pkcs7-certreqresp"),
("wrl", "model/vrml"),
("f77", "text/x-fortran"),
("uvvd", "application/vnd.dece.data"),
("crt", "application/x-x509-ca-cert"),
("ppt", "application/vnd.ms-powerpoint"),
("smzip", "application/vnd.stepmania.package"),
("osf", "application/vnd.yamaha.openscoreformat"),
("c11amc", "application/vnd.cluetrust.cartomobile-config"),
("m4u", "video/vnd.mpegurl"),
("mpt", "application/vnd.ms-project"),
("plf", "application/vnd.pocketlearn"),
("cbt", "application/x-cbr"),
("mseed", "application/vnd.fdsn.mseed"),
("ecma", "application/ecmascript"),
("srx", "application/sparql-results+xml"),
("cxt", "application/x-director"),
("mwf", "application/vnd.mfer"),
("pkg", "application/octet-stream"),
("ami", "application/vnd.amiga.ami"),
("mvb", "application/x-msmediaview"),
("xdssc", "application/dssc+xml"),
("rss", "application/rss+xml"),
("s", "text/x-asm"),
("odp", "application/vnd.oasis.opendocument.presentation"),
("mng", "video/x-mng"),
("lvp", "audio/vnd.lucent.voice"),
("mj2", "video/mj2"),
("uvp", "video/vnd.dece.pd"),
("dts", "audio/vnd.dts"),
("tga", "image/x-tga"),
("h263", "video/h263"),
("mpp", "application/vnd.ms-project"),
("xvml", "application/xv+xml"),
("mdi", "image/vnd.ms-modi"),
("fnc", "application/vnd.frogans.fnc"),
("json", "application/json"),
("otp", "application/vnd.oasis.opendocument.presentation-template"),
("m4v", "video/x-m4v"),
("oti", "application/vnd.oasis.opendocument.image-template"),
("ps", "application/postscript"),
("kpxx", "application/vnd.ds-keypoint"),
("m13", "application/x-msmediaview"),
("torrent", "application/x-bittorrent"),
("shar", "application/x-shar"),
("acutc", "application/vnd.acucorp"),
("smil", "application/smil+xml"),
("wcm", "application/vnd.ms-works"),
("uvvt", "application/vnd.dece.ttml+xml"),
("xop", "application/xop+xml"),
("knp", "application/vnd.kinar"),
("cpio", "application/x-cpio"),
("mc1", "application/vnd.medcalcdata"),
("svg", "image/svg+xml"),
("blb", "application/x-blorb"),
("u32", "application/x-authorware-bin"),
("hlp", "application/winhlp"),
("jpgv", "video/jpeg"),
("onetmp", "application/onenote"),
("flac", "audio/x-flac"),
("ssdl", "application/ssdl+xml"),
("sfs", "application/vnd.spotfire.sfs"),
("sdc", "application/vnd.stardivision.calc"),
("dotx", "application/vnd.openxmlformats-officedocument.wordprocessingml.template"),
("uvvx", "application/vnd.dece.unspecified"),
("uvvp", "video/vnd.dece.pd"),
("atc", "application/vnd.acucorp"),
("cfs", "application/x-cfs-compressed"),
("mets", "application/mets+xml"),
("wav", "audio/x-wav"),
("mods", "application/mods+xml"),
("thmx", "application/vnd.ms-officetheme"),
("mbk", "application/vnd.mobius.mbk"),
("cpt", "application/mac-compactpro"),
("fzs", "application/vnd.fuzzysheet"),
("mrcx", "application/marcxml+xml"),
("mlp", "application/vnd.dolby.mlp"),
("emma", "application/emma+xml"),
("ecelp9600", "audio/vnd.nuera.ecelp9600"),
("dvi", "application/x-dvi"),
("lrm", "application/vnd.ms-lrm"),
("mif", "application/vnd.mif"),
("mcd", "application/vnd.mcd"),
("cxx", "text/x-c"),
("dist", "application/octet-stream"),
("pml", "application/vnd.ctc-posml"),
("t3", "application/x-t3vm-image"),
("ots", "application/vnd.oasis.opendocument.spreadsheet-template"),
("umj", "application/vnd.umajin"),
("sit", "application/x-stuffit"),
("pptx", "application/vnd.openxmlformats-officedocument.presentationml.presentation"),
("css", "text/css"),
("g2w", "application/vnd.geoplan"),
("spx", "audio/ogg"),
("eva", "application/x-eva"),
("et3", "application/vnd.eszigno3+xml"),
("dart", "application/vnd.dart"),
("3gp", "video/3gpp"),
("kne", "application/vnd.kinar"),
("rnc", "application/relax-ng-compact-syntax"),
("m14", "application/x-msmediaview"),
("urls", "text/uri-list"),
("vst", "application/vnd.visio"),
("xslt", "application/xslt+xml"),
("xspf", "application/xspf+xml"),
("g3w", "application/vnd.geospace"),
("mxs", "application/vnd.triscape.mxs"),
("les", "application/vnd.hhe.lesson-player"),
("txf", "application/vnd.mobius.txf"),
("f", "text/x-fortran"),
("ief", "image/ief"),
("dxp", "application/vnd.spotfire.dxp"),
("tsd", "application/timestamped-data"),
("bed", "application/vnd.realvnc.bed"),
("azs", "application/vnd.airzip.filesecure.azs"),
("pfr", "application/font-tdpfr"),
("seed", "application/vnd.fdsn.seed"),
("mobi", "application/x-mobipocket-ebook"),
("dsc", "text/prs.lines.tag"),
("man", "text/troff"),
("deb", "application/x-debian-package"),
("asx", "video/x-ms-asf"),
("xlsb", "application/vnd.ms-excel.sheet.binary.macroenabled.12"),
("cb7", "application/x-cbr"),
("weba", "audio/webm"),
("fdf", "application/vnd.fdf"),
("exi", "application/exi"),
("otg", "application/vnd.oasis.opendocument.graphics-template"),
("list", "text/plain"),
("mus", "application/vnd.musician"),
("cod", "application/vnd.rim.cod"),
("qxb", "application/vnd.quark.quarkxpress"),
("pot", "application/vnd.ms-powerpoint"),
("mads", "application/mads+xml"),
("cab", "application/vnd.ms-cab-compressed"),
("fh", "image/x-freehand"),
("xlm", "application/vnd.ms-excel"),
("ogx", "application/ogg"),
("pkipath", "application/pkix-pkipath"),
("xaml", "application/xaml+xml"),
("vtu", "model/vnd.vtu"),
("fgd", "application/x-director"),
("ngdat", "application/vnd.nokia.n-gage.data"),
("scm", "application/vnd.lotus-screencam"),
("txd", "application/vnd.genomatix.tuxedo"),
("ptid", "application/vnd.pvi.ptid1"),
("nnd", "application/vnd.noblenet-directory"),
("nml", "application/vnd.enliven"),
("x3d", "model/x3d+xml"),
("cc", "text/x-c"),
("mb", "application/mathematica"),
("ahead", "application/vnd.ahead.space"),
("sisx", "application/vnd.symbian.install"),
("pas", "text/x-pascal"),
("otf", "application/x-font-otf"),
("tr", "text/troff"),
("cdmia", "application/cdmi-capability"),
("joda", "application/vnd.joost.joda-archive"),
("tmo", "application/vnd.tmobile-livetv"),
("dll", "application/x-msdownload"),
("mesh", "model/mesh"),
("geo", "application/vnd.dynageo"),
("m3u8", "application/vnd.apple.mpegurl"),
("p12", "application/x-pkcs12"),
("acu", "application/vnd.acucobol"),
("bat", "application/x-msdownload"),
("wsdl", "application/wsdl+xml"),
("cmp", "application/vnd.yellowriver-custom-menu"),
("odft", "application/vnd.oasis.opendocument.formula-template"),
("smv", "video/x-smv"),
("dae", "model/vnd.collada+xml"),
("efif", "application/vnd.picsel"),
("aso", "application/vnd.accpac.simply.aso"),
("cdf", "application/x-netcdf"),
("gv", "text/vnd.graphviz"),
("xls", "application/vnd.ms-excel"),
("bdf", "application/x-font-bdf"),
("gbr", "application/rpki-ghostbusters"),
("evy", "application/x-envoy"),
("stf", "application/vnd.wt.stf"),
("onetoc", "application/onenote"),
("oa2", "application/vnd.fujitsu.oasys2"),
("jpeg", "image/jpeg"),
("djv", "image/vnd.djvu"),
("aifc", "audio/x-aiff"),
("docm", "application/vnd.ms-word.document.macroenabled.12"),
("stl", "application/vnd.ms-pki.stl"),
("uvg", "image/vnd.dece.graphic"),
("m21", "application/mp21"),
("stw", "application/vnd.sun.xml.writer.template"),
("p7b", "application/x-pkcs7-certificates"),
("ai", "application/postscript"),
("wmz", "application/x-msmetafile"),
("pbm", "image/x-portable-bitmap"),
("cbz", "application/x-cbr"),
("mks", "video/x-matroska"),
("mmd", "application/vnd.chipnuts.karaoke-mmd"),
("p", "text/x-pascal"),
("sitx", "application/x-stuffitx"),
("xenc", "application/xenc+xml"),
("jpg", "image/jpeg"),
("clkp", "application/vnd.crick.clicker.palette"),
("uvvg", "image/vnd.dece.graphic"),
("cdx", "chemical/x-cdx"),
("wad", "application/x-doom"),
("yin", "application/yin+xml"),
("emz", "application/x-msmetafile"),
("stk", "application/hyperstudio"),
("nlu", "application/vnd.neurolanguage.nlu"),
("sig", "application/pgp-signature"),
("opf", "application/oebps-package+xml"),
("box", "application/vnd.previewsystems.box"),
("log", "text/plain"),
("dxr", "application/x-director"),
("igm", "application/vnd.insors.igm"),
("mag", "application/vnd.ecowin.chart"),
("rep", "application/vnd.businessobjects"),
("jpm", "video/jpm"),
("std", "application/vnd.sun.xml.draw.template"),
("clp", "application/x-msclip"),
("cmx", "image/x-cmx"),
("gif", "image/gif"),
("ods", "application/vnd.oasis.opendocument.spreadsheet"),
("z7", "application/x-zmachine"),
("ssml", "application/ssml+xml"),
("mfm", "application/vnd.mfmp"),
("teacher", "application/vnd.smart.teacher"),
("cat", "application/vnd.ms-pki.seccat"),
("z5", "application/x-zmachine"),
("kwt", "application/vnd.kde.kword"),
("uvvh", "video/vnd.dece.hd"),
("ei6", "application/vnd.pg.osasli"),
("vcard", "text/vcard"),
("ppm", "image/x-portable-pixmap"),
("rtx", "text/richtext"),
("opml", "text/x-opml"),
("epub", "application/epub+zip"),
("uvf", "application/vnd.dece.data"),
("ksp", "application/vnd.kde.kspread"),
("semf", "application/vnd.semf"),
("sc", "application/vnd.ibm.secure-container"),
("mpeg", "video/mpeg"),
("qxt", "application/vnd.quark.quarkxpress"),
("xif", "image/vnd.xiff"),
("js", "application/javascript"),
("ttl", "text/turtle"),
("sti", "application/vnd.sun.xml.impress.template"),
("sus", "application/vnd.sus-calendar"),
("odb", "application/vnd.oasis.opendocument.database"),
("qt", "video/quicktime"),
("gca", "application/x-gca-compressed"),
("g3", "image/g3fax"),
("sid", "image/x-mrsid-image"),
("ez2", "application/vnd.ezpix-album"),
("xltm", "application/vnd.ms-excel.template.macroenabled.12"),
("chm", "application/vnd.ms-htmlhelp"),
("pyv", "video/vnd.ms-playready.media.pyv"),
("kwd", "application/vnd.kde.kword"),
("mid", "audio/midi"),
("class", "application/java-vm"),
("eol", "audio/vnd.digital-winds"),
("uvh", "video/vnd.dece.hd"),
("dd2", "application/vnd.oma.dd2+xml"),
("dp", "application/vnd.osgi.dp"),
("mp2", "audio/mpeg"),
("zirz", "application/vnd.zul"),
("ez", "application/andrew-inset"),
("sldm", "application/vnd.ms-powerpoint.slide.macroenabled.12"),
("roff", "text/troff"),
("pnm", "image/x-portable-anymap"),
("crd", "application/x-mscardfile"),
("sxg", "application/vnd.sun.xml.writer.global"),
("sgml", "text/sgml"),
("uoml", "application/vnd.uoml+xml"),
("uvi", "image/vnd.dece.graphic"),
("daf", "application/vnd.mobius.daf"),
("imp", "application/vnd.accpac.simply.imp"),
("csp", "application/vnd.commonspace"),
("t", "text/troff"),
("dbk", "application/docbook+xml"),
("c4f", "application/vnd.clonk.c4group"),
("xlsx", "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"),
("dgc", "application/x-dgc-compressed"),
("aab", "application/x-authorware-bin"),
("svgz", "image/svg+xml"),
("ras", "image/x-cmu-raster"),
("ntf", "application/vnd.nitf"),
("omdoc", "application/omdoc+xml"),
("dtb", "application/x-dtbook+xml"),
("odg", "application/vnd.oasis.opendocument.graphics"),
("qxd", "application/vnd.quark.quarkxpress"),
("mny", "application/x-msmoney"),
("uvvf", "application/vnd.dece.data"),
("musicxml", "application/vnd.recordare.musicxml+xml"),
("yang", "application/yang"),
("chrt", "application/vnd.kde.kchart"),
("scd", "application/x-msschedule"),
("ecelp4800", "audio/vnd.nuera.ecelp4800"),
("kmz", "application/vnd.google-earth.kmz"),
("cdxml", "application/vnd.chemdraw+xml"),
("igl", "application/vnd.igloader"),
("ufd", "application/vnd.ufdl"),
("pbd", "application/vnd.powerbuilder6"),
("dcurl", "text/vnd.curl.dcurl"),
("sxc", "application/vnd.sun.xml.calc"),
("mp3", "audio/mpeg"),
("scq", "application/scvp-cv-request"),
("igx", "application/vnd.micrografx.igx"),
("onetoc2", "application/onenote"),
("bmp", "image/bmp"),
("ssf", "application/vnd.epson.ssf"),
("tif", "image/tiff"),
("dataless", "application/vnd.fdsn.seed"),
("cmc", "application/vnd.cosmocaller"),
("qbo", "application/vnd.intu.qbo"),
("vcf", "text/x-vcard"),
("str", "application/vnd.pg.format"),
("c", "text/x-c"),
("flx", "text/vnd.fmi.flexstor"),
("gac", "application/vnd.groove-account"),
("gxf", "application/gxf"),
("zaz", "application/vnd.zzazz.deck+xml"),
("h261", "video/h261"),
("7z", "application/x-7z-compressed"),
("dwg", "image/vnd.dwg"),
("x32", "application/x-authorware-bin"),
("ttc", "application/x-font-ttf"),
("eot", "application/vnd.ms-fontobject"),
("wdb", "application/vnd.ms-works"),
("atomcat", "application/atomcat+xml"),
("ddd", "application/vnd.fujixerox.ddd"),
("ipfix", "application/ipfix"),
("potm", "application/vnd.ms-powerpoint.template.macroenabled.12"),
("dic", "text/x-c"),
("xul", "application/vnd.mozilla.xul+xml"),
("maker", "application/vnd.framemaker"),
("x3dv", "model/x3d+vrml"),
("rtf", "application/rtf"),
("nzb", "application/x-nzb"),
("fcs", "application/vnd.isac.fcs"),
("gre", "application/vnd.geometry-explorer"),
("woff", "application/font-woff"),
("fh4", "image/x-freehand"),
("obj", "application/x-tgif"),
("aif", "audio/x-aiff"),
("mxl", "application/vnd.recordare.musicxml"),
("jpgm", "video/jpm"),
("dxf", "image/vnd.dxf"),
("car", "application/vnd.curl.car"),
("xpl", "application/xproc+xml"),
("tei", "application/tei+xml"),
("paw", "application/vnd.pawaafile"),
("plc", "application/vnd.mobius.plc"),
("mqy", "application/vnd.mobius.mqy"),
("m2v", "video/mpeg"),
("tex", "application/x-tex"),
("crl", "application/pkix-crl"),
("xap", "application/x-silverlight-app"),
("fvt", "video/vnd.fvt"),
("mk3d", "video/x-matroska"),
("wax", "audio/x-ms-wax"),
("htke", "application/vnd.kenameaapp"),
("cst", "application/x-director"),
("i2g", "application/vnd.intergeo"),
("webm", "video/webm"),
("lwp", "application/vnd.lotus-wordpro"),
("aiff", "audio/x-aiff"),
("ink", "application/inkml+xml"),
("eps", "application/postscript"),
("clkt", "application/vnd.crick.clicker.template"),
("p10", "application/pkcs10"),
("csml", "chemical/x-csml"),
("qxl", "application/vnd.quark.quarkxpress"),
("vcd", "application/x-cdlink"),
("pvb", "application/vnd.3gpp.pic-bw-var"),
("smf", "application/vnd.stardivision.math"),
("ivp", "application/vnd.immervision-ivp"),
("listafp", "application/vnd.ibm.modcap"),
("tpl", "application/vnd.groove-tool-template"),
("gsf", "application/x-font-ghostscript"),
("rlc", "image/vnd.fujixerox.edmics-rlc"),
("unityweb", "application/vnd.unity"),
("fxpl", "application/vnd.adobe.fxp"),
("wri", "application/x-mswrite"),
("rp9", "application/vnd.cloanto.rp9"),
("so", "application/octet-stream"),
("pfb", "application/x-font-type1"),
("tcl", "application/x-tcl"),
("scurl", "text/vnd.curl.scurl"),
("vsd", "application/vnd.visio"),
("rmi", "audio/midi"),
("rdz", "application/vnd.data-vision.rdz"),
("frame", "application/vnd.framemaker"),
("mxf", "application/mxf"),
("tfi", "application/thraud+xml"),
("kon", "application/vnd.kde.kontour"),
("ifm", "application/vnd.shana.informed.formdata"),
("xbm", "image/x-xbitmap"),
("pre", "application/vnd.lotus-freelance"),
("c4d", "application/vnd.clonk.c4group"),
("ipk", "application/vnd.shana.informed.package"),
("ktr", "application/vnd.kahootz"),
("z1", "application/x-zmachine"),
("kia", "application/vnd.kidspiration"),
("hal", "application/vnd.hal+xml"),
("es3", "application/vnd.eszigno3+xml"),
("xht", "application/xhtml+xml"),
("uva", "audio/vnd.dece.audio"),
("exe", "application/x-msdownload"),
("cap", "application/vnd.tcpdump.pcap"),
("pfm", "application/x-font-type1"),
("xps", "application/vnd.ms-xpsdocument"),
("clkk", "application/vnd.crick.clicker.keyboard"),
("tra", "application/vnd.trueapp"),
("vis", "application/vnd.visionary"),
("osfpvg", "application/vnd.yamaha.openscoreformat.osfpvg+xml"),
("pclxl", "application/vnd.hp-pclxl"),
("oxt", "application/vnd.openofficeorg.extension"),
("xlc", "application/vnd.ms-excel"),
("vcx", "application/vnd.vcx"),
("ace", "application/x-ace-compressed"),
("wtb", "application/vnd.webturbo"),
("cct", "application/x-director"),
("scs", "application/scvp-cv-response"),
("sxm", "application/vnd.sun.xml.math"),
("mie", "application/x-mie"),
("tsv", "text/tab-separated-values"),
("f90", "text/x-fortran"),
("fti", "application/vnd.anser-web-funds-transfer-initiation"),
("c11amz", "application/vnd.cluetrust.cartomobile-config-pkg"),
("sil", "audio/silk"),
("vob", "video/x-ms-vob"),
("gnumeric", "application/x-gnumeric"),
("xdf", "application/xcap-diff+xml"),
("gpx", "application/gpx+xml"),
("fst", "image/vnd.fst"),
("srt", "application/x-subrip"),
("lrf", "application/octet-stream"),
("chat", "application/x-chat"),
("air", "application/vnd.adobe.air-application-installer-package+zip"),
("psf", "application/x-font-linux-psf"),
("hvp", "application/vnd.yamaha.hv-voice"),
("src", "application/x-wais-source"),
("fxp", "application/vnd.adobe.fxp"),
("ecelp7470", "audio/vnd.nuera.ecelp7470"),
("cii", "application/vnd.anser-web-certificate-issue-initiation"),
("oxps", "application/oxps"),
("cdmio", "application/cdmi-object"),
("atx", "application/vnd.antix.game-component"),
("wdp", "image/vnd.ms-photo"),
("pic", "image/x-pict"),
("qfx", "application/vnd.intu.qfx"),
("mar", "application/octet-stream"),
("3dml", "text/vnd.in3d.3dml"),
("gtm", "application/vnd.groove-tool-message"),
("wmd", "application/x-ms-wmd"),
("xz", "application/x-xz"),
("der", "application/x-x509-ca-cert"),
("me", "text/troff"),
("dssc", "application/dssc+der"),
("smi", "application/smil+xml"),
("nb", "application/mathematica"),
("pdf", "application/pdf"),
("atomsvc", "application/atomsvc+xml"),
("mpc", "application/vnd.mophun.certificate"),
("dotm", "application/vnd.ms-word.template.macroenabled.12"),
("mp4v", "video/mp4"),
("hvd", "application/vnd.yamaha.hv-dic"),
("uu", "text/x-uuencode"),
("afm", "application/x-font-type1"),
("p7c", "application/pkcs7-mime"),
("spq", "application/scvp-vp-request"),
("tpt", "application/vnd.trid.tpt"),
("twds", "application/vnd.simtech-mindmapper"),
("vcs", "text/x-vcalendar"),
("jisp", "application/vnd.jisp"),
("mathml", "application/mathml+xml"),
("sxw", "application/vnd.sun.xml.writer"),
("x3dbz", "model/x3d+binary"),
("x3dz", "model/x3d+xml"),
("edm", "application/vnd.novadigm.edm"),
("for", "text/x-fortran"),
("wbxml", "application/vnd.wap.wbxml"),
("pskcxml", "application/pskc+xml"),
("gtw", "model/vnd.gtw"),
("dcr", "application/x-director"),
("ogv", "video/ogg"),
("wmlsc", "application/vnd.wap.wmlscriptc"),
("potx", "application/vnd.openxmlformats-officedocument.presentationml.template"),
("mka", "audio/x-matroska"),
("adp", "audio/adpcm"),
("dna", "application/vnd.dna"),
("mp4a", "audio/mp4"),
("oas", "application/vnd.fujitsu.oasys"),
("fcdt", "application/vnd.adobe.formscentral.fcdt"),
("rms", "application/vnd.jcp.javame.midlet-rms"),
("pcap", "application/vnd.tcpdump.pcap"),
("rm", "application/vnd.rn-realmedia"),
("csv", "text/csv"),
("123", "application/vnd.lotus-1-2-3"),
("rcprofile", "application/vnd.ipunplugged.rcprofile"),
("msty", "application/vnd.muvee.style"),
("sxi", "application/vnd.sun.xml.impress"),
("xml", "application/xml"),
("wmf", "application/x-msmetafile"),
("psb", "application/vnd.3gpp.pic-bw-small"),
("taglet", "application/vnd.mynfc"),
("psd", "image/vnd.adobe.photoshop"),
("mxu", "video/vnd.mpegurl"),
("sql", "application/x-sql"),
("ppd", "application/vnd.cups-ppd"),
("odt", "application/vnd.oasis.opendocument.text"),
("dms", "application/octet-stream"),
("cer", "application/pkix-cert"),
("mpy", "application/vnd.ibm.minipay"),
("snf", "application/x-font-snf"),
("uvvm", "video/vnd.dece.mobile"),
("texi", "application/x-texinfo"),
("bin", "application/octet-stream"),
("h264", "video/h264"),
("jam", "application/vnd.jam"),
("cla", "application/vnd.claymore"),
("xm", "audio/xm"),
("dot", "application/msword"),
("dwf", "model/vnd.dwf"),
("ivu", "application/vnd.immervision-ivu"),
("skd", "application/vnd.koan"),
("mp2a", "audio/mpeg"),
("tcap", "application/vnd.3gpp2.tcap"),
("jnlp", "application/x-java-jnlp-file"),
("cdy", "application/vnd.cinderella"),
("ltf", "application/vnd.frogans.ltf"),
("bz", "application/x-bzip"),
("rar", "application/x-rar-compressed"),
("cif", "chemical/x-cif"),
("pfa", "application/x-font-type1"),
("wma", "audio/x-ms-wma"),
("fpx", "image/vnd.fpx"),
("apk", "application/vnd.android.package-archive"),
("dpg", "application/vnd.dpgraph"),
("wvx", "video/x-ms-wvx"),
("qps", "application/vnd.publishare-delta-tree"),
("karbon", "application/vnd.kde.karbon"),
("wml", "text/vnd.wap.wml"),
("udeb", "application/x-debian-package"),
("pgn", "application/x-chess-pgn"),
("c4p", "application/vnd.clonk.c4group"),
("uvvz", "application/vnd.dece.zip"),
("nns", "application/vnd.noblenet-sealer"),
("rl", "application/resource-lists+xml"),
("docx", "application/vnd.openxmlformats-officedocument.wordprocessingml.document"),
("rif", "application/reginfo+xml"),
("ms", "text/troff"),
("cu", "application/cu-seeme"),
("xfdl", "application/vnd.xfdl"),
("uvvi", "image/vnd.dece.graphic"),
("aam", "application/x-authorware-map"),
("cdmic", "application/cdmi-container"),
("odc", "application/vnd.oasis.opendocument.chart"),
("boz", "application/x-bzip2"),
("ram", "audio/x-pn-realaudio"),
("asf", "video/x-ms-asf"),
("uvd", "application/vnd.dece.data"),
("ext", "application/vnd.novadigm.ext"),
("swf", "application/x-shockwave-flash"),
("msh", "model/mesh"),
("hbci", "application/vnd.hbci"),
("doc", "application/msword"),
("davmount", "application/davmount+xml"),
("wg", "application/vnd.pmi.widget"),
("zmm", "application/vnd.handheld-entertainment+xml"),
("tfm", "application/x-tex-tfm"),
("iso", "application/x-iso9660-image"),
("ggt", "application/vnd.geogebra.tool"),
("fig", "application/x-xfig"),
("bh2", "application/vnd.fujitsu.oasysprs"),
("rip", "audio/vnd.rip"),
("nnw", "application/vnd.noblenet-web"),
("oa3", "application/vnd.fujitsu.oasys3"),
("p7m", "application/pkcs7-mime"),
("cryptonote", "application/vnd.rig.cryptonote"),
("lnk", "application/x-ms-shortcut"),
("gqs", "application/vnd.grafeq"),
("flv", "video/x-flv"),
("uvm", "video/vnd.dece.mobile"),
("xyz", "chemical/x-xyz"),
("qam", "application/vnd.epson.quickanime"),
("nsf", "application/vnd.lotus-notes"),
("oga", "audio/ogg"),
("flo", "application/vnd.micrografx.flo"),
("grv", "application/vnd.groove-injector"),
("sru", "application/sru+xml"),
("irm", "application/vnd.ibm.rights-management"),
("wpd", "application/vnd.wordperfect"),
("ppam", "application/vnd.ms-powerpoint.addin.macroenabled.12"),
("fh5", "image/x-freehand"),
("hh", "text/x-c"),
("3g2", "video/3gpp2"),
("kpt", "application/vnd.kde.kpresenter"),
("uvvv", "video/vnd.dece.video"),
("texinfo", "application/x-texinfo"),
("zir", "application/vnd.zul"),
("prf", "application/pics-rules"),
("msi", "application/x-msdownload"),
("uvs", "video/vnd.dece.sd"),
("aas", "application/x-authorware-seg"),
("qwt", "application/vnd.quark.quarkxpress"),
("kpr", "application/vnd.kde.kpresenter"),
("nitf", "application/vnd.nitf"),
("ice", "x-conference/x-cooltalk"),
("spl", "application/x-futuresplash"),
("xlw", "application/vnd.ms-excel"),
("ogg", "audio/ogg"),
("obd", "application/x-msbinder"),
("skt", "application/vnd.koan"),
("distz", "application/octet-stream"),
("msf", "application/vnd.epson.msf"),
("prc", "application/x-mobipocket-ebook"),
("bdm", "application/vnd.syncml.dm+wbxml"),
("mpn", "application/vnd.mophun.application"),
("fg5", "application/vnd.fujitsu.oasysgp"),
("edx", "application/vnd.novadigm.edx"),
("tao", "application/vnd.tao.intent-module-archive"),
("ico", "image/x-icon"),
("mdb", "application/x-msaccess"),
("com", "application/x-msdownload"),
("mov", "video/quicktime"),
("sgi", "image/sgi"),
("wbmp", "image/vnd.wap.wbmp"),
("fhc", "image/x-freehand"),
("sfd-hdstx", "application/vnd.hydrostatix.sof-data"),
("cil", "application/vnd.ms-artgalry"),
("xlt", "application/vnd.ms-excel"),
("mpg", "video/mpeg"),
("webp", "image/webp"),
("pfx", "application/x-pkcs12"),
("wks", "application/vnd.ms-works"),
("aw", "application/applixware"),
("hvs", "application/vnd.yamaha.hv-script"),
("xhtml", "application/xhtml+xml"),
("spc", "application/x-pkcs7-certificates"),
("elc", "application/octet-stream"),
("gex", "application/vnd.geometry-explorer"),
("conf", "text/plain"),
("cbr", "application/x-cbr"),
("ktz", "application/vnd.kahootz"),
("uri", "text/uri-list"),
("ac", "application/pkix-attr-cert"),
("ustar", "application/x-ustar"),
("cml", "chemical/x-cml"),
("ris", "application/x-research-info-systems"),
("ics", "text/calendar"),
("n3", "text/n3"),
("vcg", "application/vnd.groove-vcard"),
("dfac", "application/vnd.dreamfactory"),
("install", "application/x-install-instructions"),
("sdp", "application/sdp"),
("mkv", "video/x-matroska"),
("bz2", "application/x-bzip2"),
("uvx", "application/vnd.dece.unspecified"),
("sis", "application/vnd.symbian.install"),
("mcurl", "text/vnd.curl.mcurl"),
("xpw", "application/vnd.intercon.formnet"),
("hps", "application/vnd.hp-hps"),
("odi", "application/vnd.oasis.opendocument.image"),
("dmg", "application/x-apple-diskimage"),
("odm", "application/vnd.oasis.opendocument.text-master"),
("onepkg", "application/onenote"),
("xdp", "application/vnd.adobe.xdp+xml"),
("irp", "application/vnd.irepository.package+xml"),
("cba", "application/x-cbr"),
("ppsx", "application/vnd.openxmlformats-officedocument.presentationml.slideshow"),
("dtshd", "audio/vnd.dts.hd"),
("dis", "application/vnd.mobius.dis"),
("xltx", "application/vnd.openxmlformats-officedocument.spreadsheetml.template"),
("xsl", "application/xml"),
("rdf", "application/rdf+xml"),
("mime", "message/rfc822"),
("fsc", "application/vnd.fsc.weblaunch"),
("mscml", "application/mediaservercontrol+xml"),
("mjp2", "video/mj2"),
("mbox", "application/mbox"),
("pct", "image/x-pict"),
("wgt", "application/widget"),
("trm", "application/x-msterminal"),
("rld", "application/resource-lists-diff+xml"),
("roa", "application/rpki-roa"),
("oprc", "application/vnd.palm"),
("atom", "application/atom+xml"),
("sm", "application/vnd.stepmania.stepchart"),
("wmls", "text/vnd.wap.wmlscript"),
("silo", "model/mesh"),
("mpm", "application/vnd.blueice.multipass"),
("xer", "application/patch-ops-error+xml"),
("sdw", "application/vnd.stardivision.writer"),
("z8", "application/x-zmachine"),
("spot", "text/vnd.in3d.spot"),
("hpgl", "application/vnd.hp-hpgl"),
("bcpio", "application/x-bcpio"),
("arc", "application/x-freearc"),
("teicorpus", "application/tei+xml"),
("nfo", "text/x-nfo"),
("rs", "application/rls-services+xml"),
("pcf", "application/x-font-pcf"),
("xpm", "image/x-xpixmap"),
("sse", "application/vnd.kodak-descriptor"),
("gtar", "application/x-gtar"),
("ifb", "text/calendar"),
("kar", "audio/midi"),
("c4g", "application/vnd.clonk.c4group"),
("x3dvz", "model/x3d+vrml"),
("kfo", "application/vnd.kde.kformula"),
("m2a", "audio/mpeg"),
("res", "application/x-dtbresource+xml"),
("wmx", "video/x-ms-wmx"),
("apr", "application/vnd.lotus-approach"),
("movie", "video/x-sgi-movie"),
("au", "audio/basic"),
("ims", "application/vnd.ms-ims"),
("s3m", "audio/s3m"),
("igs", "model/iges"),
("mp21", "application/mp21"),
("ra", "audio/x-pn-realaudio"),
("afp", "application/vnd.ibm.modcap"),
("djvu", "image/vnd.djvu"),
("pls", "application/pls+xml"),
("meta4", "application/metalink4+xml"),
("flw", "application/vnd.kde.kivio"),
("cdmid", "application/cdmi-domain"),
("book", "application/vnd.framemaker"),
("gramps", "application/x-gramps-xml"),
("oth", "application/vnd.oasis.opendocument.text-web"),
("dvb", "video/vnd.dvb.file"),
("fbs", "image/vnd.fastbidsheet"),
("emf", "application/x-msmetafile"),
("pgp", "application/pgp-encrypted"),
("list3820", "application/vnd.ibm.modcap"),
("pcl", "application/vnd.hp-pcl"),
("wmlc", "application/vnd.wap.wmlc"),
("svd", "application/vnd.svd"),
("blorb", "application/x-blorb"),
("oda", "application/oda"),
("lha", "application/x-lzh-compressed"),
("zip", "application/zip"),
("npx", "image/vnd.net-fpx"),
("xpr", "application/vnd.is-xpr"),
("ppsm", "application/vnd.ms-powerpoint.slideshow.macroenabled.12"),
("mft", "application/rpki-manifest"),
("inkml", "application/inkml+xml"),
("csh", "application/x-csh"),
("btif", "image/prs.btif"),
("gdl", "model/vnd.gdl"),
("ftc", "application/vnd.fluxtime.clip"),
("mts", "model/vnd.mts"),
("ggb", "application/vnd.geogebra.file"),
("def", "text/plain"),
("ttf", "application/x-font-ttf"),
("hpid", "application/vnd.hp-hpid"),
("nbp", "application/vnd.wolfram.player"),
("fm", "application/vnd.framemaker"),
("otc", "application/vnd.oasis.opendocument.chart-template"),
("xfdf", "application/vnd.adobe.xfdf"),
("mgp", "application/vnd.osgeo.mapguide.package"),
("vss", "application/vnd.visio"),
("xpx", "application/vnd.intercon.formnet"),
("lzh", "application/x-lzh-compressed"),
("fli", "video/x-fli"),
("twd", "application/vnd.simtech-mindmapper"),
("fly", "text/vnd.fly"),
("h", "text/x-c"),
("mpga", "audio/mpeg"),
("tiff", "image/tiff"),
("esf", "application/vnd.epson.esf"),
("cdmiq", "application/cdmi-queue"),
("vsf", "application/vnd.vsf"),
("link66", "application/vnd.route66.link66+xml"),
("rpst", "application/vnd.nokia.radio-preset"),
("txt", "text/plain"),
("w3d", "application/x-director"),
("ufdl", "application/vnd.ufdl"),
("pwn", "application/vnd.3m.post-it-notes"),
("svc", "application/vnd.dvb.service"),
("pcx", "image/x-pcx"),
("sbml", "application/sbml+xml"),
("ser", "application/java-serialized-object"),
("skp", "application/vnd.koan"),
("mpg4", "video/mp4"),
("metalink", "application/metalink+xml"),
("ncx", "application/x-dtbncx+xml"),
("avi", "video/x-msvideo"),
("pps", "application/vnd.ms-powerpoint"),
("java", "text/x-java-source"),
("midi", "audio/midi"),
("dump", "application/octet-stream"),
("hdf", "application/x-hdf"),
("xlf", "application/x-xliff+xml"),
("see", "application/vnd.seemail"),
("ghf", "application/vnd.groove-help"),
("hqx", "application/mac-binhex40"),
("lbe", "application/vnd.llamagraphics.life-balance.exchange+xml"),
("sfv", "text/x-sfv"),
("portpkg", "application/vnd.macports.portpkg"),
("ez3", "application/vnd.ezpix-package"),
("vrml", "model/vrml"),
("m3u", "audio/x-mpegurl"),
("viv", "video/vnd.vivo"),
("jar", "application/java-archive"),
("latex", "application/x-latex"),
("xdm", "application/vnd.syncml.dm+xml"),
("caf", "audio/x-caf"),
("m1v", "video/mpeg"),
("z2", "application/x-zmachine"),
("jpe", "image/jpeg"),
("rpss", "application/vnd.nokia.radio-presets"),
("snd", "audio/basic"),
("xhvml", "application/xv+xml"),
("sgl", "application/vnd.stardivision.writer-global"),
("gam", "application/x-tads"),
("lbd", "application/vnd.llamagraphics.life-balance.desktop"),
("wbs", "application/vnd.criticaltools.wbs+xml"),
("cgm", "image/cgm"),
("uvt", "application/vnd.dece.ttml+xml"),
("setreg", "application/set-registration-initiation"),
("sldx", "application/vnd.openxmlformats-officedocument.presentationml.slide"),
("grxml", "application/srgs+xml"),
("azw", "application/vnd.amazon.ebook"),
("xla", "application/vnd.ms-excel"),
("xsm", "application/vnd.syncml+xml"),
("sema", "application/vnd.sema"),
("iota", "application/vnd.astraea-software.iota"),
("sda", "application/vnd.stardivision.draw"),
("asm", "text/x-asm"),
("kml", "application/vnd.google-earth.kml+xml"),
("pdb", "application/vnd.palm"),
("xo", "application/vnd.olpc-sugar"),
("curl", "text/vnd.curl")
])
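An extension-to-MIME table like the one above is typically wrapped in a `Dict` and consulted by file extension, with a generic fallback for unknown extensions. A minimal sketch of that lookup pattern — the `mimetypes` and `mimetype` names and the fallback value are illustrative assumptions, not Mux's exact internals:

```julia
# Illustrative sketch: consult an extension -> MIME mapping with a fallback.
mimetypes = Dict("html" => "text/html",
                 "css"  => "text/css",
                 "js"   => "application/javascript")

function mimetype(filename, fallback = "application/octet-stream")
    ext = lowercase(splitext(filename)[2])   # e.g. ".css", or "" if no extension
    get(mimetypes, isempty(ext) ? "" : ext[2:end], fallback)
end

mimetype("style.css")    # "text/css"
mimetype("archive.rnd")  # "application/octet-stream" (fallback)
```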
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | code | 6806 |

using Mux
using Test
using HTTP, MbedTLS
import HTTP: StatusError, WebSockets
println("Mux")
@testset "Mux" begin
println("misc")
@testset "misc" begin
function f()
@app foo = (Mux.defaults)
end
@test f() === nothing
@test Mux.notfound()(Dict())[:status] == 404
end
println("basic server")
@testset "basic server" begin
d1 = Dict("one"=> "1", "two"=> "2")
d2 = Dict("one"=> "1", "two"=> "")
@app test = (
Mux.defaults,
page("/",respond("<h1>Hello World!</h1>")),
page("/about", respond("<h1>Boo!</h1>")),
page("/user/:user", req -> "<h1>Hello, $(req[:params][:user])!</h1>"),
query(d1, respond("<h1>query1</h1>")),
query(d2, respond("<h1>query2</h1>")),
Mux.notfound())
serve(test)
println("page")
@testset "page" begin
@test String(HTTP.get("http://localhost:8000").body) ==
"<h1>Hello World!</h1>"
@test String(HTTP.get("http://localhost:8000/about").body) ==
"<h1>Boo!</h1>"
@test String(HTTP.get("http://localhost:8000/user/julia").body) ==
"<h1>Hello, julia!</h1>"
end
println("query")
@testset "query" begin
@test String(HTTP.get("http://localhost:8000/dum?one=1&two=2").body) ==
"<h1>query1</h1>"
@test_throws StatusError String(HTTP.get("http://localhost:8000/dum?one=1").body)
@test_throws StatusError String(HTTP.get("http://localhost:8000/dum?one=1&two=2&sarv=boo").body)
@test_throws StatusError String(HTTP.get("http://localhost:8000/dum?one=1").body)
@test String(HTTP.get("http://localhost:8000/dum?one=1&two=56").body) ==
"<h1>query2</h1>"
@test String(HTTP.get("http://localhost:8000/dum?one=1&two=hfjd").body) ==
"<h1>query2</h1>"
@test_throws StatusError String(HTTP.get("http://localhost:8000/dum?one=1").body)
@test_throws StatusError String(HTTP.get("http://localhost:8000/dum?one=1&two=2&sarv=boo").body)
end
end
println("MIME types")
@testset "MIME types" begin
# Issue #68
@test Mux.fileheaders("foo.css")["Content-Type"] == "text/css"
@test Mux.fileheaders("foo.html")["Content-Type"] == "text/html"
@test Mux.fileheaders("foo.js")["Content-Type"] == "application/javascript"
end
# Check that prod_defaults don't completely break things
# And check prod_defaults error handler
println("prod defaults")
@testset "prod defaults" begin
throwapp() = (_...) -> error("An error!")
# Used for wrapping stderrcatch so we can check its output and stop it spewing
# all over the test results.
path, mock_stderr = mktemp()
@app test = (
Mux.prod_defaults,
(app, req) -> redirect_stderr(() -> Mux.stderrcatch(app, req), mock_stderr),
page("/",respond("<h1>Hello World!</h1>")),
page("/about", respond("<h1>Boo!</h1>")),
page("/user/:user", req -> "<h1>Hello, $(req[:params][:user])!</h1>"),
throwapp(),
Mux.notfound())
serve(test, 8001)
@test String(HTTP.get("http://localhost:8001").body) ==
"<h1>Hello World!</h1>"
@test String(HTTP.get("http://localhost:8001/about").body) ==
"<h1>Boo!</h1>"
@test String(HTTP.get("http://localhost:8001/user/julia").body) ==
"<h1>Hello, julia!</h1>"
@test String(HTTP.get("http://localhost:8001/badurl";
status_exception=false).body) ==
"Internal server error"
# Check our error was logged and close fake stderr
seekstart(mock_stderr)
@test occursin("An error!", read(mock_stderr, String))
close(mock_stderr)
rm(path)
end
# Test page and route are callable without a string argument
# (previously the first two raised StackOverflowError)
println("bare page()")
@testset "bare page()" begin
@test page(identity, identity) isa Function
@test route(identity, identity) isa Function
@test page(identity) isa Function
@test route(identity) isa Function
# Test you can pass the string last if you really want.
@test page(identity, "") isa Function
@test route(identity, "") isa Function
end
println("WebSockets")
@testset "WebSockets" begin
@app h = (
Mux.defaults,
page("/", respond("<h1>Hello World!</h1>")),
Mux.notfound());
@app w = (
Mux.wdefaults,
route("/ws_io", Mux.echo),
Mux.wclose,
Mux.notfound());
serve(h, w, 2333)
@test String(HTTP.get("http://localhost:2333/").body) ==
"<h1>Hello World!</h1>"
WebSockets.open("ws://localhost:2333/ws_io") do ws_client
message = "Hello WebSocket!"
WebSockets.send(ws_client, message)
str = WebSockets.receive(ws_client)
@test str == message
end
end
println("SSL/TLS")
@testset "SSL/TLS" begin
# Test that we can serve HTTP and websocket responses over TLS/SSL
@app h = (
Mux.defaults,
page("/", respond("<h1>Hello World!</h1>")),
Mux.notfound());
@app w = (
Mux.wdefaults,
route("/ws_io", Mux.echo),
Mux.wclose,
Mux.notfound());
cert = abspath(joinpath(dirname(pathof(Mux)), "../test", "test.cert"))
key = abspath(joinpath(dirname(pathof(Mux)), "../test", "test.key"))
serve(h, w, 2444; sslconfig=MbedTLS.SSLConfig(cert, key))
# require_ssl_verification means that the certificates won't be validated
# (checked against the certificate authority lists), but we will make proper
# TLS/SSL connections, so the tests are still useful.
http_response = HTTP.get("https://localhost:2444/"; require_ssl_verification=false)
@test String(http_response.body) == "<h1>Hello World!</h1>"
WebSockets.open("wss://localhost:2444/ws_io"; require_ssl_verification=false) do ws_client
message = "Hello WebSocket!"
WebSockets.send(ws_client, message)
str = WebSockets.receive(ws_client)
@test str == message
end
end
@testset "rename_mux_closures" begin
s1 = """
[4] prettystderrcatch(app::Mux.var"#1#2"{typeof(test), Mux.var"#1#2"{Mux.var"#5#6"{Mux.var"#33#34"{Vector{Any}}, Mux.var"#23#24"{String}}, Mux.var"#1#2"{Mux.var"#5#6"{Mux.var"#33#34"{Vector{SubString{String}}}, Mux.var"#1#2"{Mux.var"#5#6"{Mux.var"#37#38"{Float64}, Mux.var"#23#24"{String}}, Mux.var"#23#24"{String}}}, Mux.var"#1#2"{Mux.var"#5#6"{Mux.var"#33#34"{Vector{SubString{String}}}, var"#5#7"}, Mux.var"#1#2"{Mux.var"#21#22"{Mux.var"#25#26"{Symbol, Int64}}, Mux.var"#23#24"{String}}}}}}, req::Dict{Any, Any})"""
e1 = "[4] prettystderrcatch(app::Mux.Closure, req::Dict{Any, Any})"
@test Mux.rename_mux_closures(s1) == e1
s2 = """[21] (::Mux.var"#7#8"{Mux.App})(req::HTTP.Messages.Request)"""
e2 = """[21] (::Mux.Closure)(req::HTTP.Messages.Request)"""
@test Mux.rename_mux_closures(s2) == e2
# Replace multiple closures at once
@test Mux.rename_mux_closures(s1 * '\n' * s2) == e1 * '\n' * e2
# Leave malformed input alone
s3 = """gabble (::Mux.var"#7#8{ gabble"""
@test Mux.rename_mux_closures(s3) == s3
end
end
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"MIT"
] | 1.0.2 | 7295d849103ac4fcbe3b2e439f229c5cc77b9b69 | docs | 8327 |

# Mux.jl
[](https://travis-ci.com/JuliaWeb/Mux.jl)
[](https://ci.appveyor.com/project/shashi/mux-jl/branch/master)
[](https://codecov.io/github/JuliaWeb/Mux.jl?branch=master)
```jl
Pkg.add("Mux")
```
Mux.jl gives your Julia web services some closure. Mux allows you to
define servers in terms of highly modular and composable components
called middleware, with the aim of making both simple and complex
servers as simple as possible to throw together.
For example:
```jl
using Mux
@app test = (
Mux.defaults,
page(respond("<h1>Hello World!</h1>")),
page("/about",
probability(0.1, respond("<h1>Boo!</h1>")),
respond("<h1>About Me</h1>")),
page("/user/:user", req -> "<h1>Hello, $(req[:params][:user])!</h1>"),
Mux.notfound())
serve(test)
```
You can run this demo by entering the successive forms into the Julia
REPL. The code displays a "hello, world" at `localhost:8000`, with an
about page at `/about` and another hello at `/user/[your name]`.
The `@app` macro allows the server to be redefined on the fly, and you
can test this by editing the `hello` text and re-evaluating. (don't
re-evalute `serve(test)`)
Note that `serve(test)` launches an asynchronous `Task` that will not prevent Julia from terminating.
This is good at the REPL, but not always what you want.
If you want Julia to wait for the task to finish, use `wait(serve(test))`.
## Technical Overview
Mux.jl is at heart a control flow library, with a [very small core](https://github.com/one-more-minute/Mux.jl/blob/master/src/Mux.jl#L7-L16). It's not important to understand that code exactly as long as you understand what it achieves.
There are three concepts core to Mux.jl: Middleware (which should be familiar
from the web libraries of other languages), stacking, and branching.
### Apps and Middleware
An *app* or *endpoint* is simply a function of a request which produces
a response:
```jl
function myapp(req)
return "<h1>Hello, $(req[:params][:user])!</h1>"
end
```
In principle this should say "hi" to our lovely user. But we have a
problem – where does the user's name come from? Ideally, our app
function doesn't need to know – it's simply handled at some point up the
chain (just the same as we don't parse the raw HTTP data, for example).
One way to solve this is via *middleware*. Say we get `:user` from a cookie:
```jl
function username(app, req)
req[:params][:user] = req[:cookies][:user]
return app(req) # We could also alter the response, but don't want to here
end
```
Middleware simply takes our request and modifies it appropriately, so
that data needed later on is available. This example is pretty trivial,
but we could equally have middleware which handles authentication and
encryption, processes cookies or file uploads, provides default headers,
and more.
We can then call our new version of the app like this:
```jl
username(myapp, req)
```
In fact, we can generate a whole new version of the app which has username
support built in:
```jl
function app2(req)
return username(myapp, req)
end
```
But if we have a lot of middleware, we're going to end up with a lot of `appX` functions.
For that reason we can use the `mux` function instead, which creates the new app for us:
```jl
mux(username, myapp)
```
This returns a *new* function which is equivalent to `app2` above. We
just didn't have to write it by hand.
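Conceptually, `mux` is doing nothing more than function composition over middleware. Here is a minimal sketch of the idea — an illustration only, not Mux's actual implementation (the real core is linked above), with made-up names `wrap` and `mymux`:

```jl
# Hypothetical: wrap an app in a single middleware, producing a new app.
wrap(midware, app) = req -> midware(app, req)

# Hypothetical: fold a chain of middleware around the innermost app.
mymux(layers...) = foldr(wrap, layers)
```

With these definitions, `mymux(username, myapp)` behaves like `app2` above.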
### Stacking
Now suppose you have lots of middleware – one to parse the HTTP request into
a dict of properties, one to check user authentication, one to catch errors,
etc. `mux` handles this too – just pass it multiple arguments:
```jl
mux(todict, auth, catch_errors, app)
```
Again, `mux` returns a whole new app (a `request -> response` function)
for us, this time wrapped with the three middlewares we provided.
`todict` will be the first to make changes to the incoming request, and
the last to alter the outgoing response.
Another neat thing we can do is to compose middleware into more middleware:
```jl
mymidware = stack(todict, auth, catch_errors)
mux(mymidware, app)
```
This is effectively equivalent to the `mux` call above, but creating a
new middleware function from independent parts means we're able to
factor out our service to make things more readable. For example, Mux
provides the `Mux.defaults` middleware, which is actually just a stack of
useful tools.
`stack` is self-flattening, i.e.
```jl
stack(a, b, c, d) == stack(a, stack(b, c), d) == stack(stack(a, b, c), d)
```
etc.
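One way to see why `stack` flattens is that it reduces pairwise, and pairwise composition of middleware is associative, so the grouping doesn't matter. A rough sketch of that idea (illustrative only — `stack2` and `mystack` are made-up names, not Mux's actual code):

```jl
# Hypothetical: compose two middlewares into a single middleware.
stack2(m1, m2) = (app, req) -> m1(req2 -> m2(app, req2), req)

# Hypothetical: reducing pairwise gives the same result for any grouping.
mystack(ms...) = reduce(stack2, ms)
```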
### Branching
Mux.jl goes further with middleware, and expresses routing and decisions
as middleware themselves. The key to this is the `branch` function,
which takes
1. a predicate to apply to the incoming request, and
2. an endpoint to run on the request if the predicate returns true.
For example:
```jl
mux(branch(_ -> rand() < 0.1, respond("Hello")),
respond("Hi"))
```
In this example, the predicate ignores the request and returns true 10% of the
time, so the app responds with "Hello" 10% of the time and "Hi" otherwise.
You can test this in the REPL by calling
```jl
mux(branch(_ -> rand() < 0.1, respond("Hello")),
respond("Hi"))(nothing)
```
(since the request is ignored anyway, it doesn't matter if we set it to `nothing`).
We can also define a function to wrap the branch
```jl
probability(x, app) = branch(_ -> rand() < x, app)
```
### Utilities
Although endpoints and middleware are central to Mux, it's common not to
write them by hand. For example, `respond("hi")`
creates a function `_ -> "hi"` which can be used as an endpoint.
Equally, functions like `status(404)` will create middleware which
applies the given status to the response. Mux.jl's "not found" endpoint
is therefore defined as
```jl
notfound(s = "Not found") = mux(status(404), respond(s))
```
which is a much more declarative approach.
For example:
* `respond(x)` – creates an endpoint that responds with `x`, regardless of the request.
* `route("/path/here", app)` – branches to `app` if the request location matches `"/path/here"`.
* `page("/path/here", app)` – branches to `app` if the request location *exactly* matches `"/path/here"`.
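Putting these utilities together, a small app might look like the following sketch (handler contents are made up for illustration):

```jl
@app demo = (
    Mux.defaults,
    page("/", respond("<h1>Home</h1>")),
    page("/about", respond("<h1>About</h1>")),
    page("/user/:user", req -> "<h1>Hello, $(req[:params][:user])!</h1>"),
    Mux.notfound())
```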
## Serving static files from a package
Please use [AssetRegistry.jl](https://github.com/JuliaGizmos/AssetRegistry.jl) to
register an assets directory.
**DEPRECATED**: The `Mux.pkgfiles` middleware (included in `Mux.defaults`) serves static
files under the `assets` directory in any Julia package at `/pkg/<PACKAGE>/`.
## Integrate with WebSocket
You can easily integrate a general HTTP server and a WebSocket server with Mux.
To do so, define two apps, one for regular HTTP requests, and another that will handle WebSocket connections.
Here is a complete example:
```julia
using Mux
# HTTP Server
@app h = (
Mux.defaults,
page("/", respond("<h1>Hello World!</h1>")),
Mux.notfound());
function websocket_example(x)
sock = x[:socket]
for str in sock
println("client -> server: ", str)
send(sock, "I'm hard of hearing, did you say '$str'?")
end
end
# WebSocket server
@app w = (
Mux.wdefaults,
route("/ws_io", websocket_example),
Mux.wclose,
Mux.notfound());
# Serve both servers on the same port.
serve(h, w, 2333)
```
And finally, run a client, optionally in another process:
```julia
using Mux.WebSockets
WebSockets.open("ws://localhost:2333/ws_io") do ws
send(ws, "Hello World!")
data = receive(ws)
println(stderr, "server -> client: ", data)
end;
```
Now, if you run both programs, you'll see the `Hello World!` message on both
sides, as the server echoes it back to the client inside its reply.
## Using Mux in Production
While Mux should be perfectly usable in a production environment, it is not
recommended to use the `Mux.defaults` stack for a production application. The
`basiccatch` middleware it includes will dump potentially sensitive stack traces
to the client on error, which is probably not what you want to serve to your
users! An alternative `Mux.prod_defaults` stack is available for production
applications, which is just `Mux.defaults` with a `stderrcatch`
middleware instead (which dumps errors to stderr).
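For example, switching the hello-world app above to the production stack is a one-word change (sketch, assuming the `@app`/`serve` usage shown earlier):

```jl
@app prod = (
    Mux.prod_defaults,
    page("/", respond("<h1>Hello World!</h1>")),
    Mux.notfound())

serve(prod, 8000)
```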
| Mux | https://github.com/JuliaWeb/Mux.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 779 | using Documenter, FlexPlan
makedocs(
modules = [FlexPlan],
sitename = "FlexPlan",
warnonly = :missing_docs,
pages = [
"Home" => "index.md"
"Manual" => [
"Installation" => "installation.md"
"Examples" => "examples.md"
"Tutorial" => "tutorial.md"
]
"Library" => [
"Problem types" => "problem_types.md"
"Network formulations" => "network_formulations.md"
"Multiperiod, multistage modelling" => [
"Modelling assumptions" => "modeling_assumptions.md"
"Model dimensions" => "dimensions.md"
]
"Data model" => "data_model.md"
]
]
)
deploydocs(
repo = "github.com/Electa-Git/FlexPlan.jl.git"
)
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 2419 |
# Test of the transmission and distribution combined model
## Import packages
using Memento
_LOGGER = Logger(first(splitext(basename(@__FILE__)))) # A logger for this script, also used by included files.
import PowerModels as _PM
import PowerModelsACDC as _PMACDC
import FlexPlan as _FP
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
include(joinpath(_FP_dir,"test/io/load_case.jl")) # Include sample data from FlexPlan repository; you can of course also use your own data
## Set up solver
import HiGHS
optimizer = _FP.optimizer_with_attributes(HiGHS.Optimizer, "output_flag"=>false)
#import CPLEX
#optimizer = _FP.optimizer_with_attributes(CPLEX.Optimizer, "CPXPARAM_ScreenOutput"=>0)
direct_model = false # Whether to construct JuMP models using JuMP.direct_model() instead of JuMP.Model(). direct_model is only supported by some solvers.
## Set script parameters
number_of_hours = 4
number_of_scenarios = 2
number_of_years = 1
t_model_type = _PM.DCPPowerModel
d_model_type = _FP.BFARadPowerModel
t_setting = Dict("conv_losses_mp" => false)
d_setting = Dict{String,Any}()
## Set up logging
setlevel!.(Memento.getpath(getlogger(_FP)), "debug") # FlexPlan logger verbosity level. Useful values: "info", "debug", "trace"
time_start = time()
## Load data
# JSON files containing either transmission or distribution networks can be loaded with
# `data = _FP.convert_JSON(file_path)`; those containing both transmission and distribution
# networks can be loaded with `t_data, d_data = _FP.convert_JSON_td(file_path)`.
# Transmission network data
t_data = load_case6(; number_of_hours, number_of_scenarios, number_of_years)
# Distribution network 1 data
d_data_sub_1 = load_ieee_33(; number_of_hours, number_of_scenarios, number_of_years)
d_data_sub_1["t_bus"] = 3 # States that this distribution network is attached to bus 3 of transmission network
# Distribution network 2 data
d_data_sub_2 = deepcopy(d_data_sub_1)
d_data_sub_2["t_bus"] = 6
d_data = [d_data_sub_1, d_data_sub_2]
## Solve problem
result = _FP.simple_stoch_flex_tnep(t_data, d_data, t_model_type, d_model_type, optimizer; t_setting, d_setting, direct_model)
@assert result["termination_status"] ∈ (_FP.OPTIMAL, _FP.LOCALLY_SOLVED) "$(result["optimizer"]) termination status: $(result["termination_status"])"
notice(_LOGGER, "Script completed in $(round(time()-time_start;sigdigits=3)) seconds.")
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 3553 |
# Example script to run multi-period optimization of demand flexibility, AC & DC lines and storage investments for the Italian case
## Import relevant packages
import PowerModels as _PM
import PowerModelsACDC as _PMACDC
import FlexPlan as _FP
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
include(joinpath(_FP_dir,"test/io/create_profile.jl")) # Include sample data from FlexPlan repository; you can of course also use your own data
# Add solver packages
# > Note: solver packages are needed to handle communication between the solver and JuMP;
# > the commercial ones do not include the solver itself.
import HiGHS
optimizer = _FP.optimizer_with_attributes(HiGHS.Optimizer, "output_flag"=>false)
#import CPLEX
#optimizer = _FP.optimizer_with_attributes(CPLEX.Optimizer, "CPXPARAM_ScreenOutput"=>0)
## Input parameters
number_of_hours = 24 # Number of time points
planning_horizon = 10 # Years to scale generation costs
file = joinpath(_FP_dir,"test/data/case6/case6_2030.m") # Input case, in Matpower m-file format: here 6-bus case with candidate AC, DC lines and candidate storage
scenario_properties = Dict(
1 => Dict{String,Any}("probability"=>0.5, "start"=>1514764800000), # 1514764800000 is 2018-01-01T00:00, needed by `create_profile_data_italy!` when `mc=false`
2 => Dict{String,Any}("probability"=>0.5, "start"=>1546300800000), # 1546300800000 is 2019-01-01T00:00, needed by `create_profile_data_italy!` when `mc=false`
)
scenario_metadata = Dict{String,Any}("mc"=>false) # Needed by `create_profile_data_italy!`
out_dir = mkpath("output")
## Load test case
data = _FP.parse_file(file) # Parse input file to obtain data dictionary
_FP.add_dimension!(data, :hour, number_of_hours) # Add dimension, e.g. hours
_FP.add_dimension!(data, :scenario, scenario_properties; metadata=scenario_metadata) # Add dimension, e.g. scenarios
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>planning_horizon)) # Add_dimension, e.g. years
_FP.scale_data!(data) # Scale investment & operational cost data based on planning years & hours
data, loadprofile, genprofile = create_profile_data_italy!(data) # Load time series data based demand and RES profiles of the six market zones in Italy from the data folder
time_series = create_profile_data(number_of_hours*_FP.dim_length(data,:scenario), data, loadprofile, genprofile) # Create time series data to be passed to the data dictionay
mn_data = _FP.make_multinetwork(data, time_series) # Create the multinetwork data dictionary
## Plot all candidates pre-optimization
plot_settings = Dict("add_nodes" => true, "plot_result_only" => false)
plot_filename = joinpath(out_dir,"candidates_italy.kml")
_FP.plot_geo_data(mn_data, plot_filename, plot_settings)
## Solve the planning problem
# PowerModels(ACDC) and FlexPlan settings
s = Dict("conv_losses_mp" => false, "add_co2_cost" => false)
# Build optimisation model, solve it and write solution dictionary:
# This is the "problem file" which needs to be constructed individually depending on application
# In this case: multi-period optimisation of demand flexibility, AC & DC lines and storage investments
println("Solving planning problem...")
result = _FP.stoch_flex_tnep(mn_data, _PM.DCPPowerModel, optimizer; setting = s)
## Plot final topology
plot_settings = Dict("add_nodes" => true, "plot_solution_only" => true)
plot_filename = joinpath(out_dir,"stoch.kml")
_FP.plot_geo_data(mn_data, plot_filename, plot_settings; solution = result)
println("Test completed")
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 3001 |
module FlexPlan
## Imports
import Memento
import JuMP
import InfrastructureModels as _IM
import PowerModels as _PM
import PowerModelsACDC as _PMACDC
## Memento settings
# Create our module level logger (this will get precompiled)
const _LOGGER = Memento.getlogger(@__MODULE__)
# Register the module level logger at runtime so that folks can access the logger via `getlogger(FlexPlan)`
# NOTE: If this line is not included then the precompiled `FlexPlan._LOGGER` won't be registered at runtime.
__init__() = Memento.register(_LOGGER)
## Includes
include("prob/storage_tnep.jl")
include("prob/flexible_tnep.jl")
include("prob/stochastic_flexible_tnep.jl")
include("prob/simple_stochastic_flexible_tnep.jl")
include("io/parse.jl")
include("io/scale.jl")
include("io/time_series.jl")
include("io/multinetwork.jl")
include("io/plot_geo_data.jl")
include("core/types.jl")
include("core/dimensions.jl")
include("core/variable.jl")
include("core/variableconv.jl")
include("core/variabledcgrid.jl")
include("core/gen.jl")
include("core/flexible_demand.jl")
include("core/storage.jl")
include("core/objective.jl")
include("core/ref_extension.jl")
include("core/constraint_template.jl")
include("core/constraint.jl")
include("core/line_replacement.jl")
include("core/distribution.jl")
include("core/td_coupling.jl")
include("core/solution.jl")
include("form/bf.jl")
include("form/bfarad.jl")
include("formconv/dcp.jl")
## Submodules
include("json_converter/json_converter.jl")
using .JSONConverter
include("benders/benders.jl")
using .Benders
include("td_decoupling/td_decoupling.jl")
using .TDDecoupling
## Exports
# FlexPlan exports everything except internal symbols, which are defined as those whose name
# starts with an underscore. If you don't want all of these symbols in your environment,
# then use `import FlexPlan` instead of `using FlexPlan`.
# Do not add FlexPlan-defined symbols to this exclude list. Instead, rename them with an
# underscore.
const _EXCLUDE_SYMBOLS = [Symbol(@__MODULE__), :eval, :include]
for sym in names(@__MODULE__, all=true)
sym_string = string(sym)
if sym in _EXCLUDE_SYMBOLS || startswith(sym_string, "_") || startswith(sym_string, "@_")
continue
end
if !(Base.isidentifier(sym) || (startswith(sym_string, "@") &&
Base.isidentifier(sym_string[2:end])))
continue
end
#println("$(sym)")
@eval export $sym
end
# The following items are also exported for user-friendlyness when calling `using FlexPlan`,
# so that users do not need to import JuMP to use a solver with FlexPlan.
import JuMP: optimizer_with_attributes
export optimizer_with_attributes
import JuMP: TerminationStatusCode
export TerminationStatusCode
import JuMP: ResultStatusCode
export ResultStatusCode
for status_code_enum in [TerminationStatusCode, ResultStatusCode]
for status_code in instances(status_code_enum)
@eval import JuMP: $(Symbol(status_code))
@eval export $(Symbol(status_code))
end
end
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 238 |
module Benders
export run_benders_decomposition
using ..FlexPlan
const _FP = FlexPlan
import ..FlexPlan: _IM, _PM, _LOGGER
import JuMP
import Memento
using Printf
include("common.jl")
include("classical.jl")
include("modern.jl")
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 7399 |
# Classical implementation of Benders decomposition
"""
Classical <: BendersAlgorithm
Parameters for classical implementation of Benders decomposition:
- `obj_rtol`: stop when `(ub-lb)/abs(ub) < obj_rtol`, where `ub` and `lb` are the upper and
lower bounds of the optimal solution value. Default: `sqrt(eps())`.
- `max_iter`: maximum number of iterations before stopping. Default: 1000.
- `tightening_rtol`: add an optimality cut only if
`(sp_obj-sp_obj_lb)/abs(sp_obj) > tightening_rtol`, where `sp_obj` is the objective
function value of the secondary problem and `sp_obj_lb` is the value of the
corresponding surrogate function. Default: `sqrt(eps())`.
- `sp_obj_lb_min`: constant term of the initial optimality cut, which prevents the main
problem from being unbounded at the beginning. Default: `-1e12`.
- `silent`: require the solvers to produce no output; take precedence over any other
attribute controlling verbosity. Default: `true`.
"""
struct Classical <: BendersAlgorithm
@benders_fields
obj_rtol::Float64
end
function Classical(;
obj_rtol = sqrt(eps()),
max_iter = 1000,
tightening_rtol = sqrt(eps()),
sp_obj_lb_min = -1e12,
silent = true
)
Classical(
max_iter,
tightening_rtol,
sp_obj_lb_min,
silent,
obj_rtol
)
end
"""
run_benders_decomposition(algo::Classical, <arguments>, <keyword arguments>)
Run the classical implementation of Benders decomposition, where the main problem is solved once per iteration.
"""
function run_benders_decomposition(
algo::Classical,
data::Dict{String,<:Any},
model_type::Type,
main_opt::Union{JuMP.MOI.AbstractOptimizer, JuMP.MOI.OptimizerWithAttributes},
sec_opt::Union{JuMP.MOI.AbstractOptimizer, JuMP.MOI.OptimizerWithAttributes},
main_bm::Function,
sec_bm::Function;
ref_extensions::Vector{<:Function} = Function[],
solution_processors::Vector{<:Function} = Function[],
kwargs...
)
time_procedure_start = time()
Memento.debug(_LOGGER, "Classical Benders decomposition started. Available threads: $(Threads.nthreads()).")
pm_main, pm_sec, num_sp, sp_obj_lb_var = instantiate_model(algo, data, model_type, main_opt, sec_opt, main_bm, sec_bm; ref_extensions, kwargs...)
ub = Inf
lb = -Inf
iter = 0
current_best = true
best_main_var_values = nothing
stat = Dict{Int,Any}()
time_build = time() - time_procedure_start
while true
time_iteration_start = time()
iter += 1
time_main_start = time()
JuMP.optimize!(pm_main.model)
check_solution_main(pm_main)
main_var_values = get_var_values(pm_main)
time_main = time() - time_main_start
time_sec_start = time()
fix_and_optimize_secondary!(pm_sec, main_var_values)
time_sec = time() - time_sec_start
if iter == 1
if !JuMP.has_duals(first(pm_sec).model) # If this check passes here, no need to check again in subsequent iterations.
Memento.error(_LOGGER, "Solver $(JuMP.solver_name(first(pm_sec).model)) is unable to provide dual values.")
end
Memento.info(_LOGGER, "┏━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓")
Memento.info(_LOGGER, "┃ iter. cuts │ inv. cost oper. cost solution │ UB LB rel. gap ┃")
Memento.info(_LOGGER, "┠─────────────┼────────────────────────────────────┼────────────────────────────────────┨")
end
inv_cost, op_cost, sol_value, rel_tightening, lb = calc_iter_result(algo, pm_main, pm_sec, sp_obj_lb_var)
if sol_value < ub
ub = sol_value
current_best = true
best_main_var_values = main_var_values
else
current_best = false
end
rel_gap = (ub-lb)/abs(ub)
stop = rel_gap <= algo.obj_rtol || iter == algo.max_iter
if stop
iter_cuts = 0
else
iter_cuts = add_optimality_cuts!(pm_main, pm_sec, algo, num_sp, sp_obj_lb_var, main_var_values, rel_tightening)
end
time_iteration = time() - time_iteration_start # Time spent after this line is not measured
record_statistics!(stat, algo, iter, iter_cuts, inv_cost, op_cost, sol_value, ub, lb, rel_gap, current_best, main_var_values, pm_main, pm_sec, time_main, time_sec, time_iteration)
if stop
if rel_gap <= algo.obj_rtol
Memento.info(_LOGGER, "┠─────────────┴────────────────────────────────────┴────────────────────────────────────┨")
Memento.info(_LOGGER, "┃ Stopping: optimal within tolerance ▴ ┃")
Memento.info(_LOGGER, "┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛")
elseif iter == algo.max_iter
iter +=1 # To later distinguish whether the procedure reached optimality exactly after algo.max_iter iterations (above case) or did not reach optimality (this case)
Memento.info(_LOGGER, "┠─────────────┴────────────────────────────────────┴────────────────────────────────────┨")
Memento.info(_LOGGER, "┃ ▴ Stopping: iteration limit reached ┃")
Memento.info(_LOGGER, "┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛")
end
break
end
end
if !current_best
fix_main_var_values!(pm_main, best_main_var_values)
JuMP.optimize!(pm_main.model)
check_solution_main(pm_main)
fix_and_optimize_secondary!(pm_sec, best_main_var_values)
end
solution = build_solution(pm_main, pm_sec, solution_processors)
termination_status = iter > algo.max_iter ? JuMP.ITERATION_LIMIT : JuMP.OPTIMAL
build_result(ub, lb, solution, termination_status, stat, time_procedure_start, time_build)
end
function calc_iter_result(algo::Classical, pm_main, pm_sec, sp_obj_lb_var)
mp_obj = JuMP.objective_value(pm_main.model)
sp_obj = [JuMP.objective_value(pm.model) for pm in pm_sec]
sp_obj_lb = [JuMP.value(lb) for lb in sp_obj_lb_var]
rel_tightening = (sp_obj .- sp_obj_lb) ./ abs.(sp_obj)
inv_cost = mp_obj - sum(sp_obj_lb)
op_cost = sum(sp_obj)
sol_value = inv_cost + op_cost
lb = mp_obj
return inv_cost, op_cost, sol_value, rel_tightening, lb
end
function add_optimality_cuts!(pm_main, pm_sec, algo::Classical, num_sp, sp_obj_lb_var, main_var_values, rel_tightening)
iter_cuts = 0
for p in 1:num_sp
if rel_tightening[p] > algo.tightening_rtol
iter_cuts += 1
optimality_cut_expr = calc_optimality_cut(pm_main, pm_sec[p], main_var_values)
JuMP.@constraint(pm_main.model, sp_obj_lb_var[p] >= optimality_cut_expr)
end
end
return iter_cuts
end
function log_statistics(algo::Classical, st)
iter = st["iter"]
cuts = st["main"]["iter_cuts"]
st = st["value"]
Memento.info(_LOGGER, @sprintf("┃ %s%4i%6i │%11.3e%12.3e%12.3e │%11.3e%12.3e%12.3e ┃", st["current_best"] ? '•' : ' ', iter, cuts, st["inv_cost"], st["op_cost"], st["sol_value"], st["ub"], st["lb"], st["rel_gap"]))
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 11825 |
"""
Abstract type for Benders decomposition algorithms.
All concrete types shall have the following fields:
- `max_iter`: maximum number of iterations before stopping. Default: 1000.
- `tightening_rtol`: add an optimality cut only if `(sp_obj-sp_obj_lb)/abs(sp_obj) > tightening_rtol`, where `sp_obj` is the objective function value of the secondary problem and `sp_obj_lb` is the value of the corresponding surrogate function. Default: `sqrt(eps())`.
- `sp_obj_lb_min`: constant term of the initial optimality cut, which prevents the main problem from being unbounded at the beginning. Default: `-1e12`.
- `silent`: require the solvers to produce no output; take precedence over any other attribute controlling verbosity. Default: `true`.
"""
abstract type BendersAlgorithm end
"A macro for adding the standard fields to a concrete BendersAlgorithm type"
_IM.@def benders_fields begin
max_iter::Int
tightening_rtol::Float64
sp_obj_lb_min::Float64
silent::Bool
end
"""
run_benders_decomposition(algo, data, model_type, main_opt, sec_opt, main_bm, sec_bm, <keyword arguments>)
Run Benders decomposition on `data` using the network model type `model_type`.
The algorithm implementation is specified by `algo` (see methods documentation).
Main and secondary problems are generated through `main_bm` and `sec_bm` build methods and
iteratively solved by `main_opt` and `sec_opt` optimizers.
The main problem must be formulated in such a way that, when at each iteration some of the
variables of the secondary problems are fixed at the values given by the current optimal
solution of the main problem, secondary problems are feasible.
# Arguments
- `ref_extensions::Vector{<:Function} = Function[]`: reference extensions, used to
instantiate both main and secondary problems.
- `solution_processors::Vector{<:Function} = Function[]`: solution processors, applied to
solutions of secondary problems.
- `kwargs...`: passed to `PowerModels.instantiate_model()` when building main and secondary
problems.
# Implementation
The objective function in `main_bm` must contain only the investment-related terms
(auxiliary variables for Benders optimality cuts are added later).
"""
function run_benders_decomposition end
## Utility functions
function combine_sol_dict!(d::AbstractDict, other::AbstractDict, atol=1e-5, path="")
for (k,v) in other
if haskey(d, k)
combine_sol_dict!(d[k], other[k], atol, "$path.$k")
else
d[k] = v
end
end
return d
end
function combine_sol_dict!(d::Number, other::Number, atol=1e-5, path="")
if isapprox(d, other; atol)
return d
else
Memento.error(_LOGGER, "Different values found while combining dicts at path \"$(path[2:end])\": $d, $other.")
end
end
function combine_sol_dict!(d, other, atol=1e-5, path="")
if d == other
return d
else
Memento.error(_LOGGER, "Different values found while combining dicts at path \"$(path[2:end])\": $d, $other.")
end
end
## Common auxiliary functions
function instantiate_model(algo, data, model_type, main_opt, sec_opt, main_bm, sec_bm; ref_extensions, kwargs...)
_FP.require_dim(data, :scenario, :year)
scen_year_ids = [(s,y) for y in 1:_FP.dim_length(data, :year) for s in 1:_FP.dim_length(data, :scenario)]
num_sp = length(scen_year_ids) # Number of secondary problems
pm_main = _PM.instantiate_model(data, model_type, main_bm; ref_extensions, kwargs...)
JuMP.set_optimizer(pm_main.model, main_opt)
if algo.silent
JuMP.set_silent(pm_main.model)
end
sp_obj_lb_var = JuMP.@variable(pm_main.model, [p=1:num_sp], lower_bound=algo.sp_obj_lb_min)
JuMP.@objective(pm_main.model, Min, JuMP.objective_function(pm_main.model) + sum(sp_obj_lb_var))
pm_sec = Vector{model_type}(undef, num_sp)
Threads.@threads for i in 1:num_sp
s, y = scen_year_ids[i]
scen_data = _FP.slice_multinetwork(data; scenario=s, year=y)
pm = pm_sec[i] = _PM.instantiate_model(scen_data, model_type, sec_bm; ref_extensions, kwargs...)
add_benders_mp_sp_nw_lookup!(pm, pm_main)
JuMP.relax_integrality(pm.model)
JuMP.set_optimizer(pm.model, sec_opt)
if algo.silent
JuMP.set_silent(pm.model)
end
end
Memento.debug(_LOGGER, "Main model has $(JuMP.num_variables(pm_main.model)) variables and $(sum([JuMP.num_constraints(pm_main.model, f, s) for (f,s) in JuMP.list_of_constraint_types(pm_main.model)])) constraints initially.")
Memento.debug(_LOGGER, "The first secondary model has $(JuMP.num_variables(first(pm_sec).model)) variables and $(sum([JuMP.num_constraints(first(pm_sec).model, f, s) for (f,s) in JuMP.list_of_constraint_types(first(pm_sec).model)])) constraints initially.")
return pm_main, pm_sec, num_sp, sp_obj_lb_var
end
function add_benders_mp_sp_nw_lookup!(one_pm_sec, pm_main)
mp_sp_nw_lookup = one_pm_sec.ref[:it][_PM.pm_it_sym][:slice]["benders_mp_sp_nw_lookup"] = Dict{Int,Int}()
slice_orig_nw_lookup = one_pm_sec.ref[:it][_PM.pm_it_sym][:slice]["slice_orig_nw_lookup"]
for n in _FP.nw_ids(one_pm_sec; hour=1)
orig_n = slice_orig_nw_lookup[n]
int_var_n = _FP.first_id(pm_main, orig_n, :scenario)
mp_sp_nw_lookup[int_var_n] = n
end
end
function fix_and_optimize_secondary!(pm_sec, main_var_values)
Threads.@threads for pm in pm_sec
fix_sec_var_values!(pm, main_var_values)
JuMP.optimize!(pm.model)
check_solution_secondary(pm)
end
end
function check_solution_main(pm)
if JuMP.termination_status(pm.model) ∉ (JuMP.OPTIMAL, JuMP.LOCALLY_SOLVED)
Memento.error(_LOGGER, "Main problem: $(JuMP.solver_name(pm.model)) termination status is $(JuMP.termination_status(pm.model)).")
end
end
function check_solution_secondary(pm)
if JuMP.termination_status(pm.model) ∉ (JuMP.OPTIMAL, JuMP.LOCALLY_SOLVED)
Memento.error(_LOGGER, "Secondary problem, scenario $(_FP.dim_meta(pm,:scenario,"orig_id")), year $(_FP.dim_meta(pm,:year,"orig_id")): $(JuMP.solver_name(pm.model)) termination status is $(JuMP.termination_status(pm.model)).")
end
end
function get_var_values(pm)
values = Dict{Int,Any}()
for n in _FP.nw_ids(pm, hour=1, scenario=1)
values_n = values[n] = Dict{Symbol,Any}()
for (key, var_array) in _PM.var(pm, n)
# idx is a JuMP.Containers.DenseAxisArrayKey{Tuple{Int64}}. idx[1] is an Int
values_n[key] = Dict{Int,Int}((idx[1],round(Int,JuMP.value(var_array[idx]))) for idx in keys(var_array))
end
end
return values
end
function fix_main_var_values!(pm, main_var_values)
for (n, key_var) in main_var_values
for (key, var) in key_var
for (idx, value) in var
z_main = _PM.var(pm, n, key, idx)
JuMP.fix(z_main, value; force=true)
end
end
end
end
function fix_sec_var_values!(pm, main_var_values)
for (main_nw_id, sec_nw_id) in pm.ref[:it][_PM.pm_it_sym][:slice]["benders_mp_sp_nw_lookup"]
for (key, var) in main_var_values[main_nw_id]
if haskey(_PM.var(pm, sec_nw_id), key)
for (idx, value) in var
z_sec = _PM.var(pm, sec_nw_id, key, idx)
JuMP.fix(z_sec, value; force=true)
end
end
end
end
end
function record_statistics!(stat, algo, iter, iter_cuts, inv_cost, op_cost, sol_value, ub, lb, rel_gap, current_best, main_var_values, pm_main, pm_sec, time_main, time_sec, time_iteration)
value = Dict{String,Any}()
value["inv_cost"] = inv_cost
value["op_cost"] = op_cost
value["sol_value"] = sol_value
value["ub"] = ub
value["lb"] = lb
value["rel_gap"] = rel_gap
value["current_best"] = current_best
main = Dict{String,Any}()
main["sol"] = main_var_values
main["iter_cuts"] = iter_cuts
main["nvar"] = JuMP.num_variables(pm_main.model)
main["ncon"] = sum([JuMP.num_constraints(pm_main.model, f, s) for (f,s) in JuMP.list_of_constraint_types(pm_main.model)])
secondary = Dict{String,Any}()
secondary["nvar"] = JuMP.num_variables(first(pm_sec).model)
secondary["ncon"] = sum([JuMP.num_constraints(first(pm_sec).model, f, s) for (f,s) in JuMP.list_of_constraint_types(first(pm_sec).model)])
time = Dict{String,Any}()
time["iteration"] = time_iteration
time["main"] = time_main
time["secondary"] = time_sec
time["other"] = time_iteration - (time_main + time_sec)
stat[iter] = Dict{String,Any}("iter" => iter, "value" => value, "main" => main, "secondary" => secondary, "time" => time)
log_statistics(algo, stat[iter])
end
function calc_optimality_cut(pm_main, one_pm_sec, main_var_values)
scen_id = _FP.dim_meta(one_pm_sec, :scenario, "orig_id")
year_id = _FP.dim_meta(one_pm_sec, :year, "orig_id")
optimality_cut_expr = JuMP.AffExpr(JuMP.objective_value(one_pm_sec.model))
for (main_nw_id, sec_nw_id) in one_pm_sec.ref[:it][_PM.pm_it_sym][:slice]["benders_mp_sp_nw_lookup"]
for (key, var) in main_var_values[main_nw_id]
if haskey(_PM.var(one_pm_sec, sec_nw_id), key)
for (idx, value) in var
z_main = _PM.var(pm_main, main_nw_id, key, idx)
z_sec = _PM.var(one_pm_sec, sec_nw_id, key, idx)
lam = JuMP.reduced_cost(z_sec)
JuMP.add_to_expression!(optimality_cut_expr, lam*(z_main-value))
Memento.trace(_LOGGER, @sprintf("Optimality cut term for (scenario%4i, year%2i): %15.1f * (%18s - %3.1f)", scen_id, year_id, lam, z_main, value))
end
end
end
end
return optimality_cut_expr
end
function build_solution(pm_main, pm_sec, solution_processors)
solution_main = _IM.build_solution(pm_main)
num_sp = length(pm_sec)
sol = Vector{Dict{String,Any}}(undef, num_sp)
Threads.@threads for p in 1:num_sp
sol[p] = _IM.build_solution(pm_sec[p]; post_processors=solution_processors)
lookup = pm_sec[p].ref[:it][_PM.pm_it_sym][:slice]["slice_orig_nw_lookup"]
nw_orig = Dict{String,Any}("$(lookup[parse(Int,n_slice)])"=>nw for (n_slice,nw) in sol[p]["nw"])
sol[p]["nw"] = nw_orig
end
solution_sec = Dict{String,Any}(k=>v for (k,v) in sol[1] if k != "nw")
solution_sec["nw"] = merge([s["nw"] for s in sol]...)
combine_sol_dict!(solution_sec, solution_main) # It is good that `solution_sec` is the first because 1) it has most of the data and 2) its integer values are rounded.
end
function build_result(ub, lb, solution, termination_status, stat, time_procedure_start, time_build)
result = Dict{String,Any}()
result["objective"] = ub
result["objective_lb"] = lb
result["solution"] = solution
result["termination_status"] = termination_status
result["stat"] = stat
time_proc = Dict{String,Any}()
time_proc["total"] = time() - time_procedure_start # Time spent after this line is not measured
time_proc["build"] = time_build
time_proc["main"] = sum(s["time"]["main"] for s in values(stat))
time_proc["secondary"] = sum(s["time"]["secondary"] for s in values(stat))
time_proc["other"] = time_proc["total"] - (time_proc["build"] + time_proc["main"] + time_proc["secondary"])
result["time"] = time_proc
Memento.debug(_LOGGER, @sprintf("Benders decomposition time: %.1f s (%.0f%% building models, %.0f%% main prob, %.0f%% secondary probs, %.0f%% other)",
time_proc["total"],
100 * time_proc["build"] / time_proc["total"],
100 * time_proc["main"] / time_proc["total"],
100 * time_proc["secondary"] / time_proc["total"],
100 * time_proc["other"] / time_proc["total"]
))
return result
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 8264 | # Modern implementation of Benders decomposition
"""
Modern <: BendersAlgorithm
Parameters for modern implementation of Benders decomposition:
- `max_iter`: maximum number of iterations before stopping. Default: 1000.
- `tightening_rtol`: add an optimality cut only if
`(sp_obj-sp_obj_lb)/abs(sp_obj) > tightening_rtol`, where `sp_obj` is the objective
function value of the secondary problem and `sp_obj_lb` is the value of the
corresponding surrogate function. Default: `sqrt(eps())`.
- `sp_obj_lb_min`: constant term of the initial optimality cut, which prevents the main
problem from being unbounded at the beginning. Default: `-1e12`.
- `silent`: require the solvers to produce no output; this takes precedence over any other
attribute controlling verbosity. Default: `true`.
!!! info
The tolerance for stopping the procedure cannot be set here: it will coincide with the
stopping tolerance attribute(s) of the main problem optimizer.
"""
struct Modern <: BendersAlgorithm
@benders_fields
end
function Modern(;
max_iter = 1000,
tightening_rtol = sqrt(eps()),
sp_obj_lb_min = -1e12,
silent = true
)
Modern(
max_iter,
tightening_rtol,
sp_obj_lb_min,
silent
)
end
"""
run_benders_decomposition(algo::Modern, <arguments>, <keyword arguments>)
Run the modern implementation of Benders decomposition, where the main problem is solved once.
The modern implementation uses callbacks (lazy constraints) to solve the secondary problems
whenever the solver of the main problem finds a new integer-feasible solution.
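# Examples
A sketch of a call, where `build_main` and `build_sec` stand for user-provided build functions, and `data`, `model_type`, `milp_optimizer` and `lp_optimizer` are placeholders as well:
```julia-repl
julia> result = run_benders_decomposition(Modern(), data, model_type, milp_optimizer, lp_optimizer, build_main, build_sec)
```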
"""
function run_benders_decomposition(
algo::Modern,
data::Dict{String,<:Any},
model_type::Type,
main_opt::Union{JuMP.MOI.AbstractOptimizer, JuMP.MOI.OptimizerWithAttributes},
sec_opt::Union{JuMP.MOI.AbstractOptimizer, JuMP.MOI.OptimizerWithAttributes},
main_bm::Function,
sec_bm::Function;
ref_extensions::Vector{<:Function} = Function[],
solution_processors::Vector{<:Function} = Function[],
kwargs...
)
########################################################################################
function optimality_cut_callback(cb_data)
iter += 1
if iter > algo.max_iter
if iter == algo.max_iter + 1
Memento.info(_LOGGER, "┠─────────────┴────────────────────────────────────┴────────────┨")
Memento.info(_LOGGER, "┃ ▴ Stopping: iteration limit reached ┃")
Memento.info(_LOGGER, "┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛")
end
return
end
status = JuMP.callback_node_status(cb_data, pm_main.model)
if status != JuMP.MOI.CALLBACK_NODE_STATUS_INTEGER
if status == JuMP.MOI.CALLBACK_NODE_STATUS_FRACTIONAL
Memento.warn(_LOGGER, "Benders callback called on fractional solution. Ignoring.")
return
else
@assert status == JuMP.MOI.CALLBACK_NODE_STATUS_UNKNOWN
Memento.error(_LOGGER, "Benders callback called on unknown solution status (might be fractional or integer).")
end
end
main_var_values = get_var_values(algo, pm_main, cb_data)
time_main = time() - time_main_start
time_sec_start = time()
fix_and_optimize_secondary!(pm_sec, main_var_values)
time_sec = time() - time_sec_start
if iter == 1
if !JuMP.has_duals(first(pm_sec).model) # If this check passes here, no need to check again in subsequent iterations.
Memento.error(_LOGGER, "Solver $(JuMP.solver_name(first(pm_sec).model)) is unable to provide dual values.")
end
Memento.info(_LOGGER, "┏━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━┓")
Memento.info(_LOGGER, "┃ iter. cuts │ inv. cost oper. cost solution │ UB ┃")
Memento.info(_LOGGER, "┠─────────────┼────────────────────────────────────┼────────────┨")
end
inv_cost, op_cost, sol_value, rel_tightening = calc_iter_result(algo, cb_data, mp_obj_expr, pm_sec, sp_obj_lb_var)
if sol_value < ub
ub = sol_value
current_best = true
else
current_best = false
end
iter_cuts = add_optimality_cuts!(pm_main, pm_sec, algo, num_sp, sp_obj_lb_var, main_var_values, rel_tightening; cb_data)
time_iteration = time() - time_iteration_start # Time spent after this line is not measured
record_statistics!(stat, algo, iter, iter_cuts, inv_cost, op_cost, sol_value, ub, NaN, NaN, current_best, main_var_values, pm_main, pm_sec, time_main, time_sec, time_iteration)
time_iteration_start = time_main_start = time()
end
########################################################################################
time_procedure_start = time()
Memento.debug(_LOGGER, "Modern Benders decomposition started. Available threads: $(Threads.nthreads()).")
pm_main, pm_sec, num_sp, sp_obj_lb_var = instantiate_model(algo, data, model_type, main_opt, sec_opt, main_bm, sec_bm; ref_extensions, kwargs...)
mp_obj_expr = JuMP.objective_function(pm_main.model)
JuMP.MOI.set(pm_main.model, JuMP.MOI.LazyConstraintCallback(), optimality_cut_callback)
ub = Inf
lb = -Inf
iter = 0
current_best = true
stat = Dict{Int,Any}()
time_build = time() - time_procedure_start
time_iteration_start = time_main_start = time()
JuMP.optimize!(pm_main.model) # Also solves secondary problems iteratively using the callback
check_solution_main(pm_main)
if iter <= algo.max_iter
Memento.info(_LOGGER, "┠─────────────┴────────────────────────────────────┴────────────┨")
Memento.info(_LOGGER, "┃ Stopping: optimal within tolerance ┃")
Memento.info(_LOGGER, "┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛")
end
best_main_var_values = get_var_values(pm_main)
fix_and_optimize_secondary!(pm_sec, best_main_var_values)
solution = build_solution(pm_main, pm_sec, solution_processors)
termination_status = iter > algo.max_iter ? JuMP.ITERATION_LIMIT : JuMP.OPTIMAL
build_result(ub, lb, solution, termination_status, stat, time_procedure_start, time_build)
end
function get_var_values(algo::Modern, pm, cb_data)
values = Dict{Int,Any}()
for n in _FP.nw_ids(pm, hour=1, scenario=1)
values_n = values[n] = Dict{Symbol,Any}()
for (key, var_array) in _PM.var(pm, n)
# idx is a JuMP.Containers.DenseAxisArrayKey{Tuple{Int64}}. idx[1] is an Int
values_n[key] = Dict{Int,Int}((idx[1],round(Int,JuMP.callback_value(cb_data, var_array[idx]))) for idx in keys(var_array))
end
end
return values
end
function calc_iter_result(algo::Modern, cb_data, mp_obj_expr, pm_sec, sp_obj_lb_var)
mp_obj = JuMP.callback_value(cb_data, mp_obj_expr)
sp_obj = [JuMP.objective_value(pm.model) for pm in pm_sec]
sp_obj_lb = [JuMP.callback_value(cb_data, lb) for lb in sp_obj_lb_var]
rel_tightening = (sp_obj .- sp_obj_lb) ./ abs.(sp_obj)
inv_cost = mp_obj - sum(sp_obj_lb)
op_cost = sum(sp_obj)
sol_value = inv_cost + op_cost
return inv_cost, op_cost, sol_value, rel_tightening
end
function add_optimality_cuts!(pm_main, pm_sec, algo::Modern, num_sp, sp_obj_lb_var, main_var_values, rel_tightening; cb_data)
iter_cuts = 0
for p in 1:num_sp
if rel_tightening[p] > algo.tightening_rtol
iter_cuts += 1
optimality_cut_expr = calc_optimality_cut(pm_main, pm_sec[p], main_var_values)
cut = JuMP.@build_constraint(sp_obj_lb_var[p] >= optimality_cut_expr)
JuMP.MOI.submit(pm_main.model, JuMP.MOI.LazyConstraint(cb_data), cut)
end
end
return iter_cuts
end
function log_statistics(algo::Modern, st)
iter = st["iter"]
cuts = st["main"]["iter_cuts"]
st = st["value"]
Memento.info(_LOGGER, @sprintf("┃ %s%4i%6i │%11.3e%12.3e%12.3e │%11.3e ┃", st["current_best"] ? '•' : ' ', iter, cuts, st["inv_cost"], st["op_cost"], st["sol_value"], st["ub"]))
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 6045 | # Constraint relating to network components or quantities not introduced by FlexPlan
## Power balance
# Power balance including candidate storage
function constraint_power_balance_acne_dcne_strg(pm::_PM.AbstractDCPModel, n::Int, i::Int, bus_arcs, bus_arcs_ne, bus_arcs_dc, bus_gens, bus_convs_ac, bus_convs_ac_ne, bus_loads, bus_shunts, bus_storage, bus_storage_ne, pd, qd, gs, bs)
p = _PM.var(pm, n, :p)
pg = _PM.var(pm, n, :pg)
pconv_grid_ac_ne = _PM.var(pm, n, :pconv_tf_fr_ne)
pconv_grid_ac = _PM.var(pm, n, :pconv_tf_fr)
pconv_ac = _PM.var(pm, n, :pconv_ac)
pconv_ac_ne = _PM.var(pm, n, :pconv_ac_ne)
p_ne = _PM.var(pm, n, :p_ne)
ps = _PM.var(pm, n, :ps)
ps_ne = _PM.var(pm, n, :ps_ne)
v = 1 # In the DC approximation the voltage magnitude is fixed at 1 p.u.
JuMP.@constraint(pm.model, sum(p[a] for a in bus_arcs) + sum(p_ne[a] for a in bus_arcs_ne) + sum(pconv_grid_ac[c] for c in bus_convs_ac) + sum(pconv_grid_ac_ne[c] for c in bus_convs_ac_ne) == sum(pg[g] for g in bus_gens) - sum(ps[s] for s in bus_storage) -sum(ps_ne[s] for s in bus_storage_ne) - sum(pd[d] for d in bus_loads) - sum(gs[s] for s in bus_shunts)*v^2)
end
# Power balance (without DC equipment) including candidate storage
function constraint_power_balance_acne_strg(pm::_PM.AbstractWModels, n::Int, i::Int, bus_arcs, bus_arcs_ne, bus_gens, bus_loads, bus_shunts, bus_storage, bus_storage_ne, pd, qd, gs, bs)
p = _PM.var(pm, n, :p)
q = _PM.var(pm, n, :q)
p_ne = _PM.var(pm, n, :p_ne)
q_ne = _PM.var(pm, n, :q_ne)
pg = _PM.var(pm, n, :pg)
qg = _PM.var(pm, n, :qg)
ps = _PM.var(pm, n, :ps)
qs = _PM.var(pm, n, :qs)
ps_ne = _PM.var(pm, n, :ps_ne)
qs_ne = _PM.var(pm, n, :qs_ne)
w = _PM.var(pm, n, :w, i)
JuMP.@constraint(pm.model, sum(p[a] for a in bus_arcs) + sum(p_ne[a] for a in bus_arcs_ne) == sum(pg[g] for g in bus_gens) - sum(ps[s] for s in bus_storage) - sum(ps_ne[s] for s in bus_storage_ne) - sum(pd[d] for d in bus_loads) - sum(gs[s] for s in bus_shunts)*w)
JuMP.@constraint(pm.model, sum(q[a] for a in bus_arcs) + sum(q_ne[a] for a in bus_arcs_ne) == sum(qg[g] for g in bus_gens) - sum(qs[s] for s in bus_storage) - sum(qs_ne[s] for s in bus_storage_ne) - sum(qd[d] for d in bus_loads) + sum(bs[s] for s in bus_shunts)*w)
end
# Power balance including candidate storage & flexible demand
function constraint_power_balance_acne_dcne_flex(pm::_PM.AbstractDCPModel, n::Int, i::Int, bus_arcs, bus_arcs_ne, bus_arcs_dc, bus_gens, bus_convs_ac, bus_convs_ac_ne, bus_loads, bus_shunts, bus_storage, bus_storage_ne, gs, bs)
p = _PM.var(pm, n, :p)
pg = _PM.var(pm, n, :pg)
pconv_grid_ac_ne = _PM.var(pm, n, :pconv_tf_fr_ne)
pconv_grid_ac = _PM.var(pm, n, :pconv_tf_fr)
pconv_ac = _PM.var(pm, n, :pconv_ac)
pconv_ac_ne = _PM.var(pm, n, :pconv_ac_ne)
p_ne = _PM.var(pm, n, :p_ne)
ps = _PM.var(pm, n, :ps)
ps_ne = _PM.var(pm, n, :ps_ne)
pflex = _PM.var(pm, n, :pflex)
v = 1 # In the DC approximation the voltage magnitude is fixed at 1 p.u.
JuMP.@constraint(pm.model, sum(p[a] for a in bus_arcs) + sum(p_ne[a] for a in bus_arcs_ne) + sum(pconv_grid_ac[c] for c in bus_convs_ac) + sum(pconv_grid_ac_ne[c] for c in bus_convs_ac_ne) == sum(pg[g] for g in bus_gens) - sum(ps[s] for s in bus_storage) -sum(ps_ne[s] for s in bus_storage_ne) - sum(pflex[d] for d in bus_loads) - sum(gs[s] for s in bus_shunts)*v^2)
end
# Power balance (without DC equipment) including candidate storage & flexible demand
function constraint_power_balance_acne_flex(pm::_PM.AbstractWModels, n::Int, i::Int, bus_arcs, bus_arcs_ne, bus_gens, bus_loads, bus_shunts, bus_storage, bus_storage_ne, gs, bs)
p = _PM.var(pm, n, :p)
q = _PM.var(pm, n, :q)
p_ne = _PM.var(pm, n, :p_ne)
q_ne = _PM.var(pm, n, :q_ne)
pg = _PM.var(pm, n, :pg)
qg = _PM.var(pm, n, :qg)
ps = _PM.var(pm, n, :ps)
qs = _PM.var(pm, n, :qs)
ps_ne = _PM.var(pm, n, :ps_ne)
qs_ne = _PM.var(pm, n, :qs_ne)
pflex = _PM.var(pm, n, :pflex)
qflex = _PM.var(pm, n, :qflex)
w = _PM.var(pm, n, :w, i)
JuMP.@constraint(pm.model, sum(p[a] for a in bus_arcs) + sum(p_ne[a] for a in bus_arcs_ne) == sum(pg[g] for g in bus_gens) - sum(ps[s] for s in bus_storage) - sum(ps_ne[s] for s in bus_storage_ne) - sum(pflex[d] for d in bus_loads) - sum(gs[s] for s in bus_shunts)*w)
JuMP.@constraint(pm.model, sum(q[a] for a in bus_arcs) + sum(q_ne[a] for a in bus_arcs_ne) == sum(qg[g] for g in bus_gens) - sum(qs[s] for s in bus_storage) - sum(qs_ne[s] for s in bus_storage_ne) - sum(qflex[d] for d in bus_loads) + sum(bs[s] for s in bus_shunts)*w)
end
## Candidate AC branches
# Activate a candidate AC branch depending on the investment decisions in the candidate's horizon.
function constraint_ne_branch_activation(pm::_PM.AbstractPowerModel, n::Int, i::Int, horizon::Vector{Int})
indicator = _PM.var(pm, n, :branch_ne, i)
investments = _PM.var.(Ref(pm), horizon, :branch_ne_investment, i)
JuMP.@constraint(pm.model, indicator == sum(investments))
end
## Candidate DC branches
# Activate a candidate DC branch depending on the investment decisions in the candidate's horizon.
function constraint_ne_branchdc_activation(pm::_PM.AbstractPowerModel, n::Int, i::Int, horizon::Vector{Int})
indicator = _PM.var(pm, n, :branchdc_ne, i)
investments = _PM.var.(Ref(pm), horizon, :branchdc_ne_investment, i)
JuMP.@constraint(pm.model, indicator == sum(investments))
end
## Candidate converters
# Activate a candidate AC/DC converter depending on the investment decisions in the candidate's horizon.
function constraint_ne_converter_activation(pm::_PM.AbstractPowerModel, n::Int, i::Int, horizon::Vector{Int})
indicator = _PM.var(pm, n, :conv_ne, i)
investments = _PM.var.(Ref(pm), horizon, :conv_ne_investment, i)
JuMP.@constraint(pm.model, indicator == sum(investments))
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 5905 | # Constraint templates relating to network components or quantities not introduced by FlexPlan
## Power balance
"Power balance including candidate storage"
function constraint_power_balance_acne_dcne_strg(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
bus = _PM.ref(pm, nw, :bus, i)
bus_arcs = _PM.ref(pm, nw, :bus_arcs, i)
bus_arcs_ne = _PM.ref(pm, nw, :ne_bus_arcs, i)
bus_arcs_dc = _PM.ref(pm, nw, :bus_arcs_dc, i)
bus_gens = _PM.ref(pm, nw, :bus_gens, i)
bus_convs_ac = _PM.ref(pm, nw, :bus_convs_ac, i)
bus_convs_ac_ne = _PM.ref(pm, nw, :bus_convs_ac_ne, i)
bus_loads = _PM.ref(pm, nw, :bus_loads, i)
bus_shunts = _PM.ref(pm, nw, :bus_shunts, i)
bus_storage = _PM.ref(pm, nw, :bus_storage, i)
bus_storage_ne = _PM.ref(pm, nw, :bus_storage_ne, i)
pd = Dict(k => _PM.ref(pm, nw, :load, k, "pd") for k in bus_loads)
qd = Dict(k => _PM.ref(pm, nw, :load, k, "qd") for k in bus_loads)
gs = Dict(k => _PM.ref(pm, nw, :shunt, k, "gs") for k in bus_shunts)
bs = Dict(k => _PM.ref(pm, nw, :shunt, k, "bs") for k in bus_shunts)
constraint_power_balance_acne_dcne_strg(pm, nw, i, bus_arcs, bus_arcs_ne, bus_arcs_dc, bus_gens, bus_convs_ac, bus_convs_ac_ne, bus_loads, bus_shunts, bus_storage, bus_storage_ne, pd, qd, gs, bs)
end
"Power balance (without DC equipment) including candidate storage"
function constraint_power_balance_acne_strg(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
bus = _PM.ref(pm, nw, :bus, i)
bus_arcs = _PM.ref(pm, nw, :bus_arcs, i)
bus_arcs_ne = _PM.ref(pm, nw, :ne_bus_arcs, i)
bus_gens = _PM.ref(pm, nw, :bus_gens, i)
bus_loads = _PM.ref(pm, nw, :bus_loads, i)
bus_shunts = _PM.ref(pm, nw, :bus_shunts, i)
bus_storage = _PM.ref(pm, nw, :bus_storage, i)
bus_storage_ne = _PM.ref(pm, nw, :bus_storage_ne, i)
pd = Dict(k => _PM.ref(pm, nw, :load, k, "pd") for k in bus_loads)
qd = Dict(k => _PM.ref(pm, nw, :load, k, "qd") for k in bus_loads)
gs = Dict(k => _PM.ref(pm, nw, :shunt, k, "gs") for k in bus_shunts)
bs = Dict(k => _PM.ref(pm, nw, :shunt, k, "bs") for k in bus_shunts)
constraint_power_balance_acne_strg(pm, nw, i, bus_arcs, bus_arcs_ne, bus_gens, bus_loads, bus_shunts, bus_storage, bus_storage_ne, pd, qd, gs, bs)
end
"Power balance including candidate storage & flexible demand"
function constraint_power_balance_acne_dcne_flex(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
bus = _PM.ref(pm, nw, :bus, i)
bus_arcs = _PM.ref(pm, nw, :bus_arcs, i)
bus_arcs_ne = _PM.ref(pm, nw, :ne_bus_arcs, i)
bus_arcs_dc = _PM.ref(pm, nw, :bus_arcs_dc, i)
bus_gens = _PM.ref(pm, nw, :bus_gens, i)
bus_convs_ac = _PM.ref(pm, nw, :bus_convs_ac, i)
bus_convs_ac_ne = _PM.ref(pm, nw, :bus_convs_ac_ne, i)
bus_loads = _PM.ref(pm, nw, :bus_loads, i)
bus_shunts = _PM.ref(pm, nw, :bus_shunts, i)
bus_storage = _PM.ref(pm, nw, :bus_storage, i)
bus_storage_ne = _PM.ref(pm, nw, :bus_storage_ne, i)
gs = Dict(k => _PM.ref(pm, nw, :shunt, k, "gs") for k in bus_shunts)
bs = Dict(k => _PM.ref(pm, nw, :shunt, k, "bs") for k in bus_shunts)
constraint_power_balance_acne_dcne_flex(pm, nw, i, bus_arcs, bus_arcs_ne, bus_arcs_dc, bus_gens, bus_convs_ac, bus_convs_ac_ne, bus_loads, bus_shunts, bus_storage, bus_storage_ne, gs, bs)
end
"Power balance (without DC equipment) including candidate storage & flexible demand"
function constraint_power_balance_acne_flex(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
bus = _PM.ref(pm, nw, :bus, i)
bus_arcs = _PM.ref(pm, nw, :bus_arcs, i)
bus_arcs_ne = _PM.ref(pm, nw, :ne_bus_arcs, i)
bus_gens = _PM.ref(pm, nw, :bus_gens, i)
bus_loads = _PM.ref(pm, nw, :bus_loads, i)
bus_shunts = _PM.ref(pm, nw, :bus_shunts, i)
bus_storage = _PM.ref(pm, nw, :bus_storage, i)
bus_storage_ne = _PM.ref(pm, nw, :bus_storage_ne, i)
gs = Dict(k => _PM.ref(pm, nw, :shunt, k, "gs") for k in bus_shunts)
bs = Dict(k => _PM.ref(pm, nw, :shunt, k, "bs") for k in bus_shunts)
constraint_power_balance_acne_flex(pm, nw, i, bus_arcs, bus_arcs_ne, bus_gens, bus_loads, bus_shunts, bus_storage, bus_storage_ne, gs, bs)
end
## AC candidate branches
"Activate a candidate AC branch depending on the investment decisions in the candidate's horizon."
function constraint_ne_branch_activation(pm::_PM.AbstractPowerModel, i::Int, prev_nws::Vector{Int}, nw::Int)
investment_horizon = [nw]
lifetime = _PM.ref(pm, nw, :ne_branch, i, "lifetime")
for n in Iterators.reverse(prev_nws[max(end-lifetime+2,1):end])
i in _PM.ids(pm, n, :ne_branch) ? push!(investment_horizon, n) : break
end
constraint_ne_branch_activation(pm, nw, i, investment_horizon)
end
## DC candidate branches
"Activate a candidate DC branch depending on the investment decisions in the candidate's horizon."
function constraint_ne_branchdc_activation(pm::_PM.AbstractPowerModel, i::Int, prev_nws::Vector{Int}, nw::Int)
investment_horizon = [nw]
lifetime = _PM.ref(pm, nw, :branchdc_ne, i, "lifetime")
for n in Iterators.reverse(prev_nws[max(end-lifetime+2,1):end])
i in _PM.ids(pm, n, :branchdc_ne) ? push!(investment_horizon, n) : break
end
constraint_ne_branchdc_activation(pm, nw, i, investment_horizon)
end
## Candidate converters
"Activate a candidate AC/DC converter depending on the investment decisions in the candidate's horizon."
function constraint_ne_converter_activation(pm::_PM.AbstractPowerModel, i::Int, prev_nws::Vector{Int}, nw::Int)
investment_horizon = [nw]
lifetime = _PM.ref(pm, nw, :convdc_ne, i, "lifetime")
for n in Iterators.reverse(prev_nws[max(end-lifetime+2,1):end])
i in _PM.ids(pm, n, :convdc_ne) ? push!(investment_horizon, n) : break
end
constraint_ne_converter_activation(pm, nw, i, investment_horizon)
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 21221 | # Manage and access multinetwork dimensions
## Dimension data structure
function _initialize_dim()
Dict{Symbol,Any}(
:pos => NamedTuple(), # Position (order) of each dimension
:prop => Dict{Symbol,Dict{Int,Dict{String,Any}}}(), # Data relating to the elements of the dimensions
:meta => Dict{Symbol,Any}(), # Data relating to dimensions
:offset => 0 # Offset of nw ids: the id of the first nw is offset+1
# li (linear indices): generated by add_dimension!()
# ci (cartesian indices): generated by add_dimension!()
)
end
"""
add_dimension!(data, name, properties; metadata)
Add dimension `name` to `data` specifying some `properties` of the dimension ids.
# Arguments
- `data::Dict{String,Any}`: a single-network data dictionary.
- `name::Symbol`: the name to use to refer to the dimension in the code.
- `properties::Dict{Int,Dict{String,Any}}`: properties associated with the dimension ids. The
outer dictionary is indexed with the ids along the dimension (consecutive `Int`s starting
from 1). The inner dictionaries, one for each id, store the properties.
- `metadata::Dict{String,Any} = Dict{String,Any}()`: optional metadata describing the
dimension as a whole.
# Examples
```julia-repl
julia> add_dimension!(data, :scenario, Dict(s => Dict{String,Any}("probability"=>1/4) for s in 1:4))
```
# Extended help
The functions `dim_prop`, `dim_meta` and `dim_length` can be used to access properties,
metadata and length (cardinality) of a dimension. They apply to both data dictionaries
(`Dict{String,Any}`) and powermodels (`PowerModels.AbstractPowerModel`).
"""
function add_dimension!(data::Dict{String,Any}, name::Symbol, properties::Dict{Int,Dict{String,Any}}; metadata::Dict{String,Any}=Dict{String,Any}())
dim = get!(data, "dim", _initialize_dim())
if haskey(dim[:pos], name)
Memento.error(_LOGGER, "A dimension named \"$name\" is already present in data.")
end
if Set(keys(properties)) != Set(1:length(properties))
Memento.error(_LOGGER, "Keys of `properties` Dict must range from 1 to the number of $(name)s.")
end
dim[:pos] = (; dim[:pos]..., name => length(dim[:pos])+1)
dim[:prop][name] = properties
dim[:meta][name] = metadata
dim[:li] = LinearIndices(Tuple(1:length(dim[:prop][nm]) for nm in keys(dim[:pos]))).+dim[:offset]
dim[:ci] = CartesianIndices(dim[:li])
return dim
end
"""
add_dimension!(data, name, size; metadata)
Add dimension `name` to `data` specifying the dimension `size`.
# Examples
```julia-repl
julia> add_dimension!(data, :hour, 24)
```
"""
function add_dimension!(data::Dict{String,Any}, name::Symbol, size::Int; metadata::Dict{String,Any}=Dict{String,Any}())
properties = Dict{Int,Dict{String,Any}}(i => Dict{String,Any}() for i in 1:size)
add_dimension!(data, name, properties; metadata)
end
"""
shift_ids!(dim::Dict{Symbol,Any}, offset)
shift_ids!(data::Dict{String,Any}, offset)
Shift by `offset` the network ids in `dim` or `data`.
The `offset` argument is added to the existing offset.
Return a vector containing the shifted network ids.
`data` must be a single-network `data` dictionary.
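# Examples
A hypothetical call, assuming `data` is a single-network data dictionary with a zero initial offset:
```julia-repl
julia> shift_ids!(data, 24) # the first nw id becomes 25
```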
"""
function shift_ids! end
function shift_ids!(dim::Dict{Symbol,Any}, offset::Int)
dim[:offset] += offset
dim[:li] .+= offset
return vec(dim[:li])
end
function shift_ids!(data::Dict{String,Any}, offset::Int)
if _IM.ismultinetwork(data)
Memento.error(_LOGGER, "`shift_ids!` can only be applied to single-network data dictionaries.")
end
shift_ids!(dim(data), offset)
end
"""
merge_dim!(dim1, dim2, dimension)
Merge `dim1` and `dim2` structures along `dimension`.
The ids of `dim2` must be contiguous to those of `dim1`.
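# Examples
A hypothetical merge along the `scenario` dimension (`dim1` and `dim2` must share all other dimension data, and the network ids of `dim2` must start right after those of `dim1`):
```julia-repl
julia> merged_dim = merge_dim!(dim1, dim2, :scenario)
```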
"""
function merge_dim!(dim1::Dict{Symbol,Any}, dim2::Dict{Symbol,Any}, dimension::Symbol)
dim = Dict{Symbol,Any}()
if dim1[:pos] != dim2[:pos]
Memento.error(_LOGGER, "The dimensions to be merged have different names and/or order:\nfirst: $(dim1[:pos])\nsecond: $(dim2[:pos])")
end
dim[:pos] = dim1[:pos]
if any(dim1[:prop][d] != dim2[:prop][d] for d in delete!(Set(keys(dim[:pos])), dimension))
diff = join(", $d" for d in delete!(Set(keys(dim[:pos])), dimension) if dim1[:prop][d] != dim2[:prop][d])[2:end]
Memento.error(_LOGGER, "Different properties found in the following dimension(s):$diff.")
end
dim[:prop] = dim1[:prop]
offset = length(dim1[:prop][dimension])
for (k,v) in dim2[:prop][dimension]
dim[:prop][dimension][k+offset] = deepcopy(v)
end
if dim1[:meta] != dim2[:meta]
diff = join(", $d" for d in keys(dim[:pos]) if dim1[:meta][d] != dim2[:meta][d])[2:end]
Memento.error(_LOGGER, "Different metadata found in the following dimension(s):$diff.")
end
dim[:meta] = dim1[:meta]
if dim2[:li][1] != dim1[:li][end]+1
Memento.error(_LOGGER, "Multinetworks to be merged must have contiguous ids.")
end
dim[:offset] = min(dim1[:offset], dim2[:offset])
dim[:li] = LinearIndices(Tuple(1:length(dim[:prop][nm]) for nm in keys(dim[:pos]))).+dim[:offset]
dim[:ci] = CartesianIndices(dim[:li])
return dim
end
"""
slice, ids = slice_dim(dim::Dict{Symbol,Any}; kwargs...)
Slice `dim` structure keeping the networks that have the coordinates specified by `kwargs`.
`kwargs` must be of the form `name = <value>`, where `name` is the name of a dimension of
`dim` and `<value>` is an `Int` coordinate of that dimension.
Return `slice`, a sliced `dim` structure whose networks have ids starting at 1, and `ids`, a
vector containing the ids that the networks making up `slice` have in `dim`.
The coordinates of the dimensions at which `dim` is sliced are accessible with
`dim_meta(slice, <name>, "orig_id")`, where `<name>` is the name of one of those dimensions.
# Examples
```julia-repl
julia> slice_dim(dim; hour = 24)
julia> slice_dim(dim; hour = 24, scenario = 3)
```
"""
function slice_dim(dim::Dict{Symbol,Any}; kwargs...)
slice = Dict{Symbol,Any}()
slice[:pos] = dim[:pos]
slice[:prop] = Dict{Symbol,Dict{Int,Dict{String,Any}}}()
for d in keys(dim[:pos])
if d ∈ keys(kwargs)
slice[:prop][d] = Dict{Int,Dict{String,Any}}(1 => deepcopy(dim[:prop][d][kwargs[d]]))
else
slice[:prop][d] = deepcopy(dim[:prop][d])
end
end
slice[:meta] = deepcopy(dim[:meta])
for (d, i) in kwargs
slice[:meta][d]["orig_id"] = i
end
slice[:offset] = 0
slice[:li] = collect(LinearIndices(Tuple(1:length(slice[:prop][nm]) for nm in keys(slice[:pos]))))
slice[:ci] = CartesianIndices(slice[:li])
names = keys(dim[:pos])
li = dim[:li]
ids = vec(li[ntuple(i -> get(kwargs, names[i], axes(li,i)), ndims(li))...])
return slice, ids
end
## Access (subsets of) nw ids
"""
nw_ids(pm::PowerModels.AbstractPowerModel; kwargs...)
nw_ids(data::Dict{String,Any}; kwargs...)
nw_ids(dim::Dict{Symbol,Any}; kwargs...)
Sorted vector containing nw ids of `pm`, `data`, or `dim`, optionally filtered by the coordinates of one or more dimensions.
`kwargs` must be of the form `name = <value>` or `name = <interval>` or `name = <subset>`,
where `name` is the name of a dimension of `pm`, `data` or `dim`.
# Examples
```julia-repl
julia> nw_ids(pm)
julia> nw_ids(pm; hour = 24)
julia> nw_ids(pm; hour = 13:24)
julia> nw_ids(pm; hour = [6,12,18,24])
julia> nw_ids(pm; hour = 24, scenario = 3)
```
"""
function nw_ids end
function nw_ids(dim::Dict{Symbol,Any}; kwargs...)::Vector{Int}
names = keys(dim[:pos])
li = dim[:li]
nws = li[ntuple(i -> get(kwargs, names[i], axes(li, i)), ndims(li))...]
ndims(nws) >= 1 ? vec(nws) : [nws]
end
function nw_ids(data::Dict{String,Any}; kwargs...)::Vector{Int}
haskey(data, "dim") ? nw_ids(dim(data); kwargs...) : [0]
end
function nw_ids(pm::_PM.AbstractPowerModel; kwargs...)::Vector{Int}
haskey(pm.ref[:it][_PM.pm_it_sym], :dim) ? nw_ids(dim(pm); kwargs...) : [0]
end
## Compute nw ids given another nw id
"""
similar_ids(pm::PowerModels.AbstractPowerModel, n::Int; kwargs...)
similar_ids(data::Dict{String,Any}, n::Int; kwargs...)
similar_ids(dim::Dict{Symbol,Any}, n::Int; kwargs...)
Sorted vector containing the nw ids that have the same coordinates as `n` along all dimensions, except for the dimensions passed in `kwargs`.
`kwargs` must be of the form `name = <value>` or `name = <interval>` or `name = <subset>`,
where `name` is the name of a dimension of `pm`, `data` or `dim`.
# Examples
```julia-repl
julia> similar_ids(pm, 3; hour = 24)
julia> similar_ids(pm, 3; hour = 13:24)
julia> similar_ids(pm, 3; hour = [6,12,18,24])
julia> similar_ids(pm, 3; hour = 24, scenario = 3)
```
"""
function similar_ids end
function similar_ids(dim::Dict{Symbol,Any}, n::Int; kwargs...)::Vector{Int}
names = keys(dim[:pos])
offset = dim[:offset]
li = dim[:li]
ci_n = dim[:ci][n-offset]
nws = li[ntuple(i -> get(kwargs, names[i], ci_n[i]), ndims(li))...]
ndims(nws) >= 1 ? vec(nws) : [nws]
end
similar_ids(data::Dict{String,Any}, n::Int; kwargs...) = similar_ids(dim(data), n; kwargs...)
similar_ids(pm::_PM.AbstractPowerModel, n::Int; kwargs...) = similar_ids(dim(pm), n; kwargs...)
"""
similar_id(pm::PowerModels.AbstractPowerModel, n::Int; kwargs...)
similar_id(data::Dict{String,Any}, n::Int; kwargs...)
similar_id(dim::Dict{Symbol,Any}, n::Int; kwargs...)
Nw id that has the same coordinates as `n` along all dimensions, except for the dimensions passed in `kwargs`.
`kwargs` must be of the form `name = <value>`, where `name` is the name of a dimension of
`pm`, `data` or `dim`.
# Examples
```julia-repl
julia> similar_id(pm, 3; hour = 24)
julia> similar_id(pm, 3; hour = 24, scenario = 3)
```
"""
function similar_id end
function similar_id(dim::Dict{Symbol,Any}, n::Int; kwargs...)::Int
names = keys(dim[:pos])
offset = dim[:offset]
li = dim[:li]
ci_n = dim[:ci][n-offset]
li[ntuple(i -> get(kwargs, names[i], ci_n[i])::Int, ndims(li))...]
end
similar_id(data::Dict{String,Any}, n::Int; kwargs...) = similar_id(dim(data), n; kwargs...)
similar_id(pm::_PM.AbstractPowerModel, n::Int; kwargs...) = similar_id(dim(pm), n; kwargs...)
"""
first_id(pm::PowerModels.AbstractPowerModel, n::Int, dimension::Symbol...)
first_id(data::Dict{String,Any}, n::Int, dimension::Symbol...)
first_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol...)
Return the id of the first network in `pm`, `data` or `dim` along `dimension`, keeping the other dimensions fixed at the coordinates of `n`.
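# Examples
Assuming an `hour` dimension exists, the id of the network having the same coordinates as `n` except for being at the first hour:
```julia-repl
julia> first_id(pm, n, :hour)
```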
"""
function first_id end
function first_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol...)
names = keys(dim[:pos])
offset = dim[:offset]
li = dim[:li]
ci_n = dim[:ci][n-offset]
li[ntuple(i -> names[i] in dimension ? 1 : ci_n[i], ndims(li))...]
end
first_id(data::Dict{String,Any}, n::Int, dimension::Symbol...) = first_id(dim(data), n, dimension...)
first_id(pm::_PM.AbstractPowerModel, n::Int, dimension::Symbol...) = first_id(dim(pm), n, dimension...)
"""
last_id(pm::PowerModels.AbstractPowerModel, n::Int, dimension::Symbol...)
last_id(data::Dict{String,Any}, n::Int, dimension::Symbol...)
last_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol...)
Return the id of the last network in `pm`, `data` or `dim` along `dimension`, keeping the other dimensions fixed at the coordinates of `n`.
"""
function last_id end
function last_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol...)
names = keys(dim[:pos])
offset = dim[:offset]
li = dim[:li]
ci_n = dim[:ci][n-offset]
li[ntuple(i -> names[i] in dimension ? size(li,i) : ci_n[i], ndims(li))...]
end
last_id(data::Dict{String,Any}, n::Int, dimension::Symbol...) = last_id(dim(data), n, dimension...)
last_id(pm::_PM.AbstractPowerModel, n::Int, dimension::Symbol...) = last_id(dim(pm), n, dimension...)
"""
prev_id(pm::PowerModels.AbstractPowerModel, n::Int, dimension::Symbol)
prev_id(data::Dict{String,Any}, n::Int, dimension::Symbol)
prev_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol)
Return the id of the network preceding `n` in `pm`, `data` or `dim` along `dimension`, keeping the other dimensions fixed.
"""
function prev_id end
function prev_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol)::Int
pos_d = dim[:pos][dimension]
offset = dim[:offset]
li = dim[:li]
ci_n = dim[:ci][n-offset]
li[ntuple(i -> i == pos_d ? ci_n[i]-1 : ci_n[i], ndims(li))...]
end
prev_id(data::Dict{String,Any}, n::Int, dimension::Symbol) = prev_id(dim(data), n, dimension)
prev_id(pm::_PM.AbstractPowerModel, n::Int, dimension::Symbol) = prev_id(dim(pm), n, dimension)
"""
prev_ids(pm::PowerModels.AbstractPowerModel, n::Int, dimension::Symbol)
prev_ids(data::Dict{String,Any}, n::Int, dimension::Symbol)
prev_ids(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol)
Return the ids of the networks preceding `n` in `pm`, `data` or `dim` along `dimension`, keeping the other dimensions fixed.
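# Examples
Assuming an `hour` dimension exists, the ids of the networks at all hours preceding that of network `n`:
```julia-repl
julia> prev_ids(pm, n, :hour)
```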
"""
function prev_ids end
function prev_ids(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol)::Vector{Int}
pos_d = dim[:pos][dimension]
offset = dim[:offset]
li = dim[:li]
ci_n = Tuple(dim[:ci][n-offset])
li[CartesianIndex(ci_n[1:pos_d-1]), CartesianIndices((1:ci_n[pos_d]-1,)), CartesianIndex(ci_n[pos_d+1:end])]
end
prev_ids(data::Dict{String,Any}, n::Int, dimension::Symbol) = prev_ids(dim(data), n, dimension)
prev_ids(pm::_PM.AbstractPowerModel, n::Int, dimension::Symbol) = prev_ids(dim(pm), n, dimension)
"""
next_id(pm::PowerModels.AbstractPowerModel, n::Int, dimension::Symbol)
next_id(data::Dict{String,Any}, n::Int, dimension::Symbol)
next_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol)
Return the next network in `pm`, `data` or `dim` along `dimension` while keeping the other dimensions fixed.
"""
function next_id end
function next_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol)::Int
pos_d = dim[:pos][dimension]
offset = dim[:offset]
li = dim[:li]
ci_n = dim[:ci][n-offset]
li[ntuple(i -> i == pos_d ? ci_n[i]+1 : ci_n[i], ndims(li))...]
end
next_id(data::Dict{String,Any}, n::Int, dimension::Symbol) = next_id(dim(data), n, dimension)
next_id(pm::_PM.AbstractPowerModel, n::Int, dimension::Symbol) = next_id(dim(pm), n, dimension)
"""
next_ids(pm::PowerModels.AbstractPowerModel, n::Int, dimension::Symbol)
next_ids(data::Dict{String,Any}, n::Int, dimension::Symbol)
next_ids(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol)
Return the next networks in `pm`, `data` or `dim` along `dimension` while keeping the other dimensions fixed.
"""
function next_ids end
function next_ids(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol)::Vector{Int}
pos_d = dim[:pos][dimension]
offset = dim[:offset]
li = dim[:li]
ci_n = Tuple(dim[:ci][n-offset])
li[CartesianIndex(ci_n[1:pos_d-1]), CartesianIndices((ci_n[pos_d]+1:size(li,pos_d),)), CartesianIndex(ci_n[pos_d+1:end])]
end
next_ids(data::Dict{String,Any}, n::Int, dimension::Symbol) = next_ids(dim(data), n, dimension)
next_ids(pm::_PM.AbstractPowerModel, n::Int, dimension::Symbol) = next_ids(dim(pm), n, dimension)
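# Sketch of prev/next navigation (illustrative only; assumes 24 hours × 3
# scenarios, hours varying fastest, ids 1:72, offset 0):
#
#     n = 30                      # hour 6 of scenario 2
#     prev_id(dim, n, :hour)      # -> 29
#     next_id(dim, n, :hour)      # -> 31
#     prev_ids(dim, n, :hour)     # -> [25, 26, 27, 28, 29]
#     next_ids(dim, n, :hour)     # -> [31, 32, ..., 48]
#
# Note that `prev_id`/`next_id` do not bounds-check: calling them on the first or
# last network along `dimension` indexes out of range, so guard such calls with
# `is_first_id`/`is_last_id`.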
## Query properties of nw ids
"""
coord(pm::_PM.AbstractPowerModel, n::Int, dimension::Symbol)
coord(data::Dict{String,Any}, n::Int, dimension::Symbol)
coord(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol)
Return the coordinate along `dimension` of nw `n` of `pm`, `data` or `dim`.
"""
function coord end
function coord(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol)
pos = dim[:pos]
offset = dim[:offset]
ci_n = dim[:ci][n-offset]
ci_n[pos[dimension]]
end
coord(data::Dict{String,Any}, args...) = coord(dim(data), args...)
coord(pm::_PM.AbstractPowerModel, args...) = coord(dim(pm), args...)
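# Example (illustrative only; assumes 24 hours × 3 scenarios, hours varying
# fastest, ids 1:72, offset 0):
#
#     coord(dim, 30, :hour)       # -> 6
#     coord(dim, 30, :scenario)   # -> 2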
"""
is_first_id(pm::PowerModels.AbstractPowerModel, n::Int, dimension::Symbol...)
is_first_id(data::Dict{String,Any}, n::Int, dimension::Symbol...)
is_first_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol...)
Return whether the network `n` is the first along `dimension` in `pm`, `data` or `dim`.
"""
function is_first_id end
function is_first_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol...)
pos = dim[:pos]
offset = dim[:offset]
ci_n = dim[:ci][n-offset]
all(ci_n[pos[d]] == 1 for d in dimension)
end
is_first_id(data::Dict{String,Any}, args...) = is_first_id(dim(data), args...)
is_first_id(pm::_PM.AbstractPowerModel, args...) = is_first_id(dim(pm), args...)
"""
is_last_id(pm::PowerModels.AbstractPowerModel, n::Int, dimension::Symbol...)
is_last_id(data::Dict{String,Any}, n::Int, dimension::Symbol...)
is_last_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol...)
Return whether the network `n` is the last along `dimension` in `pm`, `data` or `dim`.
"""
function is_last_id(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol...)
pos = dim[:pos]
offset = dim[:offset]
li = dim[:li]
ci_n = dim[:ci][n-offset]
all(ci_n[pos[d]] == size(li,pos[d]) for d in dimension)
end
is_last_id(data::Dict{String,Any}, args...) = is_last_id(dim(data), args...)
is_last_id(pm::_PM.AbstractPowerModel, args...) = is_last_id(dim(pm), args...)
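# Example (illustrative only; assumes 24 hours × 3 scenarios, hours varying
# fastest, ids 1:72, offset 0):
#
#     is_first_id(dim, 25, :hour)             # -> true  (hour 1 of scenario 2)
#     is_last_id(dim, 48, :hour)              # -> true  (hour 24 of scenario 2)
#     is_first_id(dim, 25, :hour, :scenario)  # -> false (scenario coordinate is 2)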
## Access data relating to dimensions
"""
dim(data::Dict{String,Any})
dim(pm::PowerModels.AbstractPowerModel)
Return the `dim` data structure.
"""
function dim end
dim(data::Dict{String,Any}) = data["dim"]
dim(pm::_PM.AbstractPowerModel) = pm.ref[:it][_PM.pm_it_sym][:dim]
"""
has_dim(dim::Dict{Symbol,Any}, dimension)
has_dim(data::Dict{String,Any}, dimension)
has_dim(pm::PowerModels.AbstractPowerModel, dimension)
Return whether `dimension` is defined.
"""
function has_dim end
has_dim(dim::Dict{Symbol,Any}, dimension::Symbol) = haskey(dim[:prop], dimension)
has_dim(data::Dict{String,Any}, args...) = has_dim(dim(data), args...)
has_dim(pm::_PM.AbstractPowerModel, args...) = has_dim(dim(pm), args...)
"""
require_dim(data, dimensions...)
Verify that the specified `dimensions` are present in `data`; if not, raise an error.
"""
function require_dim(data::Dict{String,Any}, dimensions::Symbol...)
if !haskey(data, "dim")
Memento.error(_LOGGER, "Missing `dim` dict in `data`. Use `add_dimension!` to fix.")
end
for d in dimensions
if !haskey(dim(data)[:prop], d)
Memento.error(_LOGGER, "Missing dimension \"$d\" in `data`. Use `add_dimension!` to fix.")
end
end
end
"""
dim_names(dim::Dict{Symbol,Any})
dim_names(data::Dict{String,Any})
dim_names(pm::PowerModels.AbstractPowerModel)
Names of the defined dimensions, as a collection of `Symbol`s.
"""
function dim_names end
dim_names(dim::Dict{Symbol,Any}) = keys(dim[:pos])
dim_names(data::Dict{String,Any}) = dim_names(dim(data))
dim_names(pm::_PM.AbstractPowerModel) = dim_names(dim(pm))
"""
dim_prop(dim::Dict{Symbol,Any}[, dimension[, id[, key]]])
dim_prop(data::Dict{String,Any}[, dimension[, id[, key]]])
dim_prop(pm::PowerModels.AbstractPowerModel[, dimension[, id[, key]]])
Properties associated to the `id`s of `dimension`.
dim_prop(dim::Dict{Symbol,Any}, n, dimension[, key])
dim_prop(data::Dict{String,Any}, n, dimension[, key])
dim_prop(pm::PowerModels.AbstractPowerModel, n, dimension[, key])
Properties associated to `dimension` of a network `n`.
"""
function dim_prop end
dim_prop(dim::Dict{Symbol,Any}) = dim[:prop]
dim_prop(dim::Dict{Symbol,Any}, dimension::Symbol) = dim[:prop][dimension]
dim_prop(dim::Dict{Symbol,Any}, dimension::Symbol, id::Int) = dim[:prop][dimension][id]
dim_prop(dim::Dict{Symbol,Any}, dimension::Symbol, id::Int, key::String) = dim[:prop][dimension][id][key]
dim_prop(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol) = dim[:prop][dimension][coord(dim,n,dimension)]
dim_prop(dim::Dict{Symbol,Any}, n::Int, dimension::Symbol, key::String) = dim[:prop][dimension][coord(dim,n,dimension)][key]
dim_prop(data::Dict{String,Any}, args...) = dim_prop(dim(data), args...)
dim_prop(pm::_PM.AbstractPowerModel, args...) = dim_prop(dim(pm), args...)
"""
dim_meta(dim::Dict{Symbol,Any}[, dimension[, key]])
dim_meta(data::Dict{String,Any}[, dimension[, key]])
dim_meta(pm::PowerModels.AbstractPowerModel[, dimension[, key]])
Metadata associated to `dimension`.
"""
function dim_meta end
dim_meta(dim::Dict{Symbol,Any}) = dim[:meta]
dim_meta(dim::Dict{Symbol,Any}, dimension::Symbol) = dim[:meta][dimension]
dim_meta(dim::Dict{Symbol,Any}, dimension::Symbol, key::String) = dim[:meta][dimension][key]
dim_meta(data::Dict{String,Any}, args...) = dim_meta(dim(data), args...)
dim_meta(pm::_PM.AbstractPowerModel, args...) = dim_meta(dim(pm), args...)
"""
dim_length(dim::Dict{Symbol,Any}[, dimension])
dim_length(data::Dict{String,Any}[, dimension])
dim_length(pm::PowerModels.AbstractPowerModel[, dimension])
Return the number of networks or, if `dimension` is specified, return its size.
"""
function dim_length end
dim_length(dim::Dict{Symbol,Any}) = length(dim[:li])
dim_length(dim::Dict{Symbol,Any}, dimension::Symbol) = size(dim[:li], dim[:pos][dimension])
dim_length(data::Dict{String,Any}, args...) = dim_length(dim(data), args...)
dim_length(pm::_PM.AbstractPowerModel, args...) = dim_length(dim(pm), args...)
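# Example (illustrative only; assumes `dim` was built with 24 hours × 3
# scenarios):
#
#     dim_length(dim)         # -> 72
#     dim_length(dim, :hour)  # -> 24
#     has_dim(dim, :year)     # -> false
#     dim_names(dim)          # the defined dimension names, e.g. :hour and :scenario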
# Source: FlexPlan.jl (https://github.com/Electa-Git/FlexPlan.jl.git), v0.4.0, BSD-3-Clause license.

# Contains functions related to distribution networks but not specific to a particular model
## Lookup functions, to build the constraint selection logic
"Return whether the `f_bus` of branch `i` is the reference bus."
function is_frb_branch(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
return haskey(_PM.ref(pm, nw, :frb_branch), i)
end
"Return whether the `f_bus` of ne_branch `i` is the reference bus."
function is_frb_ne_branch(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
return haskey(_PM.ref(pm, nw, :frb_ne_branch), i)
end
"Return whether branch `i` is an OLTC."
function is_oltc_branch(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
return haskey(_PM.ref(pm, nw, :oltc_branch), i)
end
"Return whether ne_branch `i` is an OLTC."
function is_oltc_ne_branch(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
return haskey(_PM.ref(pm, nw, :oltc_ne_branch), i)
end
## Variables
function variable_oltc_branch_transform(pm::_PM.AbstractWModels; kwargs...)
variable_oltc_branch_transform_magnitude_sqr_inv(pm; kwargs...)
end
"variable: `0 <= ttmi[l]` for `l` in `oltc_branch`es"
function variable_oltc_branch_transform_magnitude_sqr_inv(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
ttmi = _PM.var(pm, nw)[:ttmi] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :oltc_branch)], base_name="$(nw)_ttmi",
lower_bound = 0.0,
start = 1.0 / _PM.ref(pm,nw,:oltc_branch,i,"tap")^2
)
if bounded
for (i, br) in _PM.ref(pm, nw, :oltc_branch)
JuMP.set_lower_bound(ttmi[i], 1.0 / br["tm_max"]^2 )
JuMP.set_upper_bound(ttmi[i], 1.0 / br["tm_min"]^2 )
end
end
report && _PM.sol_component_value(pm, nw, :branch, :ttmi, _PM.ids(pm, nw, :oltc_branch), ttmi)
end
function variable_oltc_ne_branch_transform(pm::_PM.AbstractWModels; kwargs...)
variable_oltc_ne_branch_transform_magnitude_sqr_inv(pm; kwargs...)
end
"variable: `0 <= ttmi_ne[l]` for `l` in `oltc_ne_branch`es"
function variable_oltc_ne_branch_transform_magnitude_sqr_inv(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
ttmi_ne = _PM.var(pm, nw)[:ttmi_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :oltc_ne_branch)], base_name="$(nw)_ttmi_ne",
lower_bound = 0.0,
start = 1.0 / _PM.ref(pm,nw,:oltc_ne_branch,i,"tap")^2
)
if bounded
for (i, br) in _PM.ref(pm, nw, :oltc_ne_branch)
JuMP.set_lower_bound(ttmi_ne[i], 1.0 / br["tm_max"]^2 )
JuMP.set_upper_bound(ttmi_ne[i], 1.0 / br["tm_min"]^2 )
end
end
report && _PM.sol_component_value(pm, nw, :ne_branch, :ttmi, _PM.ids(pm, nw, :oltc_ne_branch), ttmi_ne)
end
## Constraint templates that group several other constraint templates, provided for convenience
function constraint_dist_branch_tnep(pm::_PM.AbstractBFModel, i::Int; nw::Int=_PM.nw_id_default)
if isempty(ne_branch_ids(pm, i; nw = nw))
if is_frb_branch(pm, i; nw = nw)
if is_oltc_branch(pm, i; nw = nw)
constraint_power_losses_oltc(pm, i; nw = nw)
constraint_voltage_magnitude_difference_oltc(pm, i; nw = nw)
else
constraint_power_losses_frb(pm, i; nw = nw)
constraint_voltage_magnitude_difference_frb(pm, i; nw = nw)
end
else
_PM.constraint_power_losses(pm, i; nw = nw)
_PM.constraint_voltage_magnitude_difference(pm, i; nw = nw)
end
_PM.constraint_voltage_angle_difference(pm, i; nw = nw)
_PM.constraint_thermal_limit_from(pm, i; nw = nw)
_PM.constraint_thermal_limit_to(pm, i; nw = nw)
else
expression_branch_indicator(pm, i; nw = nw)
# constraint_branch_complementarity(pm, i; nw = nw) is best added once per year; if added here, redundant constraints would be generated
if is_frb_branch(pm, i; nw = nw)
if is_oltc_branch(pm, i; nw = nw)
constraint_power_losses_oltc_on_off(pm, i; nw = nw)
constraint_voltage_magnitude_difference_oltc_on_off(pm, i; nw = nw)
else
constraint_power_losses_frb_on_off(pm, i; nw = nw)
constraint_voltage_magnitude_difference_frb_on_off(pm, i; nw = nw)
end
else
constraint_power_losses_on_off(pm, i; nw = nw)
constraint_voltage_magnitude_difference_on_off(pm, i; nw = nw)
end
_PM.constraint_voltage_angle_difference_on_off(pm, i; nw = nw)
_PM.constraint_thermal_limit_from_on_off(pm, i; nw = nw)
_PM.constraint_thermal_limit_to_on_off(pm, i; nw = nw)
end
end
function constraint_dist_ne_branch_tnep(pm::_PM.AbstractBFModel, i::Int; nw::Int=_PM.nw_id_default)
if ne_branch_replace(pm, i, nw = nw)
if is_frb_ne_branch(pm, i, nw = nw)
if is_oltc_ne_branch(pm, i, nw = nw)
constraint_ne_power_losses_oltc(pm, i, nw = nw)
constraint_ne_voltage_magnitude_difference_oltc(pm, i, nw = nw)
else
constraint_ne_power_losses_frb(pm, i, nw = nw)
constraint_ne_voltage_magnitude_difference_frb(pm, i, nw = nw)
end
else
constraint_ne_power_losses(pm, i, nw = nw)
constraint_ne_voltage_magnitude_difference(pm, i, nw = nw)
end
_PM.constraint_ne_thermal_limit_from(pm, i, nw = nw)
_PM.constraint_ne_thermal_limit_to(pm, i, nw = nw)
else
if is_frb_ne_branch(pm, i, nw = nw)
if is_oltc_ne_branch(pm, i, nw = nw)
Memento.error(_LOGGER, "addition of a candidate OLTC in parallel to an existing OLTC is not supported")
else
constraint_ne_power_losses_frb_parallel(pm, i, nw = nw)
constraint_ne_voltage_magnitude_difference_frb_parallel(pm, i, nw = nw)
end
else
constraint_ne_power_losses_parallel(pm, i, nw = nw)
constraint_ne_voltage_magnitude_difference_parallel(pm, i, nw = nw)
end
constraint_ne_thermal_limit_from_parallel(pm, i, nw = nw)
constraint_ne_thermal_limit_to_parallel(pm, i, nw = nw)
end
_PM.constraint_ne_voltage_angle_difference(pm, i, nw = nw)
end
## Constraint templates
"Defines voltage drop over a branch whose `f_bus` is the reference bus"
function constraint_voltage_magnitude_difference_frb(pm::_PM.AbstractBFModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
b_sh_fr = branch["b_fr"]
tm = branch["tap"]
constraint_voltage_magnitude_difference_frb(pm, nw, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm)
end
"Defines voltage drop over a transformer branch that has an OLTC"
function constraint_voltage_magnitude_difference_oltc(pm::_PM.AbstractBFModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
b_sh_fr = branch["b_fr"]
constraint_voltage_magnitude_difference_oltc(pm, nw, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr)
end
"Defines branch flow model power flow equations for a branch whose `f_bus` is the reference bus"
function constraint_power_losses_frb(pm::_PM.AbstractBFModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
tm = branch["tap"]
g_sh_fr = branch["g_fr"]
g_sh_to = branch["g_to"]
b_sh_fr = branch["b_fr"]
b_sh_to = branch["b_to"]
constraint_power_losses_frb(pm, nw, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm)
end
"Defines branch flow model power flow equations for a transformer branch that has an OLTC"
function constraint_power_losses_oltc(pm::_PM.AbstractBFModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
g_sh_to = branch["g_to"]
b_sh_fr = branch["b_fr"]
b_sh_to = branch["b_to"]
constraint_power_losses_oltc(pm, nw, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to)
end
"Defines branch flow model power flow equations for a candidate (`ne_branch`) branch whose `f_bus` is the reference bus"
function constraint_ne_power_losses_frb(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :ne_branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
tm = branch["tap"]
g_sh_fr = branch["g_fr"]
g_sh_to = branch["g_to"]
b_sh_fr = branch["b_fr"]
b_sh_to = branch["b_to"]
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
constraint_ne_power_losses_frb(pm, nw, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm, vad_min, vad_max)
end
"Defines branch flow model power flow equations for a candidate (`ne_branch`) transformer branch that has an OLTC"
function constraint_ne_power_losses_oltc(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :ne_branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
g_sh_to = branch["g_to"]
b_sh_fr = branch["b_fr"]
b_sh_to = branch["b_to"]
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
constraint_ne_power_losses_oltc(pm, nw, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, vad_min, vad_max)
end
"""
Defines voltage drop over a branch, linking from and to side voltage magnitude
"""
function constraint_ne_voltage_magnitude_difference_frb(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :ne_branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
b_sh_fr = branch["b_fr"]
tm = branch["tap"]
constraint_ne_voltage_magnitude_difference_frb(pm, nw, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm)
end
"""
Defines voltage drop over a branch, linking from and to side voltage magnitude
"""
function constraint_ne_voltage_magnitude_difference_oltc(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :ne_branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
b_sh_fr = branch["b_fr"]
tm_min = branch["tm_min"]
tm_max = branch["tm_max"]
constraint_ne_voltage_magnitude_difference_oltc(pm, nw, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm_min, tm_max)
end
## Constraint implementations not limited to a specific model type
"Linearized (big-M) voltage drop over a candidate branch whose `f_bus` is the reference bus, assuming `w_fr = 1.0`"
function constraint_ne_voltage_magnitude_difference_frb(pm::_PM.AbstractBFAModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm)
branch = _PM.ref(pm, n, :ne_branch, i)
fr_bus = _PM.ref(pm, n, :bus, f_bus)
to_bus = _PM.ref(pm, n, :bus, t_bus)
M_hi = 1.0^2/tm^2 - to_bus["vmin"]^2
M_lo = -1.0^2/tm^2 + to_bus["vmax"]^2
p_fr = _PM.var(pm, n, :p_ne, f_idx)
q_fr = _PM.var(pm, n, :q_ne, f_idx)
w_to = _PM.var(pm, n, :w, t_bus)
z = _PM.var(pm, n, :branch_ne, i)
# w_fr is assumed equal to 1.0
JuMP.@constraint(pm.model, (1.0/tm^2) - w_to <= 2*(r*p_fr + x*q_fr) + M_hi*(1-z) )
JuMP.@constraint(pm.model, (1.0/tm^2) - w_to >= 2*(r*p_fr + x*q_fr) - M_lo*(1-z) )
end
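# Derivation sketch of the big-M bounds used above. For a built candidate branch
# (z = 1) the LinDistFlow voltage-drop equation, with w_fr fixed at 1.0 p.u., is
#
#     1/tm^2 - w_to = 2*(r*p_fr + x*q_fr)
#
# and both inequalities collapse to this equality. For z = 0 the branch carries
# no flow (p_fr = q_fr = 0), so the inequalities must be inactive for every
# feasible w_to in [vmin^2, vmax^2]; the tightest constants achieving that are
#
#     M_hi = 1/tm^2 - vmin^2    (worst case of the "<=" constraint)
#     M_lo = vmax^2 - 1/tm^2    (worst case of the ">=" constraint)
#
# which is exactly what is computed from `to_bus["vmin"]` and `to_bus["vmax"]`.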
"Linearized (big-M) voltage drop over a candidate OLTC branch, assuming `w_fr = 1.0`"
function constraint_ne_voltage_magnitude_difference_oltc(pm::_PM.AbstractBFAModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm_min, tm_max)
branch = _PM.ref(pm, n, :ne_branch, i)
fr_bus = _PM.ref(pm, n, :bus, f_bus)
to_bus = _PM.ref(pm, n, :bus, t_bus)
M_hi = 1.0^2/tm_min^2 - to_bus["vmin"]^2
M_lo = -1.0^2/tm_max^2 + to_bus["vmax"]^2
p_fr = _PM.var(pm, n, :p_ne, f_idx)
q_fr = _PM.var(pm, n, :q_ne, f_idx)
ttmi = _PM.var(pm, n, :ttmi_ne, i)
w_to = _PM.var(pm, n, :w, t_bus)
z = _PM.var(pm, n, :branch_ne, i)
# w_fr is assumed equal to 1.0 to preserve the linearity of the model
JuMP.@constraint(pm.model, 1.0*ttmi - w_to <= 2*(r*p_fr + x*q_fr) + M_hi*(1-z) )
JuMP.@constraint(pm.model, 1.0*ttmi - w_to >= 2*(r*p_fr + x*q_fr) - M_lo*(1-z) )
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 23203 | # Variables and constraints related to flexible loads
## Variables
function variable_flexible_demand(pm::_PM.AbstractPowerModel; investment::Bool=true, kwargs...)
variable_total_flex_demand(pm; kwargs...)
variable_demand_reduction(pm; kwargs...)
variable_demand_shifting_upwards(pm; kwargs...)
variable_demand_shifting_downwards(pm; kwargs...)
variable_demand_curtailment(pm; kwargs...)
variable_flexible_demand_indicator(pm; kwargs..., relax=true)
investment && variable_flexible_demand_investment(pm; kwargs...)
end
"Variable: whether flexible demand is enabled at a flex load point"
function variable_flexible_demand_indicator(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, relax::Bool=false, report::Bool=true)
first_n = first_id(pm, nw, :hour, :scenario)
if nw == first_n
if !relax
z = _PM.var(pm, nw)[:z_flex] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :flex_load)], base_name="$(nw)_z_flex",
binary = true,
start = 0
)
else
z = _PM.var(pm, nw)[:z_flex] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :flex_load)], base_name="$(nw)_z_flex",
lower_bound = 0,
upper_bound = 1,
start = 0
)
end
else
z = _PM.var(pm, nw)[:z_flex] = _PM.var(pm, first_n)[:z_flex]
end
if report
_PM.sol_component_value(pm, nw, :load, :flex, _PM.ids(pm, nw, :flex_load), z)
_PM.sol_component_fixed(pm, nw, :load, :flex, _PM.ids(pm, nw, :fixed_load), 0.0)
end
end
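# Variable-sharing note (illustrative): `z_flex` is an investment-state variable,
# so a single JuMP variable per load is created only at the first network along
# the :hour and :scenario dimensions, and every other network of the same
# investment period aliases it. E.g., assuming 24 hours × 3 scenarios with ids
# 1:72 and a single investment period, networks 2:72 all reuse the variable
# created for network 1.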
"Variable: investment decision to enable flexible demand at a flex load point"
function variable_flexible_demand_investment(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, relax::Bool=false, report::Bool=true)
first_n = first_id(pm, nw, :hour, :scenario)
if nw == first_n
if !relax
investment = _PM.var(pm, nw)[:z_flex_investment] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :flex_load)], base_name="$(nw)_z_flex_investment",
binary = true,
start = 0
)
else
investment = _PM.var(pm, nw)[:z_flex_investment] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :flex_load)], base_name="$(nw)_z_flex_investment",
lower_bound = 0,
upper_bound = 1,
start = 0
)
end
else
investment = _PM.var(pm, nw)[:z_flex_investment] = _PM.var(pm, first_n)[:z_flex_investment]
end
if report
_PM.sol_component_value(pm, nw, :load, :investment, _PM.ids(pm, nw, :flex_load), investment)
_PM.sol_component_fixed(pm, nw, :load, :investment, _PM.ids(pm, nw, :fixed_load), 0.0)
end
end
function variable_total_flex_demand(pm::_PM.AbstractPowerModel; kwargs...)
variable_total_flex_demand_active(pm; kwargs...)
variable_total_flex_demand_reactive(pm; kwargs...)
end
"Variable for the actual (flexible) real load demand at each load point and each time step"
function variable_total_flex_demand_active(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
pflex = _PM.var(pm, nw)[:pflex] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :load)], base_name="$(nw)_pflex",
lower_bound = 0,
upper_bound = _PM.ref(pm, nw, :load, i, "pd") * (1 + get(_PM.ref(pm, nw, :load, i), "pshift_up_rel_max", 0.0)), # Not strictly necessary: redundant due to other bounds
start = _PM.comp_start_value(_PM.ref(pm, nw, :load, i), "pd")
)
report && _PM.sol_component_value(pm, nw, :load, :pflex, _PM.ids(pm, nw, :load), pflex)
end
function variable_total_flex_demand_reactive(pm::_PM.AbstractActivePowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
end
"Variable for the actual (flexible) reactive load demand at each load point and each time step"
function variable_total_flex_demand_reactive(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
qflex = _PM.var(pm, nw)[:qflex] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :load)], base_name="$(nw)_qflex",
start = _PM.comp_start_value(_PM.ref(pm, nw, :load, i), "qd")
)
report && _PM.sol_component_value(pm, nw, :load, :qflex, _PM.ids(pm, nw, :load), qflex)
end
"Variable for load curtailment (i.e. involuntary demand reduction) at each load point and each time step"
function variable_demand_curtailment(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
pcurt = _PM.var(pm, nw)[:pcurt] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :load)], base_name="$(nw)_pcurt",
lower_bound = 0,
upper_bound = _PM.ref(pm, nw, :load, i, "pd"),
start = 0
)
report && _PM.sol_component_value(pm, nw, :load, :pcurt, _PM.ids(pm, nw, :load), pcurt)
end
"Variable for the power not consumed (voluntary load reduction) at each flex load point and each time step"
function variable_demand_reduction(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
# This is bounded for each time step by a fixed share (0 ≤ pred_rel_max ≤ 1) of the
# reference load demand pd for that time step. (Thus, while pred_rel_max is a scalar
# input parameter, the variable bounds become a time series.)
pred = _PM.var(pm, nw)[:pred] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :flex_load)], base_name="$(nw)_pred",
lower_bound = 0,
upper_bound = _PM.ref(pm, nw, :load, i, "pd") * _PM.ref(pm, nw, :flex_load, i, "pred_rel_max"),
start = 0
)
if report
_PM.sol_component_value(pm, nw, :load, :pred, _PM.ids(pm, nw, :flex_load), pred)
_PM.sol_component_fixed(pm, nw, :load, :pred, _PM.ids(pm, nw, :fixed_load), 0.0)
end
end
"Variable for keeping track of the energy not consumed (i.e. the accumulated voluntary load reduction) over the operational planning horizon at each flex load point"
function variable_energy_not_consumed(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
first_nw = first_id(pm, nw, :hour)
ered = _PM.var(pm, nw)[:ered] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :flex_load)], base_name="$(nw)_ered",
lower_bound = 0,
upper_bound = _PM.ref(pm, nw, :flex_load, i, "ered_rel_max") * _PM.ref(pm, first_nw, :flex_load, i, "ed"),
start = 0
)
if report
_PM.sol_component_value(pm, nw, :load, :ered, _PM.ids(pm, nw, :flex_load), ered)
_PM.sol_component_fixed(pm, nw, :load, :ered, _PM.ids(pm, nw, :fixed_load), 0.0)
end
end
"Variable for the upward demand shifting at each flex load point and each time step"
function variable_demand_shifting_upwards(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
# This is bounded for each time step by a fixed share (0 ≤ pshift_up_rel_max ≤ 1) of the
# reference load demand pd for that time step. (Thus, while pshift_up_rel_max is a
# scalar input parameter, the variable bounds become a time series.)
pshift_up = _PM.var(pm, nw)[:pshift_up] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :flex_load)], base_name="$(nw)_pshift_up",
lower_bound = 0,
upper_bound = _PM.ref(pm, nw, :load, i, "pd") * _PM.ref(pm, nw, :flex_load, i, "pshift_up_rel_max"),
start = 0
)
if report
_PM.sol_component_value(pm, nw, :load, :pshift_up, _PM.ids(pm, nw, :flex_load), pshift_up)
_PM.sol_component_fixed(pm, nw, :load, :pshift_up, _PM.ids(pm, nw, :fixed_load), 0.0)
end
end
"Variable for keeping track of the accumulated upward demand shifting over the operational planning horizon at each flex load point"
function variable_total_demand_shifting_upwards(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
first_nw = first_id(pm, nw, :hour)
eshift_up = _PM.var(pm, nw)[:eshift_up] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :flex_load)], base_name="$(nw)_eshift_up",
lower_bound = 0,
# The accumulated load shifted up should equal the accumulated load shifted down, so this constraint is probably redundant
upper_bound = _PM.ref(pm, nw, :flex_load, i, "eshift_rel_max") * _PM.ref(pm, first_nw, :flex_load, i, "ed"),
start = 0
)
if report
_PM.sol_component_value(pm, nw, :load, :eshift_up, _PM.ids(pm, nw, :flex_load), eshift_up)
_PM.sol_component_fixed(pm, nw, :load, :eshift_up, _PM.ids(pm, nw, :fixed_load), 0.0)
end
end
"Variable for the downward demand shifting at each flex load point and each time step"
function variable_demand_shifting_downwards(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
# This is bounded for each time step by a fixed share (0 ≤ pshift_down_rel_max ≤ 1) of
# the reference load demand pd for that time step. (Thus, while pshift_down_rel_max is a
# scalar input parameter, the variable bounds become a time series.)
pshift_down = _PM.var(pm, nw)[:pshift_down] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :flex_load)], base_name="$(nw)_pshift_down",
lower_bound = 0,
upper_bound = _PM.ref(pm, nw, :load, i, "pd") * _PM.ref(pm, nw, :flex_load, i, "pshift_down_rel_max"),
start = 0
)
if report
_PM.sol_component_value(pm, nw, :load, :pshift_down, _PM.ids(pm, nw, :flex_load), pshift_down)
_PM.sol_component_fixed(pm, nw, :load, :pshift_down, _PM.ids(pm, nw, :fixed_load), 0.0)
end
end
"Variable for keeping track of the accumulated downward demand shifting over the operational planning horizon at each flex load point"
function variable_total_demand_shifting_downwards(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
first_nw = first_id(pm, nw, :hour)
eshift_down = _PM.var(pm, nw)[:eshift_down] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :flex_load)], base_name="$(nw)_eshift_down",
lower_bound = 0,
upper_bound = _PM.ref(pm, nw, :flex_load, i, "eshift_rel_max") * _PM.ref(pm, first_nw, :flex_load, i, "ed"),
start = 0
)
if report
_PM.sol_component_value(pm, nw, :load, :eshift_down, _PM.ids(pm, nw, :flex_load), eshift_down)
_PM.sol_component_fixed(pm, nw, :load, :eshift_down, _PM.ids(pm, nw, :fixed_load), 0.0)
end
end
## Constraint templates
function constraint_flexible_demand_activation(pm::_PM.AbstractPowerModel, i::Int, prev_nws::Vector{Int}, nw::Int)
investment_horizon = [nw]
lifetime = _PM.ref(pm, nw, :load, i, "lifetime")
for n in Iterators.reverse(prev_nws[max(end-lifetime+2,1):end])
i in _PM.ids(pm, n, :load) ? push!(investment_horizon, n) : break
end
constraint_flexible_demand_activation(pm, nw, i, investment_horizon)
end
function constraint_flex_bounds_ne(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
load = _PM.ref(pm, nw, :load, i)
constraint_flex_bounds_ne(pm, nw, i, load["pd"], load["pshift_up_rel_max"], load["pshift_down_rel_max"], load["pred_rel_max"])
end
function constraint_total_flexible_demand(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
load = _PM.ref(pm, nw, :load, i)
pd = load["pd"]
pf_angle = get(load, "pf_angle", 0.0) # Power factor angle, in radians
constraint_total_flexible_demand(pm, nw, i, pd, pf_angle)
end
function constraint_total_fixed_demand(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
load = _PM.ref(pm, nw, :load, i)
pd = load["pd"]
pf_angle = get(load, "pf_angle", 0.0) # Power factor angle, in radians
constraint_total_fixed_demand(pm, nw, i, pd, pf_angle)
end
function constraint_red_state(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
if haskey(_PM.ref(pm, nw), :time_elapsed)
time_elapsed = _PM.ref(pm, nw, :time_elapsed)
else
Memento.warn(_LOGGER, "network data should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
constraint_red_state_initial(pm, nw, i, time_elapsed)
end
function constraint_red_state(pm::_PM.AbstractPowerModel, i::Int, nw_1::Int, nw_2::Int)
if haskey(_PM.ref(pm, nw_2), :time_elapsed)
time_elapsed = _PM.ref(pm, nw_2, :time_elapsed)
else
Memento.warn(_LOGGER, "network $(nw_2) should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
constraint_red_state(pm, nw_1, nw_2, i, time_elapsed)
end
function constraint_shift_duration(pm::_PM.AbstractPowerModel, i::Int, first_hour_nw::Int, nw::Int)
constraint_shift_duration_up(pm, i, first_hour_nw, nw)
constraint_shift_duration_down(pm, i, first_hour_nw, nw)
end
function constraint_shift_duration_up(pm::_PM.AbstractPowerModel, i::Int, first_hour_nw::Int, nw::Int)
load = _PM.ref(pm, nw, :load, i)
start_period = max(nw-load["tshift_up"], first_hour_nw)
constraint_shift_duration_up(pm, nw, i, load["pd"], load["pshift_up_rel_max"], start_period)
end
function constraint_shift_duration_down(pm::_PM.AbstractPowerModel, i::Int, first_hour_nw::Int, nw::Int)
load = _PM.ref(pm, nw, :load, i)
start_period = max(nw-load["tshift_down"], first_hour_nw)
constraint_shift_duration_down(pm, nw, i, load["pd"], load["pshift_down_rel_max"], start_period)
end
function constraint_shift_up_state(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
if haskey(_PM.ref(pm, nw), :time_elapsed)
time_elapsed = _PM.ref(pm, nw, :time_elapsed)
else
Memento.warn(_LOGGER, "network data should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
constraint_shift_up_state_initial(pm, nw, i, time_elapsed)
end
function constraint_shift_up_state(pm::_PM.AbstractPowerModel, i::Int, nw_1::Int, nw_2::Int)
if haskey(_PM.ref(pm, nw_2), :time_elapsed)
time_elapsed = _PM.ref(pm, nw_2, :time_elapsed)
else
Memento.warn(_LOGGER, "network $(nw_2) should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
constraint_shift_up_state(pm, nw_1, nw_2, i, time_elapsed)
end
function constraint_shift_down_state(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
if haskey(_PM.ref(pm, nw), :time_elapsed)
time_elapsed = _PM.ref(pm, nw, :time_elapsed)
else
Memento.warn(_LOGGER, "network data should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
constraint_shift_down_state_initial(pm, nw, i, time_elapsed)
end
function constraint_shift_down_state(pm::_PM.AbstractPowerModel, i::Int, nw_1::Int, nw_2::Int)
if haskey(_PM.ref(pm, nw_2), :time_elapsed)
time_elapsed = _PM.ref(pm, nw_2, :time_elapsed)
else
Memento.warn(_LOGGER, "network $(nw_2) should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
constraint_shift_down_state(pm, nw_1, nw_2, i, time_elapsed)
end
function constraint_shift_state_final(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
constraint_shift_state_final(pm, nw, i)
end
# This way of enforcing a balance between power shifted upward and power shifted downward:
# - does not use `eshift_up` and `eshift_down` variables;
# - is an alternative to `constraint_shift_up_state`, `constraint_shift_down_state`, and
#   `constraint_shift_state_final`.
# It must be called only on last-hour nws.
function constraint_shift_balance_periodic(pm::_PM.AbstractPowerModel, i::Int, period::Int; nw::Int=_PM.nw_id_default)
timeseries_nw_ids = similar_ids(pm, nw, hour = 1:dim_length(pm,:hour))
time_elapsed = Int(_PM.ref(pm, nw, :time_elapsed))
if period % time_elapsed ≠ 0
Memento.error(_LOGGER, "\"period\" must be a multiple of \"time_elapsed\".")
end
for horizon in Iterators.partition(timeseries_nw_ids, period÷time_elapsed)
constraint_shift_balance_periodic(pm, horizon, i)
end
end
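# Illustration (hypothetical sizes, not taken from any test case): with 24 hourly
# networks, `time_elapsed == 1.0` and `period == 12`, the partition above yields
# the horizons 1:12 and 13:24, and the balance constraint is enforced separately
# on each:
#
#     collect(Iterators.partition(1:24, 12))   # two blocks: 1:12 and 13:24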
## Constraint implementations
function constraint_flexible_demand_activation(pm::_PM.AbstractPowerModel, n::Int, i::Int, horizon::Vector{Int})
indicator = _PM.var(pm, n, :z_flex, i)
investments = _PM.var.(Ref(pm), horizon, :z_flex_investment, i)
# Activate the flexibility depending on the investment decisions in the load's horizon.
JuMP.@constraint(pm.model, indicator == sum(investments))
end
function constraint_flex_bounds_ne(pm::_PM.AbstractPowerModel, n::Int, i::Int, pd, pshift_up_rel_max, pshift_down_rel_max, pred_rel_max)
pshift_up = _PM.var(pm, n, :pshift_up, i)
pshift_down = _PM.var(pm, n, :pshift_down, i)
pred = _PM.var(pm, n, :pred, i)
z = _PM.var(pm, n, :z_flex, i)
# Bounds on the demand flexibility decision variables (demand shifting and voluntary load reduction)
JuMP.@constraint(pm.model, pshift_up <= pshift_up_rel_max * pd * z)
JuMP.@constraint(pm.model, pshift_down <= pshift_down_rel_max * pd * z)
JuMP.@constraint(pm.model, pred <= pred_rel_max * pd * z)
end
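# On/off logic sketch (reasoning, not original code): `pshift_up`, `pshift_down`
# and `pred` are nonnegative variables, so `z == 0` forces all three to zero,
# while `z == 1` bounds each by the corresponding fraction of the reference
# demand `pd`.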
function constraint_total_flexible_demand(pm::_PM.AbstractPowerModel, n::Int, i, pd, pf_angle)
pflex = _PM.var(pm, n, :pflex, i)
qflex = _PM.var(pm, n, :qflex, i)
pcurt = _PM.var(pm, n, :pcurt, i)
pred = _PM.var(pm, n, :pred, i)
pshift_up = _PM.var(pm, n, :pshift_up, i)
pshift_down = _PM.var(pm, n, :pshift_down, i)
# Active power demand is the reference demand `pd` plus the contributions from all the demand flexibility decision variables
JuMP.@constraint(pm.model, pflex == pd - pcurt - pred + pshift_up - pshift_down)
# Reactive power demand is given by the active power demand and the power factor angle of the load
JuMP.@constraint(pm.model, qflex == tan(pf_angle) * pflex)
end
function constraint_total_flexible_demand(pm::_PM.AbstractActivePowerModel, n::Int, i, pd, pf_angle)
pflex = _PM.var(pm, n, :pflex, i)
pcurt = _PM.var(pm, n, :pcurt, i)
pred = _PM.var(pm, n, :pred, i)
pshift_up = _PM.var(pm, n, :pshift_up, i)
pshift_down = _PM.var(pm, n, :pshift_down, i)
# Active power demand is the reference demand `pd` plus the contributions from all the demand flexibility decision variables
JuMP.@constraint(pm.model, pflex == pd - pcurt - pred + pshift_up - pshift_down)
end
function constraint_total_fixed_demand(pm::_PM.AbstractPowerModel, n::Int, i, pd, pf_angle)
pflex = _PM.var(pm, n, :pflex, i)
qflex = _PM.var(pm, n, :qflex, i)
pcurt = _PM.var(pm, n, :pcurt, i)
# Active power demand is the difference between reference demand `pd` and involuntary curtailment
JuMP.@constraint(pm.model, pflex == pd - pcurt)
# Reactive power demand is given by the active power demand and the power factor angle of the load
JuMP.@constraint(pm.model, qflex == tan(pf_angle) * pflex)
end
function constraint_total_fixed_demand(pm::_PM.AbstractActivePowerModel, n::Int, i, pd, pf_angle)
pflex = _PM.var(pm, n, :pflex, i)
pcurt = _PM.var(pm, n, :pcurt, i)
# Active power demand is the difference between reference demand `pd` and involuntary curtailment
JuMP.@constraint(pm.model, pflex == pd - pcurt)
end
function constraint_red_state_initial(pm::_PM.AbstractPowerModel, n::Int, i::Int, time_elapsed)
pred = _PM.var(pm, n, :pred, i)
ered = _PM.var(pm, n, :ered, i)
# Initialization of not consumed energy variable (accumulated voluntary load reduction)
JuMP.@constraint(pm.model, ered == time_elapsed * pred)
end
function constraint_red_state(pm::_PM.AbstractPowerModel, n_1::Int, n_2::Int, i::Int, time_elapsed)
pred = _PM.var(pm, n_2, :pred, i)
ered_2 = _PM.var(pm, n_2, :ered, i)
ered_1 = _PM.var(pm, n_1, :ered, i)
# Accumulation of not consumed energy (accumulation of voluntary load reduction for each time step)
JuMP.@constraint(pm.model, ered_2 - ered_1 == time_elapsed * pred)
end
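# Sketch of the accumulation (reasoning, not original code): summing this
# difference constraint over consecutive network pairs telescopes, so for the
# last network N one obtains
#     ered_N == sum(time_elapsed_n * pred_n for n in 1:N),
# with the n == 1 term supplied by `constraint_red_state_initial` above.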
function constraint_shift_duration_up(pm::_PM.AbstractPowerModel, n::Int, i::Int, pd, pshift_up_rel_max, start_period::Int)
# Apply an upper bound to the demand shifted upward during the recovery period
JuMP.@constraint(pm.model, sum(_PM.var(pm, t, :pshift_up, i) for t in start_period:n) <= pshift_up_rel_max * pd)
end
function constraint_shift_duration_down(pm::_PM.AbstractPowerModel, n::Int, i::Int, pd, pshift_down_rel_max, start_period::Int)
# Apply an upper bound to the demand shifted downward during the recovery period
JuMP.@constraint(pm.model, sum(_PM.var(pm, t, :pshift_down, i) for t in start_period:n) <= pshift_down_rel_max * pd)
end
function constraint_shift_up_state_initial(pm::_PM.AbstractPowerModel, n::Int, i::Int, time_elapsed)
pshift_up = _PM.var(pm, n, :pshift_up, i)
eshift_up = _PM.var(pm, n, :eshift_up, i)
# Initialization of accumulated upward demand shifting variable
JuMP.@constraint(pm.model, eshift_up == time_elapsed * pshift_up)
end
function constraint_shift_up_state(pm::_PM.AbstractPowerModel, n_1::Int, n_2::Int, i::Int, time_elapsed)
pshift_up = _PM.var(pm, n_2, :pshift_up, i)
eshift_up_2 = _PM.var(pm, n_2, :eshift_up, i)
eshift_up_1 = _PM.var(pm, n_1, :eshift_up, i)
# Accumulation of upward demand shifting for each time step
JuMP.@constraint(pm.model, eshift_up_2 - eshift_up_1 == time_elapsed * pshift_up)
end
function constraint_shift_down_state_initial(pm::_PM.AbstractPowerModel, n::Int, i::Int, time_elapsed)
pshift_down = _PM.var(pm, n, :pshift_down, i)
eshift_down = _PM.var(pm, n, :eshift_down, i)
# Initialization of accumulated downward demand shifting variable
JuMP.@constraint(pm.model, eshift_down == time_elapsed * pshift_down)
end
function constraint_shift_down_state(pm::_PM.AbstractPowerModel, n_1::Int, n_2::Int, i::Int, time_elapsed)
pshift_down = _PM.var(pm, n_2, :pshift_down, i)
eshift_down_2 = _PM.var(pm, n_2, :eshift_down, i)
eshift_down_1 = _PM.var(pm, n_1, :eshift_down, i)
# Accumulation of downward demand shifting for each time step
JuMP.@constraint(pm.model, eshift_down_2 - eshift_down_1 == time_elapsed * pshift_down)
end
function constraint_shift_state_final(pm::_PM.AbstractPowerModel, n::Int, i::Int)
eshift_up = _PM.var(pm, n, :eshift_up, i)
eshift_down = _PM.var(pm, n, :eshift_down, i)
# The accumulated upward demand shifting over the operational planning horizon should equal the accumulated downward
# demand shifting (since this is demand shifted and not reduced or curtailed)
JuMP.@constraint(pm.model, eshift_up == eshift_down)
end
function constraint_shift_balance_periodic(pm::_PM.AbstractPowerModel, horizon::AbstractVector{Int}, i::Int)
pshift_up = _PM.var.(Ref(pm), horizon, :pshift_up, i)
pshift_down = _PM.var.(Ref(pm), horizon, :pshift_down, i)
JuMP.@constraint(pm.model, sum(pshift_up) == sum(pshift_down))
end
# FlexPlan.jl (https://github.com/Electa-Git/FlexPlan.jl.git), BSD-3-Clause, v0.4.0

# Dispatchable and non-dispatchable generators
## Expressions
"Curtailed power of a non-dispatchable generator as the difference between its reference power and the generated power."
function expression_gen_curtailment(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, report::Bool=true)
pgcurt = _PM.var(pm, nw)[:pgcurt] = Dict{Int,Any}(
i => ndgen["pmax"] - _PM.var(pm,nw,:pg,i) for (i,ndgen) in _PM.ref(pm,nw,:ndgen)
)
if report
_PM.sol_component_fixed(pm, nw, :gen, :pgcurt, _PM.ids(pm, nw, :dgen), 0.0)
_PM.sol_component_value(pm, nw, :gen, :pgcurt, _PM.ids(pm, nw, :ndgen), pgcurt)
end
end
## Expressions ##
### Expression templates ###
"Defines branch indicator as a function of corresponding ne_branch indicator variables."
function expression_branch_indicator(pm::_PM.AbstractPowerModel, br_idx::Int; nw::Int=_PM.nw_id_default)
if !haskey(_PM.var(pm, nw), :z_branch)
_PM.var(pm, nw)[:z_branch] = Dict{Int,Any}()
end
branch = _PM.ref(pm, nw, :branch, br_idx)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
expression_branch_indicator(pm, nw, br_idx, f_bus, t_bus)
end
### Actual expressions ###
function expression_branch_indicator(pm::_PM.AbstractPowerModel, n::Int, br_idx, f_bus, t_bus)
branch_ne_sum = sum(_PM.var(pm, n, :branch_ne, l) for l in _PM.ref(pm, n, :ne_buspairs, (f_bus,t_bus), "branches"))
_PM.var(pm, n, :z_branch)[br_idx] = 1 - branch_ne_sum
end
## Constraints ##
### Constraint templates ###
""
function constraint_ohms_yt_from_repl(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
g, b = _PM.calc_branch_y(branch)
tr, ti = _PM.calc_branch_t(branch)
g_fr = branch["g_fr"]
b_fr = branch["b_fr"]
tm = branch["tap"]
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
# track if a certain candidate branch is replacing a line
replace, ne_br_idx = replace_branch(pm, nw, f_bus, t_bus)
# If the line is to be replaced, use the formulations below; otherwise use the PowerModels constraint for existing branches
if replace == 0
_PM.constraint_ohms_yt_from(pm, nw, f_bus, t_bus, f_idx, t_idx, g, b, g_fr, b_fr, tr, ti, tm)
else
constraint_ohms_yt_from_repl(pm, nw, ne_br_idx, f_bus, t_bus, f_idx, b, vad_min, vad_max)
end
end
""
function constraint_ohms_yt_to_repl(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
g, b = _PM.calc_branch_y(branch)
tr, ti = _PM.calc_branch_t(branch)
g_to = branch["g_to"]
b_to = branch["b_to"]
tm = branch["tap"]
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
# track if a certain candidate branch is replacing a line
replace, ne_br_idx = replace_branch(pm, nw, f_bus, t_bus)
# If the line is to be replaced, use the formulations below; otherwise use the PowerModels constraint for existing branches
if replace == 0
_PM.constraint_ohms_yt_to(pm, nw, f_bus, t_bus, f_idx, t_idx, g, b, g_to, b_to, tr, ti, tm)
else
constraint_ohms_yt_to_repl(pm, nw, ne_br_idx, f_bus, t_bus, t_idx, b, vad_min, vad_max)
end
end
function constraint_voltage_angle_difference_repl(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, branch["f_bus"], branch["t_bus"])
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
# track if a certain candidate branch is replacing a line
replace, ne_br_idx = replace_branch(pm, nw, f_bus, t_bus)
# If the line is to be replaced, use the formulations below; otherwise use the PowerModels constraint for existing branches
if replace == 0
_PM.constraint_voltage_angle_difference(pm, nw, f_idx, branch["angmin"], branch["angmax"])
else
constraint_voltage_angle_difference_repl(pm, nw, ne_br_idx, f_idx, branch["angmin"], branch["angmax"], vad_min, vad_max)
end
end
function constraint_thermal_limit_from_repl(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
if !haskey(branch, "rate_a")
Memento.error(_LOGGER, "constraint_thermal_limit_from_ne requires a rate_a value on all branches, calc_thermal_limits! can be used to generate reasonable values")
end
# track if a certain candidate branch is replacing a line
replace, ne_br_idx = replace_branch(pm, nw, f_bus, t_bus)
# If the line is to be replaced, use the formulations below; otherwise use the PowerModels constraint for existing branches
if replace == 0
_PM.constraint_thermal_limit_from(pm, nw, f_idx, branch["rate_a"])
else
constraint_thermal_limit_from_repl(pm, nw, ne_br_idx, f_idx, branch["rate_a"])
end
end
""
function constraint_thermal_limit_to_repl(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
t_idx = (i, t_bus, f_bus)
if !haskey(branch, "rate_a")
Memento.error(_LOGGER, "constraint_thermal_limit_to_ne requires a rate_a value on all branches, calc_thermal_limits! can be used to generate reasonable values")
end
# track if a certain candidate branch is replacing a line
replace, ne_br_idx = replace_branch(pm, nw, f_bus, t_bus)
# If lines is to be repalced use formulations below, else use PowerModels constraint for existing branches
if replace == 0
_PM.constraint_thermal_limit_to(pm, nw, t_idx, branch["rate_a"])
else
constraint_thermal_limit_to_repl(pm, nw, ne_br_idx, t_idx, branch["rate_a"])
end
end
#### Constraint templates used in radial networks ####
"States that at most one of the ne_branches sharing the same bus pair must be built."
function constraint_branch_complementarity(pm::_PM.AbstractPowerModel, br_idx::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, br_idx)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
constraint_branch_complementarity(pm, nw, br_idx, f_bus, t_bus)
end
""
function constraint_power_losses_on_off(pm::_PM.AbstractPowerModel, br_idx::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, br_idx)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (br_idx, f_bus, t_bus)
t_idx = (br_idx, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
tm = branch["tap"]
g_sh_fr = branch["g_fr"]
g_sh_to = branch["g_to"]
b_sh_fr = branch["b_fr"]
b_sh_to = branch["b_to"]
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
constraint_power_losses_on_off(pm, nw, br_idx, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm, vad_min, vad_max)
end
""
function constraint_power_losses_frb_on_off(pm::_PM.AbstractPowerModel, br_idx::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, br_idx)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (br_idx, f_bus, t_bus)
t_idx = (br_idx, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
tm = branch["tap"]
g_sh_fr = branch["g_fr"]
g_sh_to = branch["g_to"]
b_sh_fr = branch["b_fr"]
b_sh_to = branch["b_to"]
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
constraint_power_losses_frb_on_off(pm, nw, br_idx, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm, vad_min, vad_max)
end
""
function constraint_power_losses_oltc_on_off(pm::_PM.AbstractBFModel, br_idx::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, br_idx)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (br_idx, f_bus, t_bus)
t_idx = (br_idx, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
g_sh_to = branch["g_to"]
b_sh_fr = branch["b_fr"]
b_sh_to = branch["b_to"]
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
constraint_power_losses_oltc_on_off(pm, nw, br_idx, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, vad_min, vad_max)
end
""
function constraint_ne_power_losses_parallel(pm::_PM.AbstractPowerModel, ne_br_idx::Int; nw::Int=_PM.nw_id_default)
ne_branch = _PM.ref(pm, nw, :ne_branch, ne_br_idx)
f_bus = ne_branch["f_bus"]
t_bus = ne_branch["t_bus"]
f_idx = (ne_br_idx, f_bus, t_bus)
t_idx = (ne_br_idx, t_bus, f_bus)
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
ne_r = ne_branch["br_r"]
ne_x = ne_branch["br_x"]
ne_tm = ne_branch["tap"]
ne_g_sh_fr = ne_branch["g_fr"]
ne_g_sh_to = ne_branch["g_to"]
ne_b_sh_fr = ne_branch["b_fr"]
ne_b_sh_to = ne_branch["b_to"]
br_idx = branch_idx(pm, nw, f_bus, t_bus)
branch = _PM.ref(pm, nw, :branch, br_idx)
r = branch["br_r"]
x = branch["br_x"]
tm = branch["tap"]
g_sh_fr = branch["g_fr"]
g_sh_to = branch["g_to"]
b_sh_fr = branch["b_fr"]
b_sh_to = branch["b_to"]
if ne_tm != tm
Memento.error(_LOGGER, "ne_branch $(ne_br_idx) cannot be built in parallel to branch $(br_idx) because has a different tap ratio")
end
constraint_ne_power_losses_parallel(pm, nw, br_idx, ne_br_idx, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, ne_r, ne_x, ne_g_sh_fr, ne_g_sh_to, ne_b_sh_fr, ne_b_sh_to, tm, vad_min, vad_max)
end
""
function constraint_ne_power_losses_frb_parallel(pm::_PM.AbstractPowerModel, ne_br_idx::Int; nw::Int=_PM.nw_id_default)
ne_branch = _PM.ref(pm, nw, :ne_branch, ne_br_idx)
f_bus = ne_branch["f_bus"]
t_bus = ne_branch["t_bus"]
f_idx = (ne_br_idx, f_bus, t_bus)
t_idx = (ne_br_idx, t_bus, f_bus)
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
ne_r = ne_branch["br_r"]
ne_x = ne_branch["br_x"]
ne_tm = ne_branch["tap"]
ne_g_sh_fr = ne_branch["g_fr"]
ne_g_sh_to = ne_branch["g_to"]
ne_b_sh_fr = ne_branch["b_fr"]
ne_b_sh_to = ne_branch["b_to"]
br_idx = branch_idx(pm, nw, f_bus, t_bus)
branch = _PM.ref(pm, nw, :branch, br_idx)
r = branch["br_r"]
x = branch["br_x"]
tm = branch["tap"]
g_sh_fr = branch["g_fr"]
g_sh_to = branch["g_to"]
b_sh_fr = branch["b_fr"]
b_sh_to = branch["b_to"]
if ne_tm != tm
Memento.error(_LOGGER, "ne_branch $(ne_br_idx) cannot be built in parallel to branch $(br_idx) because has a different tap ratio")
end
constraint_ne_power_losses_frb_parallel(pm, nw, br_idx, ne_br_idx, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, ne_r, ne_x, ne_g_sh_fr, ne_g_sh_to, ne_b_sh_fr, ne_b_sh_to, tm, vad_min, vad_max)
end
""
function constraint_voltage_magnitude_difference_on_off(pm::_PM.AbstractPowerModel, br_idx::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, br_idx)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (br_idx, f_bus, t_bus)
t_idx = (br_idx, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
b_sh_fr = branch["b_fr"]
tm = branch["tap"]
constraint_voltage_magnitude_difference_on_off(pm, nw, br_idx, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm)
end
""
function constraint_voltage_magnitude_difference_frb_on_off(pm::_PM.AbstractPowerModel, br_idx::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, br_idx)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (br_idx, f_bus, t_bus)
t_idx = (br_idx, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
b_sh_fr = branch["b_fr"]
tm = branch["tap"]
constraint_voltage_magnitude_difference_frb_on_off(pm, nw, br_idx, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm)
end
""
function constraint_voltage_magnitude_difference_oltc_on_off(pm::_PM.AbstractBFModel, br_idx::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, br_idx)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (br_idx, f_bus, t_bus)
t_idx = (br_idx, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
b_sh_fr = branch["b_fr"]
tm_min = branch["tm_min"]
tm_max = branch["tm_max"]
constraint_voltage_magnitude_difference_oltc_on_off(pm, nw, br_idx, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm_min, tm_max)
end
""
function constraint_ne_voltage_magnitude_difference_parallel(pm::_PM.AbstractPowerModel, ne_br_idx::Int; nw::Int =_PM.nw_id_default)
ne_branch = _PM.ref(pm, nw, :ne_branch, ne_br_idx)
f_bus = ne_branch["f_bus"]
t_bus = ne_branch["t_bus"]
f_idx = (ne_br_idx, f_bus, t_bus)
t_idx = (ne_br_idx, t_bus, f_bus)
ne_r = ne_branch["br_r"]
ne_x = ne_branch["br_x"]
ne_g_sh_fr = ne_branch["g_fr"]
ne_b_sh_fr = ne_branch["b_fr"]
ne_tm = ne_branch["tap"]
br_idx = branch_idx(pm, nw, f_bus, t_bus)
branch = _PM.ref(pm, nw, :branch, br_idx)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
b_sh_fr = branch["b_fr"]
tm = branch["tap"]
if is_oltc_branch(pm, br_idx, nw = nw)
Memento.error(_LOGGER, "ne_branch $ne_br_idx cannot be built in parallel to an OLTC (branch $br_idx)")
end
if ne_tm != tm
Memento.error(_LOGGER, "ne_branch $ne_br_idx cannot be built in parallel to branch $br_idx because has a different tap ratio")
end
constraint_ne_voltage_magnitude_difference_parallel(pm, nw, br_idx, ne_br_idx, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, ne_r, ne_x, ne_g_sh_fr, ne_b_sh_fr, tm)
end
""
function constraint_ne_voltage_magnitude_difference_frb_parallel(pm::_PM.AbstractPowerModel, ne_br_idx::Int; nw::Int =_PM.nw_id_default)
ne_branch = _PM.ref(pm, nw, :ne_branch, ne_br_idx)
f_bus = ne_branch["f_bus"]
t_bus = ne_branch["t_bus"]
f_idx = (ne_br_idx, f_bus, t_bus)
t_idx = (ne_br_idx, t_bus, f_bus)
ne_r = ne_branch["br_r"]
ne_x = ne_branch["br_x"]
ne_g_sh_fr = ne_branch["g_fr"]
ne_b_sh_fr = ne_branch["b_fr"]
ne_tm = ne_branch["tap"]
br_idx = branch_idx(pm, nw, f_bus, t_bus)
branch = _PM.ref(pm, nw, :branch, br_idx)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
b_sh_fr = branch["b_fr"]
tm = branch["tap"]
if is_oltc_branch(pm, br_idx, nw = nw)
Memento.error(_LOGGER, "ne_branch $ne_br_idx cannot be built in parallel to an OLTC (branch $br_idx)")
end
if ne_tm != tm
Memento.error(_LOGGER, "ne_branch $ne_br_idx cannot be built in parallel to branch $br_idx because has a different tap ratio")
end
constraint_ne_voltage_magnitude_difference_frb_parallel(pm, nw, br_idx, ne_br_idx, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, ne_r, ne_x, ne_g_sh_fr, ne_b_sh_fr, tm)
end
""
function constraint_ne_thermal_limit_from_parallel(pm::_PM.AbstractPowerModel, ne_br_idx::Int; nw::Int=_PM.nw_id_default)
ne_branch = _PM.ref(pm, nw, :ne_branch, ne_br_idx)
f_bus = ne_branch["f_bus"]
t_bus = ne_branch["t_bus"]
f_idx = (ne_br_idx, f_bus, t_bus)
if !haskey(ne_branch, "rate_a")
Memento.error(_LOGGER, "constraint_ne_thermal_limit_from_parallel requires a rate_a value on all ne_branches, calc_thermal_limits! can be used to generate reasonable values")
end
ne_rate_a = ne_branch["rate_a"]
br_idx = branch_idx(pm, nw, f_bus, t_bus)
branch = _PM.ref(pm, nw, :branch, br_idx)
rate_a = branch["rate_a"]
constraint_ne_thermal_limit_from_parallel(pm, nw, br_idx, ne_br_idx, f_idx, rate_a, ne_rate_a)
end
""
function constraint_ne_thermal_limit_to_parallel(pm::_PM.AbstractPowerModel, ne_br_idx::Int; nw::Int=_PM.nw_id_default)
ne_branch = _PM.ref(pm, nw, :ne_branch, ne_br_idx)
f_bus = ne_branch["f_bus"]
t_bus = ne_branch["t_bus"]
t_idx = (ne_br_idx, t_bus, f_bus)
if !haskey(ne_branch, "rate_a")
Memento.error(_LOGGER, "constraint_ne_thermal_limit_to_parallel requires a rate_a value on all ne_branches, calc_thermal_limits! can be used to generate reasonable values")
end
ne_rate_a = ne_branch["rate_a"]
br_idx = branch_idx(pm, nw, f_bus, t_bus)
branch = _PM.ref(pm, nw, :branch, br_idx)
rate_a = branch["rate_a"]
constraint_ne_thermal_limit_to_parallel(pm, nw, br_idx, ne_br_idx, t_idx, rate_a, ne_rate_a)
end
### Actual constraints ###
function constraint_ohms_yt_from_repl(pm::_PM.AbstractDCPModel, n::Int, ne_br_idx, f_bus, t_bus, f_idx, b, vad_min, vad_max)
p_fr = _PM.var(pm, n, :p, f_idx)
va_fr = _PM.var(pm, n, :va, f_bus)
va_to = _PM.var(pm, n, :va, t_bus)
z = _PM.var(pm, n, :branch_ne, ne_br_idx)
JuMP.@constraint(pm.model, p_fr <= -b*(va_fr - va_to + vad_max*z))
JuMP.@constraint(pm.model, p_fr >= -b*(va_fr - va_to + vad_min*z))
end
"nothing to do, this model is symetric"
function constraint_ohms_yt_to_repl(pm::_PM.AbstractAPLossLessModels, n::Int, ne_br_idx, f_bus, t_bus, t_idx, b, vad_min, vad_max)
end
function constraint_voltage_angle_difference_repl(pm::_PM.AbstractDCPModel, n::Int, ne_br_idx, f_idx, angmin, angmax, vad_min, vad_max)
i, f_bus, t_bus = f_idx
va_fr = _PM.var(pm, n, :va, f_bus)
va_to = _PM.var(pm, n, :va, t_bus)
z = _PM.var(pm, n, :branch_ne, ne_br_idx)
JuMP.@constraint(pm.model, va_fr - va_to <= angmax*(1-z) + vad_max*z)
JuMP.@constraint(pm.model, va_fr - va_to >= angmin*(1-z) + vad_min*z)
end
""
function constraint_thermal_limit_from_repl(pm::_PM.AbstractActivePowerModel, n::Int, ne_br_idx, f_idx, rate_a)
p_fr = _PM.var(pm, n, :p, f_idx)
z = _PM.var(pm, n, :branch_ne, ne_br_idx)
JuMP.@constraint(pm.model, p_fr <= rate_a*(1-z))
JuMP.@constraint(pm.model, p_fr >= -rate_a*(1-z))
end
""
function constraint_thermal_limit_to_repl(pm::_PM.AbstractActivePowerModel, n::Int, ne_br_idx, t_idx, rate_a)
p_to = _PM.var(pm, n, :p, t_idx)
z = _PM.var(pm, n, :branch_ne, ne_br_idx)
JuMP.@constraint(pm.model, p_to <= rate_a*(1-z))
JuMP.@constraint(pm.model, p_to >= -rate_a*(1-z))
end
#### Actual constraints used in radial networks ####
"States that at most one of the ne_branches sharing the same bus pair must be built."
function constraint_branch_complementarity(pm::_PM.AbstractPowerModel, n::Int, i, f_bus, t_bus)
JuMP.@constraint(pm.model, sum(_PM.var(pm, n, :branch_ne, l) for l in ne_branch_ids(pm, n, f_bus, t_bus)) <= 1)
end
""
function constraint_voltage_magnitude_difference_on_off(pm::_PM.AbstractBFAModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm)
branch = _PM.ref(pm, n, :branch, i)
fr_bus = _PM.ref(pm, n, :bus, f_bus)
to_bus = _PM.ref(pm, n, :bus, t_bus)
M_hi = fr_bus["vmax"]^2/tm^2 - to_bus["vmin"]^2
M_lo = -fr_bus["vmin"]^2/tm^2 + to_bus["vmax"]^2
p_fr = _PM.var(pm, n, :p, f_idx)
q_fr = _PM.var(pm, n, :q, f_idx)
w_fr = _PM.var(pm, n, :w, f_bus)
w_to = _PM.var(pm, n, :w, t_bus)
z = _PM.var(pm, n, :z_branch, i)
JuMP.@constraint(pm.model, (w_fr/tm^2) - w_to <= 2*(r*p_fr + x*q_fr) + M_hi*(1-z) )
JuMP.@constraint(pm.model, (w_fr/tm^2) - w_to >= 2*(r*p_fr + x*q_fr) - M_lo*(1-z) )
end
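# Big-M sketch (reasoning, not original code): with `z == 1` the constraint pair
# above reduces to the LinDistFlow voltage-drop equation
#     w_fr/tm^2 - w_to == 2*(r*p_fr + x*q_fr).
# With `z == 0` (branch replaced) the pair must be slack for any feasible
# voltages; assuming the branch flows are then forced to zero by the on/off
# thermal limits, the tightest valid constants are
#     M_hi = vmax_fr^2/tm^2 - vmin_to^2,  M_lo = vmax_to^2 - vmin_fr^2/tm^2,
# which are exactly the values computed from the bus voltage bounds above.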
""
function constraint_voltage_magnitude_difference_frb_on_off(pm::_PM.AbstractBFAModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm)
branch = _PM.ref(pm, n, :branch, i)
fr_bus = _PM.ref(pm, n, :bus, f_bus)
to_bus = _PM.ref(pm, n, :bus, t_bus)
M_hi = 1.0^2/tm^2 - to_bus["vmin"]^2
M_lo = -1.0^2/tm^2 + to_bus["vmax"]^2
p_fr = _PM.var(pm, n, :p, f_idx)
q_fr = _PM.var(pm, n, :q, f_idx)
w_to = _PM.var(pm, n, :w, t_bus)
z = _PM.var(pm, n, :z_branch, i)
# w_fr is assumed equal to 1.0
JuMP.@constraint(pm.model, (1.0/tm^2) - w_to <= 2*(r*p_fr + x*q_fr) + M_hi*(1-z) )
JuMP.@constraint(pm.model, (1.0/tm^2) - w_to >= 2*(r*p_fr + x*q_fr) - M_lo*(1-z) )
end
""
function constraint_voltage_magnitude_difference_oltc_on_off(pm::_PM.AbstractBFAModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm_min, tm_max)
branch = _PM.ref(pm, n, :branch, i)
fr_bus = _PM.ref(pm, n, :bus, f_bus)
to_bus = _PM.ref(pm, n, :bus, t_bus)
M_hi = 1.0^2/tm_min^2 - to_bus["vmin"]^2
M_lo = -1.0^2/tm_max^2 + to_bus["vmax"]^2
p_fr = _PM.var(pm, n, :p, f_idx)
q_fr = _PM.var(pm, n, :q, f_idx)
ttmi = _PM.var(pm, n, :ttmi, i)
w_to = _PM.var(pm, n, :w, t_bus)
z = _PM.var(pm, n, :z_branch, i)
# w_fr is assumed equal to 1.0 to preserve the linearity of the model
JuMP.@constraint(pm.model, 1.0*ttmi - w_to <= 2*(r*p_fr + x*q_fr) + M_hi*(1-z) )
JuMP.@constraint(pm.model, 1.0*ttmi - w_to >= 2*(r*p_fr + x*q_fr) - M_lo*(1-z) )
end
""
function constraint_ne_voltage_magnitude_difference_parallel(pm::_PM.AbstractBFAModel, n::Int, br_idx_e, br_idx_c, f_bus, t_bus, f_idx_c, t_idx_c, r_e, x_e, g_sh_fr_e, b_sh_fr_e, r_c, x_c, g_sh_fr_c, b_sh_fr_c, tm)
# Suffixes: _e: existing branch; _c: candidate branch; _p: parallel equivalent
r_p = (r_e*(r_c^2+x_c^2)+r_c*(r_e^2+x_e^2)) / ((r_e+r_c)^2+(x_e+x_c)^2)
x_p = (x_e*(r_c^2+x_c^2)+x_c*(r_e^2+x_e^2)) / ((r_e+r_c)^2+(x_e+x_c)^2)
constraint_ne_voltage_magnitude_difference(pm, n, br_idx_c, f_bus, t_bus, f_idx_c, t_idx_c, r_p, x_p, 0, 0, tm)
end
""
function constraint_ne_voltage_magnitude_difference_frb_parallel(pm::_PM.AbstractBFAModel, n::Int, br_idx_e, br_idx_c, f_bus, t_bus, f_idx_c, t_idx_c, r_e, x_e, g_sh_fr_e, b_sh_fr_e, r_c, x_c, g_sh_fr_c, b_sh_fr_c, tm)
# Suffixes: _e: existing branch; _c: candidate branch; _p: parallel equivalent
r_p = (r_e*(r_c^2+x_c^2)+r_c*(r_e^2+x_e^2)) / ((r_e+r_c)^2+(x_e+x_c)^2)
x_p = (x_e*(r_c^2+x_c^2)+x_c*(r_e^2+x_e^2)) / ((r_e+r_c)^2+(x_e+x_c)^2)
constraint_ne_voltage_magnitude_difference_frb(pm, n, br_idx_c, f_bus, t_bus, f_idx_c, t_idx_c, r_p, x_p, 0, 0, tm)
end
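# Numerical check of the parallel-equivalent formulas above (illustrative
# impedance values, not taken from any test case). Two branches with series
# impedances z_e = r_e + im*x_e and z_c = r_c + im*x_c in parallel have
# equivalent impedance z_p = z_e*z_c/(z_e + z_c); the closed-form r_p and x_p
# used above are its real and imaginary parts.
let r_e = 0.01, x_e = 0.10, r_c = 0.02, x_c = 0.08
    d = (r_e + r_c)^2 + (x_e + x_c)^2
    r_p = (r_e*(r_c^2 + x_c^2) + r_c*(r_e^2 + x_e^2)) / d
    x_p = (x_e*(r_c^2 + x_c^2) + x_c*(r_e^2 + x_e^2)) / d
    z_p = (r_e + im*x_e) * (r_c + im*x_c) / ((r_e + r_c) + im*(x_e + x_c))
    @assert isapprox(r_p, real(z_p)) && isapprox(x_p, imag(z_p))
end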
## Auxiliary functions ##
function replace_branch(pm, nw, f_bus, t_bus)
replace = 0
ne_br_idx = 0
for (br, ne_branch) in _PM.ref(pm, nw, :ne_branch)
if ((ne_branch["f_bus"] == f_bus && ne_branch["t_bus"] == t_bus) || (ne_branch["f_bus"] == t_bus && ne_branch["t_bus"] == f_bus)) && ne_branch["replace"] == true
replace = 1
ne_br_idx = br
end
end
return replace, ne_br_idx
end
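# Toy illustration of the matching rule above (hypothetical data, independent of
# any PowerModels `ref`): a candidate is selected only if it connects the same
# bus pair, in either direction, and is flagged as a replacement.
let ne_branches = Dict(
        1 => Dict("f_bus" => 1, "t_bus" => 2, "replace" => false),
        2 => Dict("f_bus" => 2, "t_bus" => 1, "replace" => true),
    )
    hits = [br for (br, b) in ne_branches
            if (b["f_bus"], b["t_bus"]) in ((1, 2), (2, 1)) && b["replace"]]
    @assert hits == [2]
end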
"Returns the index of `branch` connecting `f_bus` to `t_bus`, if such a `branch` exists; 0 otherwise"
function branch_idx(pm::_PM.AbstractPowerModel, nw::Int, f_bus, t_bus)
buspairs = _PM.ref(pm, nw, :buspairs)
buspair = get(buspairs, (f_bus,t_bus), Dict("branch"=>0))
return buspair["branch"]
end
"Returns a list of indices of `ne_branch`es relative to `branch` `br_idx`"
function ne_branch_ids(pm::_PM.AbstractPowerModel, br_idx::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :branch, br_idx)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
ne_branch_ids(pm, nw, f_bus, t_bus)
end
"Returns a list of indices of `ne_branch`es connecting `f_bus` to `t_bus`"
function ne_branch_ids(pm::_PM.AbstractPowerModel, nw::Int, f_bus, t_bus)
ne_buspairs = _PM.ref(pm, nw, :ne_buspairs)
ne_buspair = get(ne_buspairs, (f_bus,t_bus), Dict("branches"=>Int[]))
return ne_buspair["branches"]
end
"Returns whether a `ne_branch` is intended to replace the existing branch or to be added in parallel"
function ne_branch_replace(pm::_PM.AbstractPowerModel, ne_br_idx::Int; nw::Int=_PM.nw_id_default)
ne_branch = _PM.ref(pm, nw, :ne_branch, ne_br_idx)
if !haskey(ne_branch, "replace")
Memento.error(_LOGGER, "a `replace` value is required on all `ne_branch`es")
end
return ne_branch["replace"] == 1
end
## Objective with candidate storage
function objective_min_cost_storage(pm::_PM.AbstractPowerModel)
cost = JuMP.AffExpr(0.0)
# Investment cost
for n in nw_ids(pm; hour=1)
JuMP.add_to_expression!(cost, calc_convdc_ne_cost(pm,n))
JuMP.add_to_expression!(cost, calc_ne_branch_cost(pm,n))
JuMP.add_to_expression!(cost, calc_branchdc_ne_cost(pm,n))
JuMP.add_to_expression!(cost, calc_ne_storage_cost(pm,n))
end
# Operation cost
for n in nw_ids(pm)
JuMP.add_to_expression!(cost, calc_gen_cost(pm,n))
end
JuMP.@objective(pm.model, Min, cost)
end
function objective_min_cost_storage(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractPowerModel)
cost = JuMP.AffExpr(0.0)
# Transmission investment cost
for n in nw_ids(t_pm; hour=1)
JuMP.add_to_expression!(cost, calc_convdc_ne_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_ne_branch_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_branchdc_ne_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_ne_storage_cost(t_pm,n))
end
# Transmission operation cost
for n in nw_ids(t_pm)
JuMP.add_to_expression!(cost, calc_gen_cost(t_pm,n))
end
# Distribution investment cost
for n in nw_ids(d_pm; hour=1)
# Note: distribution networks do not have DC components (modeling decision)
JuMP.add_to_expression!(cost, calc_ne_branch_cost(d_pm,n))
JuMP.add_to_expression!(cost, calc_ne_storage_cost(d_pm,n))
end
# Distribution operation cost
for n in nw_ids(d_pm)
JuMP.add_to_expression!(cost, calc_gen_cost(d_pm,n))
end
JuMP.@objective(t_pm.model, Min, cost) # Note: t_pm.model == d_pm.model
end
## Objective with candidate storage and flexible demand
function objective_min_cost_flex(pm::_PM.AbstractPowerModel; investment=true, operation=true)
cost = JuMP.AffExpr(0.0)
# Investment cost
if investment
for n in nw_ids(pm; hour=1)
JuMP.add_to_expression!(cost, calc_convdc_ne_cost(pm,n))
JuMP.add_to_expression!(cost, calc_ne_branch_cost(pm,n))
JuMP.add_to_expression!(cost, calc_branchdc_ne_cost(pm,n))
JuMP.add_to_expression!(cost, calc_ne_storage_cost(pm,n))
JuMP.add_to_expression!(cost, calc_load_investment_cost(pm,n))
end
end
# Operation cost
if operation
for n in nw_ids(pm)
JuMP.add_to_expression!(cost, calc_gen_cost(pm,n))
JuMP.add_to_expression!(cost, calc_load_operation_cost(pm,n))
end
end
JuMP.@objective(pm.model, Min, cost)
end
function objective_min_cost_flex(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractPowerModel)
cost = JuMP.AffExpr(0.0)
# Transmission investment cost
for n in nw_ids(t_pm; hour=1)
JuMP.add_to_expression!(cost, calc_convdc_ne_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_ne_branch_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_branchdc_ne_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_ne_storage_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_load_investment_cost(t_pm,n))
end
# Transmission operation cost
for n in nw_ids(t_pm)
JuMP.add_to_expression!(cost, calc_gen_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_load_operation_cost(t_pm,n))
end
# Distribution investment cost
for n in nw_ids(d_pm; hour=1)
# Note: distribution networks do not have DC components (modeling decision)
JuMP.add_to_expression!(cost, calc_ne_branch_cost(d_pm,n))
JuMP.add_to_expression!(cost, calc_ne_storage_cost(d_pm,n))
JuMP.add_to_expression!(cost, calc_load_investment_cost(d_pm,n))
end
# Distribution operation cost
for n in nw_ids(d_pm)
JuMP.add_to_expression!(cost, calc_gen_cost(d_pm,n))
JuMP.add_to_expression!(cost, calc_load_operation_cost(d_pm,n))
end
JuMP.@objective(t_pm.model, Min, cost) # Note: t_pm.model == d_pm.model
end
## Stochastic objective with candidate storage and flexible demand
function objective_stoch_flex(pm::_PM.AbstractPowerModel; investment=true, operation=true)
cost = JuMP.AffExpr(0.0)
# Investment cost
if investment
for n in nw_ids(pm; hour=1, scenario=1)
JuMP.add_to_expression!(cost, calc_convdc_ne_cost(pm,n))
JuMP.add_to_expression!(cost, calc_ne_branch_cost(pm,n))
JuMP.add_to_expression!(cost, calc_branchdc_ne_cost(pm,n))
JuMP.add_to_expression!(cost, calc_ne_storage_cost(pm,n))
JuMP.add_to_expression!(cost, calc_load_investment_cost(pm,n))
end
end
# Operation cost
if operation
for (s, scenario) in dim_prop(pm, :scenario)
scenario_probability = scenario["probability"]
for n in nw_ids(pm; scenario=s)
JuMP.add_to_expression!(cost, scenario_probability, calc_gen_cost(pm,n))
JuMP.add_to_expression!(cost, scenario_probability, calc_load_operation_cost(pm,n))
end
end
end
JuMP.@objective(pm.model, Min, cost)
end
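The scenario weighting above amounts to an expected-value objective: investment cost is counted once, while each scenario's operation cost is weighted by its probability. A self-contained numeric sketch (made-up numbers, plain floats standing in for JuMP expressions):

```julia
investment_cost = 100.0
# Hypothetical per-scenario operation costs and probabilities (probabilities sum to 1).
scenarios = [
    (probability = 0.7, operation_cost = 40.0),
    (probability = 0.3, operation_cost = 90.0),
]
total = investment_cost + sum(s.probability * s.operation_cost for s in scenarios)
# total == 100.0 + 0.7*40.0 + 0.3*90.0 == 155.0
```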
function objective_stoch_flex(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractPowerModel)
cost = JuMP.AffExpr(0.0)
# Transmission investment cost
for n in nw_ids(t_pm; hour=1, scenario=1)
JuMP.add_to_expression!(cost, calc_convdc_ne_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_ne_branch_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_branchdc_ne_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_ne_storage_cost(t_pm,n))
JuMP.add_to_expression!(cost, calc_load_investment_cost(t_pm,n))
end
# Transmission operation cost
for (s, scenario) in dim_prop(t_pm, :scenario)
scenario_probability = scenario["probability"]
for n in nw_ids(t_pm; scenario=s)
JuMP.add_to_expression!(cost, scenario_probability, calc_gen_cost(t_pm,n))
JuMP.add_to_expression!(cost, scenario_probability, calc_load_operation_cost(t_pm,n))
end
end
# Distribution investment cost
for n in nw_ids(d_pm; hour=1, scenario=1)
# Note: distribution networks do not have DC components (modeling decision)
JuMP.add_to_expression!(cost, calc_ne_branch_cost(d_pm,n))
JuMP.add_to_expression!(cost, calc_ne_storage_cost(d_pm,n))
JuMP.add_to_expression!(cost, calc_load_investment_cost(d_pm,n))
end
# Distribution operation cost
for (s, scenario) in dim_prop(d_pm, :scenario)
scenario_probability = scenario["probability"]
for n in nw_ids(d_pm; scenario=s)
JuMP.add_to_expression!(cost, scenario_probability, calc_gen_cost(d_pm,n))
JuMP.add_to_expression!(cost, scenario_probability, calc_load_operation_cost(d_pm,n))
end
end
JuMP.@objective(t_pm.model, Min, cost) # Note: t_pm.model == d_pm.model
end
## Auxiliary functions
function calc_gen_cost(pm::_PM.AbstractPowerModel, n::Int)
cost = JuMP.AffExpr(0.0)
for (i,g) in _PM.ref(pm, n, :gen)
if length(g["cost"]) ≥ 2
JuMP.add_to_expression!(cost, g["cost"][end-1], _PM.var(pm,n,:pg,i))
end
end
if get(pm.setting, "add_co2_cost", false)
co2_emission_cost = pm.ref[:it][_PM.pm_it_sym][:co2_emission_cost]
for (i,g) in _PM.ref(pm, n, :dgen)
JuMP.add_to_expression!(cost, g["emission_factor"]*co2_emission_cost, _PM.var(pm,n,:pg,i))
end
end
for (i,g) in _PM.ref(pm, n, :ndgen)
JuMP.add_to_expression!(cost, g["cost_curt"], _PM.var(pm,n,:pgcurt,i))
end
return cost
end
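The `g["cost"][end-1]` term assumes a MATPOWER-style polynomial cost vector, ordered from the highest-degree coefficient down to the constant, so the next-to-last entry is the linear coefficient. A quick illustration with a made-up quadratic cost:

```julia
# c(p) = 0.02 p^2 + 30 p + 5, stored highest degree first (MATPOWER convention)
cost = [0.02, 30.0, 5.0]
linear_coefficient = cost[end-1]  # 30.0, the coefficient multiplied into the objective
```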
function calc_convdc_ne_cost(pm::_PM.AbstractPowerModel, n::Int)
add_co2_cost = get(pm.setting, "add_co2_cost", false)
cost = JuMP.AffExpr(0.0)
for (i,conv) in get(_PM.ref(pm,n), :convdc_ne, Dict())
conv_cost = conv["cost"]
if add_co2_cost
conv_cost += conv["co2_cost"]
end
JuMP.add_to_expression!(cost, conv_cost, _PM.var(pm,n,:conv_ne_investment,i))
end
return cost
end
function calc_ne_branch_cost(pm::_PM.AbstractPowerModel, n::Int)
add_co2_cost = get(pm.setting, "add_co2_cost", false)
cost = JuMP.AffExpr(0.0)
for (i,branch) in get(_PM.ref(pm,n), :ne_branch, Dict())
branch_cost = branch["construction_cost"]
if add_co2_cost
branch_cost += branch["co2_cost"]
end
JuMP.add_to_expression!(cost, branch_cost, _PM.var(pm,n,:branch_ne_investment,i))
end
return cost
end
function calc_branchdc_ne_cost(pm::_PM.AbstractPowerModel, n::Int)
add_co2_cost = get(pm.setting, "add_co2_cost", false)
cost = JuMP.AffExpr(0.0)
for (i,branch) in get(_PM.ref(pm,n), :branchdc_ne, Dict())
branch_cost = branch["cost"]
if add_co2_cost
branch_cost += branch["co2_cost"]
end
JuMP.add_to_expression!(cost, branch_cost, _PM.var(pm,n,:branchdc_ne_investment,i))
end
return cost
end
function calc_ne_storage_cost(pm::_PM.AbstractPowerModel, n::Int)
add_co2_cost = get(pm.setting, "add_co2_cost", false)
cost = JuMP.AffExpr(0.0)
for (i,storage) in get(_PM.ref(pm,n), :ne_storage, Dict())
storage_cost = storage["eq_cost"] + storage["inst_cost"]
if add_co2_cost
storage_cost += storage["co2_cost"]
end
JuMP.add_to_expression!(cost, storage_cost, _PM.var(pm,n,:z_strg_ne_investment,i))
end
return cost
end
function calc_load_operation_cost(pm::_PM.AbstractPowerModel, n::Int)
cost = JuMP.AffExpr(0.0)
for (i,l) in _PM.ref(pm, n, :flex_load)
# `cost_shift` is split 50/50 between the up- and down-shifted power, which gives a fairer
# cost attribution in single-period problems or problems without integral constraints.
JuMP.add_to_expression!(cost, 0.5*l["cost_shift"], _PM.var(pm,n,:pshift_up,i))
JuMP.add_to_expression!(cost, 0.5*l["cost_shift"], _PM.var(pm,n,:pshift_down,i))
JuMP.add_to_expression!(cost, l["cost_red"], _PM.var(pm,n,:pred,i))
JuMP.add_to_expression!(cost, l["cost_curt"], _PM.var(pm,n,:pcurt,i))
end
for (i,l) in _PM.ref(pm, n, :fixed_load)
JuMP.add_to_expression!(cost, l["cost_curt"], _PM.var(pm,n,:pcurt,i))
end
return cost
end
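With the equal split, a complete shift cycle (the same power shifted up and later shifted back down) is charged the full `cost_shift` per unit, while an uncompensated shift in a single period is charged half. A made-up numeric example:

```julia
cost_shift = 10.0
pshift_up, pshift_down = 2.0, 2.0  # a full shift cycle of 2 units
shift_cost = 0.5 * cost_shift * pshift_up + 0.5 * cost_shift * pshift_down
# shift_cost == 20.0, i.e. cost_shift per shifted unit over the whole cycle
```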
function calc_load_investment_cost(pm::_PM.AbstractPowerModel, n::Int)
add_co2_cost = get(pm.setting, "add_co2_cost", false)
cost = JuMP.AffExpr(0.0)
for (i,l) in _PM.ref(pm, n, :flex_load)
load_cost = l["cost_inv"]
if add_co2_cost
load_cost += l["co2_cost"]
end
JuMP.add_to_expression!(cost, load_cost, _PM.var(pm,n,:z_flex_investment,i))
end
return cost
end
## Generators
"Add to `ref` the keys for handling dispatchable and non-dispatchable generators"
function ref_add_gen!(ref::Dict{Symbol,<:Any}, data::Dict{String,<:Any})
for (n, nw_ref) in ref[:it][_PM.pm_it_sym][:nw]
# Dispatchable generators. Their power varies between `pmin` and `pmax` and cannot be curtailed.
nw_ref[:dgen] = Dict(x for x in nw_ref[:gen] if x.second["dispatchable"] == true)
# Non-dispatchable generators. Their reference power `pref` can be curtailed.
nw_ref[:ndgen] = Dict(x for x in nw_ref[:gen] if x.second["dispatchable"] == false)
end
end
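The dispatchable/non-dispatchable split above is a plain dictionary filter on the `"dispatchable"` flag. A self-contained sketch with made-up generator data:

```julia
# Hypothetical generator table, keyed by index as in a PowerModels `ref`.
gen = Dict(
    1 => Dict("dispatchable" => true,  "pmax" => 2.0),
    2 => Dict("dispatchable" => false, "pref" => 1.5),
)
dgen  = Dict(x for x in gen if x.second["dispatchable"] == true)   # power cannot be curtailed
ndgen = Dict(x for x in gen if x.second["dispatchable"] == false)  # reference power can be curtailed
# keys(dgen) == Set([1]) and keys(ndgen) == Set([2])
```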
## Storage
function ref_add_storage!(ref::Dict{Symbol,<:Any}, data::Dict{String,<:Any})
for (n, nw_ref) in ref[:it][_PM.pm_it_sym][:nw]
if haskey(nw_ref, :storage)
nw_ref[:storage_bounded_absorption] = Dict(x for x in nw_ref[:storage] if 0.0 < get(x.second, "max_energy_absorption", Inf) < Inf)
end
end
end
function ref_add_ne_storage!(ref::Dict{Symbol,<:Any}, data::Dict{String,<:Any})
for (n, nw_ref) in ref[:it][_PM.pm_it_sym][:nw]
if haskey(nw_ref, :ne_storage)
bus_storage_ne = Dict([(i, []) for (i,bus) in nw_ref[:bus]])
for (i,storage) in nw_ref[:ne_storage]
push!(bus_storage_ne[storage["storage_bus"]], i)
end
nw_ref[:bus_storage_ne] = bus_storage_ne
nw_ref[:ne_storage_bounded_absorption] = Dict(x for x in nw_ref[:ne_storage] if 0.0 < get(x.second, "max_energy_absorption", Inf) < Inf)
end
end
end
## Flexible loads
"Add to `ref` the keys for handling flexible demand"
function ref_add_flex_load!(ref::Dict{Symbol,<:Any}, data::Dict{String,<:Any})
for (n, nw_ref) in ref[:it][_PM.pm_it_sym][:nw]
# Loads that can be made flexible, depending on investment decision
nw_ref[:flex_load] = Dict(x for x in nw_ref[:load] if x.second["flex"] == 1)
# Loads that are not flexible and do not have an associated investment decision
nw_ref[:fixed_load] = Dict(x for x in nw_ref[:load] if x.second["flex"] == 0)
end
# Compute the total energy demand of each flex load and store it in the first hour nw
for nw in nw_ids(data; hour = 1)
if haskey(ref[:it][_PM.pm_it_sym][:nw][nw], :time_elapsed)
time_elapsed = ref[:it][_PM.pm_it_sym][:nw][nw][:time_elapsed]
else
Memento.warn(_LOGGER, "network data should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
timeseries_nw_ids = similar_ids(data, nw, hour = 1:dim_length(data,:hour))
for (l, load) in ref[:it][_PM.pm_it_sym][:nw][nw][:flex_load]
# `ref` instead of `data` must be used to access loads, since the former has
# already been filtered to remove inactive loads.
load["ed"] = time_elapsed * sum(ref[:it][_PM.pm_it_sym][:nw][n][:load][l]["pd"] for n in timeseries_nw_ids)
end
end
end
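The total energy demand stored in `load["ed"]` is simply the time step multiplied by the sum of the load's power demand over all hours of the series. A sketch with made-up data:

```julia
time_elapsed = 1.0                  # hours per period
pd_series = [0.8, 1.0, 1.2, 0.9]    # hypothetical per-hour demand of one flexible load
ed = time_elapsed * sum(pd_series)  # 3.9, stored on the first-hour network
```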
## Distribution networks
"Like ref_add_ne_branch!, but ne_buspairs are built using _calc_buspair_parameters_allbranches"
function ref_add_ne_branch_allbranches!(ref::Dict{Symbol,<:Any}, data::Dict{String,<:Any})
for (nw, nw_ref) in ref[:it][_PM.pm_it_sym][:nw]
if !haskey(nw_ref, :ne_branch)
Memento.error(_LOGGER, "required ne_branch data not found")
end
nw_ref[:ne_branch] = Dict(x for x in nw_ref[:ne_branch] if (x.second["br_status"] == 1 && x.second["f_bus"] in keys(nw_ref[:bus]) && x.second["t_bus"] in keys(nw_ref[:bus])))
nw_ref[:ne_arcs_from] = [(i,branch["f_bus"],branch["t_bus"]) for (i,branch) in nw_ref[:ne_branch]]
nw_ref[:ne_arcs_to] = [(i,branch["t_bus"],branch["f_bus"]) for (i,branch) in nw_ref[:ne_branch]]
nw_ref[:ne_arcs] = [nw_ref[:ne_arcs_from]; nw_ref[:ne_arcs_to]]
ne_bus_arcs = Dict((i, []) for (i,bus) in nw_ref[:bus])
for (l,i,j) in nw_ref[:ne_arcs]
push!(ne_bus_arcs[i], (l,i,j))
end
nw_ref[:ne_bus_arcs] = ne_bus_arcs
if !haskey(nw_ref, :ne_buspairs)
ismc = haskey(nw_ref, :conductors)
cid = nw_ref[:conductor_ids]
nw_ref[:ne_buspairs] = _calc_buspair_parameters_allbranches(nw_ref[:bus], nw_ref[:ne_branch], cid, ismc)
end
end
end
"""
Add to `ref` the following keys:
- `:frb_branch`: the set of `branch`es whose `f_bus` is the reference bus;
- `:frb_ne_branch`: the set of `ne_branch`es whose `f_bus` is the reference bus.
"""
function ref_add_frb_branch!(ref::Dict{Symbol,Any}, data::Dict{String,<:Any})
for (nw, nw_ref) in ref[:it][_PM.pm_it_sym][:nw]
ref_bus_id = first(keys(nw_ref[:ref_buses]))
frb_branch = Dict{Int,Any}()
for (i,br) in nw_ref[:branch]
if br["f_bus"] == ref_bus_id
frb_branch[i] = br
end
end
nw_ref[:frb_branch] = frb_branch
if haskey(nw_ref, :ne_branch)
frb_ne_branch = Dict{Int,Any}()
for (i,br) in nw_ref[:ne_branch]
if br["f_bus"] == ref_bus_id
frb_ne_branch[i] = br
end
end
nw_ref[:frb_ne_branch] = frb_ne_branch
end
end
end
"""
Add to `ref` the following keys:
- `:oltc_branch`: the set of `frb_branch`es that are OLTCs;
- `:oltc_ne_branch`: the set of `frb_ne_branch`es that are OLTCs.
"""
function ref_add_oltc_branch!(ref::Dict{Symbol,Any}, data::Dict{String,<:Any})
for (nw, nw_ref) in ref[:it][_PM.pm_it_sym][:nw]
if !haskey(nw_ref, :frb_branch)
Memento.error(_LOGGER, "ref_add_oltc_branch! must be called after ref_add_frb_branch!")
end
oltc_branch = Dict{Int,Any}()
for (i,br) in nw_ref[:frb_branch]
if br["transformer"] && haskey(br, "tm_min") && haskey(br, "tm_max") && br["tm_min"] < br["tm_max"]
oltc_branch[i] = br
end
end
nw_ref[:oltc_branch] = oltc_branch
if haskey(nw_ref, :frb_ne_branch)
oltc_ne_branch = Dict{Int,Any}()
for (i,br) in nw_ref[:frb_ne_branch]
if br["transformer"] && haskey(br, "tm_min") && haskey(br, "tm_max") && br["tm_min"] < br["tm_max"]
oltc_ne_branch[i] = br
end
end
nw_ref[:oltc_ne_branch] = oltc_ne_branch
end
end
end
"Like PowerModels.calc_buspair_parameters, but retains indices of all the branches and drops keys that depend on branch"
function _calc_buspair_parameters_allbranches(buses, branches, conductor_ids, ismulticonductor)
bus_lookup = Dict(bus["index"] => bus for (i,bus) in buses if bus["bus_type"] != 4)
branch_lookup = Dict(branch["index"] => branch for (i,branch) in branches if branch["br_status"] == 1 && haskey(bus_lookup, branch["f_bus"]) && haskey(bus_lookup, branch["t_bus"]))
buspair_indexes = Set((branch["f_bus"], branch["t_bus"]) for (i,branch) in branch_lookup)
bp_branch = Dict((bp, Int[]) for bp in buspair_indexes)
if ismulticonductor
bp_angmin = Dict((bp, [-Inf for c in conductor_ids]) for bp in buspair_indexes)
bp_angmax = Dict((bp, [ Inf for c in conductor_ids]) for bp in buspair_indexes)
else
@assert(length(conductor_ids) == 1)
bp_angmin = Dict((bp, -Inf) for bp in buspair_indexes)
bp_angmax = Dict((bp, Inf) for bp in buspair_indexes)
end
for (l,branch) in branch_lookup
i = branch["f_bus"]
j = branch["t_bus"]
if ismulticonductor
for c in conductor_ids
bp_angmin[(i,j)][c] = max(bp_angmin[(i,j)][c], branch["angmin"][c])
bp_angmax[(i,j)][c] = min(bp_angmax[(i,j)][c], branch["angmax"][c])
end
else
bp_angmin[(i,j)] = max(bp_angmin[(i,j)], branch["angmin"])
bp_angmax[(i,j)] = min(bp_angmax[(i,j)], branch["angmax"])
end
bp_branch[(i,j)] = push!(bp_branch[(i,j)], l)
end
buspairs = Dict((i,j) => Dict(
"branches"=>bp_branch[(i,j)],
"angmin"=>bp_angmin[(i,j)],
"angmax"=>bp_angmax[(i,j)],
"vm_fr_min"=>bus_lookup[i]["vmin"],
"vm_fr_max"=>bus_lookup[i]["vmax"],
"vm_to_min"=>bus_lookup[j]["vmin"],
"vm_to_max"=>bus_lookup[j]["vmax"]
) for (i,j) in buspair_indexes
)
return buspairs
end
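For parallel branches between the same bus pair, the function keeps the tightest angle bounds: the maximum of the individual `angmin` values and the minimum of the `angmax` values. A minimal sketch with made-up bounds:

```julia
# Two hypothetical parallel branches between the same bus pair.
branch_bounds = [(angmin = -0.5, angmax = 0.5), (angmin = -0.3, angmax = 0.6)]
angmin = maximum(b.angmin for b in branch_bounds)  # tightest lower bound
angmax = minimum(b.angmax for b in branch_bounds)  # tightest upper bound
# (angmin, angmax) == (-0.3, 0.5): the intersection of the individual ranges
```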
"""
sol_pm!(pm, solution)
Make `pm` available in `solution["pm"]`.
If `sol_pm!` is used as solution processor when running a model, then `pm` will be available
in `result["solution"]["pm"]` (where `result` is the name of the returned Dict) after the
optimization has ended.
"""
function sol_pm!(pm::_PM.AbstractPowerModel, solution::Dict{String,Any})
solution["pm"] = pm
end
##################################################################################
#### DEFINITION OF NEW VARIABLES FOR STORAGE INVESTMENTS ACCORDING TO THE FlexPlan MODEL
##################################################################################
function variable_absorbed_energy(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool = true, report::Bool=true)
e_abs = _PM.var(pm, nw)[:e_abs] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :storage_bounded_absorption)], base_name="$(nw)_e_abs",
start = 0)
if bounded
for (s, storage) in _PM.ref(pm, nw, :storage_bounded_absorption)
JuMP.set_lower_bound(e_abs[s], 0)
JuMP.set_upper_bound(e_abs[s], storage["max_energy_absorption"])
end
end
report && _PM.sol_component_value(pm, nw, :storage, :e_abs, _PM.ids(pm, nw, :storage_bounded_absorption), e_abs)
end
function variable_absorbed_energy_ne(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool = true, report::Bool=true)
e_abs = _PM.var(pm, nw)[:e_abs_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage_bounded_absorption)], base_name="$(nw)_e_abs_ne",
start = 0)
if bounded
for (s, storage) in _PM.ref(pm, nw, :ne_storage_bounded_absorption)
JuMP.set_lower_bound(e_abs[s], 0)
JuMP.set_upper_bound(e_abs[s], storage["max_energy_absorption"])
end
end
report && _PM.sol_component_value(pm, nw, :ne_storage, :e_abs_ne, _PM.ids(pm, nw, :ne_storage_bounded_absorption), e_abs)
end
function variable_storage_power_ne(pm::_PM.AbstractPowerModel; investment::Bool=true, kwargs...)
variable_storage_power_real_ne(pm; kwargs...)
variable_storage_power_imaginary_ne(pm; kwargs...)
variable_storage_power_control_imaginary_ne(pm; kwargs...)
variable_storage_current_ne(pm; kwargs...)
variable_storage_energy_ne(pm; kwargs...)
variable_storage_charge_ne(pm; kwargs...)
variable_storage_discharge_ne(pm; kwargs...)
variable_storage_indicator(pm; kwargs..., relax=true) # the trailing `relax=true` overrides any `relax` in kwargs: the indicator is always continuous here
investment && variable_storage_investment(pm; kwargs...)
end
function variable_storage_power_real_ne(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
ps = _PM.var(pm, nw)[:ps_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage)], base_name="$(nw)_ps_ne",
start = _PM.comp_start_value(_PM.ref(pm, nw, :ne_storage, i), "ps_start")
)
if bounded
inj_lb, inj_ub = _PM.ref_calc_storage_injection_bounds(_PM.ref(pm, nw, :ne_storage), _PM.ref(pm, nw, :bus))
for i in _PM.ids(pm, nw, :ne_storage)
if !isinf(inj_lb[i])
JuMP.set_lower_bound(ps[i], inj_lb[i])
end
if !isinf(inj_ub[i])
JuMP.set_upper_bound(ps[i], inj_ub[i])
end
end
end
report && _PM.sol_component_value(pm, nw, :ne_storage, :ps_ne, _PM.ids(pm, nw, :ne_storage), ps)
end
function variable_storage_power_imaginary_ne(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
qs = _PM.var(pm, nw)[:qs_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage)], base_name="$(nw)_qs_ne",
start = _PM.comp_start_value(_PM.ref(pm, nw, :ne_storage, i), "qs_start")
)
if bounded
inj_lb, inj_ub = _PM.ref_calc_storage_injection_bounds(_PM.ref(pm, nw, :ne_storage), _PM.ref(pm, nw, :bus))
for (i, storage) in _PM.ref(pm, nw, :ne_storage)
JuMP.set_lower_bound(qs[i], max(inj_lb[i], storage["qmin"]))
JuMP.set_upper_bound(qs[i], min(inj_ub[i], storage["qmax"]))
end
end
report && _PM.sol_component_value(pm, nw, :ne_storage, :qs_ne, _PM.ids(pm, nw, :ne_storage), qs)
end
"apo models ignore reactive power flows"
function variable_storage_power_imaginary_ne(pm::_PM.AbstractActivePowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
report && _PM.sol_component_fixed(pm, nw, :ne_storage, :qs_ne, _PM.ids(pm, nw, :ne_storage), NaN)
end
function variable_storage_power_control_imaginary_ne(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
qsc = _PM.var(pm, nw)[:qsc_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage)], base_name="$(nw)_qsc_ne",
start = _PM.comp_start_value(_PM.ref(pm, nw, :ne_storage, i), "qsc_start")
)
if bounded
inj_lb, inj_ub = _PM.ref_calc_storage_injection_bounds(_PM.ref(pm, nw, :ne_storage), _PM.ref(pm, nw, :bus))
for (i,storage) in _PM.ref(pm, nw, :ne_storage)
if !isinf(inj_lb[i]) || haskey(storage, "qmin")
JuMP.set_lower_bound(qsc[i], max(inj_lb[i], get(storage, "qmin", -Inf)))
end
if !isinf(inj_ub[i]) || haskey(storage, "qmax")
JuMP.set_upper_bound(qsc[i], min(inj_ub[i], get(storage, "qmax", Inf)))
end
end
end
report && _PM.sol_component_value(pm, nw, :ne_storage, :qsc_ne, _PM.ids(pm, nw, :ne_storage), qsc)
end
"apo models ignore reactive power flows"
function variable_storage_power_control_imaginary_ne(pm::_PM.AbstractActivePowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
report && _PM.sol_component_fixed(pm, nw, :ne_storage, :qsc_ne, _PM.ids(pm, nw, :ne_storage), NaN)
end
"do nothing by default but some formulations require this"
function variable_storage_current_ne(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
end
function variable_storage_energy_ne(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
se = _PM.var(pm, nw)[:se_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage)], base_name="$(nw)_se_ne",
start = _PM.comp_start_value(_PM.ref(pm, nw, :ne_storage, i), "se_start", 1)
)
if bounded
for (i, storage) in _PM.ref(pm, nw, :ne_storage)
JuMP.set_lower_bound(se[i], 0)
JuMP.set_upper_bound(se[i], storage["energy_rating"])
end
end
report && _PM.sol_component_value(pm, nw, :ne_storage, :se_ne, _PM.ids(pm, nw, :ne_storage), se)
end
function variable_storage_charge_ne(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
sc = _PM.var(pm, nw)[:sc_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage)], base_name="$(nw)_sc_ne",
start = _PM.comp_start_value(_PM.ref(pm, nw, :ne_storage, i), "sc_start", 1)
)
if bounded
for (i, storage) in _PM.ref(pm, nw, :ne_storage)
JuMP.set_lower_bound(sc[i], 0)
JuMP.set_upper_bound(sc[i], storage["charge_rating"])
end
end
report && _PM.sol_component_value(pm, nw, :ne_storage, :sc_ne, _PM.ids(pm, nw, :ne_storage), sc)
end
function variable_storage_discharge_ne(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
sd = _PM.var(pm, nw)[:sd_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage)], base_name="$(nw)_sd_ne",
start = _PM.comp_start_value(_PM.ref(pm, nw, :ne_storage, i), "sd_start", 1)
)
if bounded
for (i, storage) in _PM.ref(pm, nw, :ne_storage)
JuMP.set_lower_bound(sd[i], 0)
JuMP.set_upper_bound(sd[i], storage["discharge_rating"])
end
end
report && _PM.sol_component_value(pm, nw, :ne_storage, :sd_ne, _PM.ids(pm, nw, :ne_storage), sd)
end
function variable_storage_indicator(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, relax::Bool=false, report::Bool=true)
first_n = first_id(pm, nw, :hour, :scenario)
if nw == first_n
if !relax
z = _PM.var(pm, nw)[:z_strg_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage)], base_name="$(nw)_z_strg_ne",
binary = true,
start = 0
)
else
z = _PM.var(pm, nw)[:z_strg_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage)], base_name="$(nw)_z_strg_ne",
lower_bound = 0,
upper_bound = 1,
start = 0
)
end
else
z = _PM.var(pm, nw)[:z_strg_ne] = _PM.var(pm, first_n)[:z_strg_ne]
end
report && _PM.sol_component_value(pm, nw, :ne_storage, :isbuilt, _PM.ids(pm, nw, :ne_storage), z)
end
function variable_storage_investment(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, relax::Bool=false, report::Bool=true)
first_n = first_id(pm, nw, :hour, :scenario)
if nw == first_n
if !relax
investment = _PM.var(pm, nw)[:z_strg_ne_investment] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage)], base_name="$(nw)_z_strg_ne_investment",
binary = true,
start = 0
)
else
investment = _PM.var(pm, nw)[:z_strg_ne_investment] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :ne_storage)], base_name="$(nw)_z_strg_ne_investment",
lower_bound = 0,
upper_bound = 1,
start = 0
)
end
else
investment = _PM.var(pm, nw)[:z_strg_ne_investment] = _PM.var(pm, first_n)[:z_strg_ne_investment]
end
report && _PM.sol_component_value(pm, nw, :ne_storage, :investment, _PM.ids(pm, nw, :ne_storage), investment)
end
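Both the indicator and the investment variable are created only in the first hour/scenario network and then aliased into every other network, so a single decision is shared across the whole time series. The aliasing pattern, stripped of JuMP (hypothetical three-network example):

```julia
# One variable container per network, as in `_PM.var(pm, nw)`.
var = Dict{Int,Dict{Symbol,Any}}(n => Dict{Symbol,Any}() for n in 1:3)
first_n = 1
for n in 1:3
    if n == first_n
        var[n][:z] = Dict(1 => 0.0)    # the actual decision variable(s)
    else
        var[n][:z] = var[first_n][:z]  # alias: same object, no new variables created
    end
end
var[2][:z] === var[1][:z]  # true: all networks share the same decision
```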
# ####################################################
# Constraint templates: they perform all data manipulations, then call a constraint
# function of the same name, so that the constraint itself contains only the mathematical formulation.
# ###################################################
function constraint_storage_thermal_limit_ne(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
storage = _PM.ref(pm, nw, :ne_storage, i)
constraint_storage_thermal_limit_ne(pm, nw, i, storage["thermal_rating"])
end
function constraint_storage_losses_ne(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
storage = _PM.ref(pm, nw, :ne_storage, i)
constraint_storage_losses_ne(pm, nw, i, storage["storage_bus"], storage["r"], storage["x"], storage["p_loss"], storage["q_loss"])
end
function constraint_storage_bounds_ne(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
constraint_storage_bounds_ne(pm, nw, i)
end
function constraint_storage_state(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
storage = _PM.ref(pm, nw, :storage, i)
if haskey(_PM.ref(pm, nw), :time_elapsed)
time_elapsed = _PM.ref(pm, nw, :time_elapsed)
else
Memento.warn(_LOGGER, "network data should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
constraint_storage_state_initial(pm, nw, i, storage["energy"], storage["charge_efficiency"], storage["discharge_efficiency"], storage["stationary_energy_inflow"], storage["stationary_energy_outflow"], storage["self_discharge_rate"], time_elapsed)
end
function constraint_storage_state_ne(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
storage = _PM.ref(pm, nw, :ne_storage, i)
if haskey(_PM.ref(pm, nw), :time_elapsed)
time_elapsed = _PM.ref(pm, nw, :time_elapsed)
else
Memento.warn(_LOGGER, "network data should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
constraint_storage_state_initial_ne(pm, nw, i, storage["energy"], storage["charge_efficiency"], storage["discharge_efficiency"], storage["stationary_energy_inflow"], storage["stationary_energy_outflow"], storage["self_discharge_rate"], time_elapsed)
end
function constraint_storage_state(pm::_PM.AbstractPowerModel, i::Int, nw_1::Int, nw_2::Int)
storage = _PM.ref(pm, nw_2, :storage, i)
if haskey(_PM.ref(pm, nw_2), :time_elapsed)
time_elapsed = _PM.ref(pm, nw_2, :time_elapsed)
else
Memento.warn(_LOGGER, "network $(nw_2) should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
if haskey(_PM.ref(pm, nw_1, :storage), i)
constraint_storage_state(pm, nw_1, nw_2, i, storage["charge_efficiency"], storage["discharge_efficiency"], storage["stationary_energy_inflow"], storage["stationary_energy_outflow"], storage["self_discharge_rate"], time_elapsed)
else
# if the storage device has status=0 in nw_1, then the stored energy variable will not exist. Initialize storage from data model instead.
Memento.warn(_LOGGER, "storage component $(i) was not found in network $(nw_1) while building constraint_storage_state between networks $(nw_1) and $(nw_2). Using the energy value from the storage component in network $(nw_2) instead")
constraint_storage_state_initial(pm, nw_2, i, storage["energy"], storage["charge_efficiency"], storage["discharge_efficiency"], storage["stationary_energy_inflow"], storage["stationary_energy_outflow"], storage["self_discharge_rate"], time_elapsed)
end
end
function constraint_storage_state_ne(pm::_PM.AbstractPowerModel, i::Int, nw_1::Int, nw_2::Int)
storage = _PM.ref(pm, nw_2, :ne_storage, i)
if haskey(_PM.ref(pm, nw_2), :time_elapsed)
time_elapsed = _PM.ref(pm, nw_2, :time_elapsed)
else
Memento.warn(_LOGGER, "network $(nw_2) should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
if haskey(_PM.ref(pm, nw_1, :ne_storage), i)
constraint_storage_state_ne(pm, nw_1, nw_2, i, storage["charge_efficiency"], storage["discharge_efficiency"], storage["stationary_energy_inflow"], storage["stationary_energy_outflow"], storage["self_discharge_rate"], time_elapsed)
else
# if the storage device has status=0 in nw_1, then the stored energy variable will not exist. Initialize storage from data model instead.
Memento.warn(_LOGGER, "storage component $(i) was not found in network $(nw_1) while building constraint_storage_state between networks $(nw_1) and $(nw_2). Using the energy value from the storage component in network $(nw_2) instead")
constraint_storage_state_initial_ne(pm, nw_2, i, storage["energy"], storage["charge_efficiency"], storage["discharge_efficiency"], storage["stationary_energy_inflow"], storage["stationary_energy_outflow"], storage["self_discharge_rate"], time_elapsed)
end
end
function constraint_storage_state_final(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
storage = _PM.ref(pm, nw, :storage, i)
constraint_storage_state_final(pm, nw, i, storage["energy"])
end
function constraint_storage_state_final_ne(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
storage = _PM.ref(pm, nw, :ne_storage, i)
constraint_storage_state_final_ne(pm, nw, i, storage["energy"])
end
function constraint_storage_excl_slack(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
constraint_storage_excl_slack(pm, nw, i)
end
function constraint_storage_excl_slack_ne(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
constraint_storage_excl_slack_ne(pm, nw, i)
end
function constraint_maximum_absorption(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
storage = _PM.ref(pm, nw, :storage, i)
if haskey(_PM.ref(pm, nw), :time_elapsed)
time_elapsed = _PM.ref(pm, nw, :time_elapsed)
else
Memento.warn(_LOGGER, "network data should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
constraint_maximum_absorption_initial(pm, nw, i, time_elapsed)
end
function constraint_maximum_absorption_ne(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
storage = _PM.ref(pm, nw, :ne_storage, i)
if haskey(_PM.ref(pm, nw), :time_elapsed)
time_elapsed = _PM.ref(pm, nw, :time_elapsed)
else
Memento.warn(_LOGGER, "network data should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
constraint_maximum_absorption_initial_ne(pm, nw, i, time_elapsed)
end
function constraint_maximum_absorption(pm::_PM.AbstractPowerModel, i::Int, nw_1::Int, nw_2::Int)
storage = _PM.ref(pm, nw_2, :storage, i)
if haskey(_PM.ref(pm, nw_2), :time_elapsed)
time_elapsed = _PM.ref(pm, nw_2, :time_elapsed)
else
Memento.warn(_LOGGER, "network $(nw_2) should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
if haskey(_PM.ref(pm, nw_1, :storage), i)
constraint_maximum_absorption(pm, nw_1, nw_2, i, time_elapsed)
else
# if the storage device has status=0 in nw_1, then the absorbed energy variable will not exist. Apply the initial maximum absorption constraint instead.
Memento.warn(_LOGGER, "storage component $(i) was not found in network $(nw_1) while building constraint_maximum_absorption between networks $(nw_1) and $(nw_2). Applying the initial maximum absorption constraint in network $(nw_2) instead")
constraint_maximum_absorption_initial(pm, nw_2, i, time_elapsed)
end
end
function constraint_maximum_absorption_ne(pm::_PM.AbstractPowerModel, i::Int, nw_1::Int, nw_2::Int)
storage = _PM.ref(pm, nw_2, :ne_storage, i)
if haskey(_PM.ref(pm, nw_2), :time_elapsed)
time_elapsed = _PM.ref(pm, nw_2, :time_elapsed)
else
Memento.warn(_LOGGER, "network $(nw_2) should specify time_elapsed, using 1.0 as a default")
time_elapsed = 1.0
end
if haskey(_PM.ref(pm, nw_1, :ne_storage), i)
constraint_maximum_absorption_ne(pm, nw_1, nw_2, i, time_elapsed)
else
# if the storage device has status=0 in nw_1, then the absorbed energy variable will not exist. Apply the initial maximum absorption constraint instead.
Memento.warn(_LOGGER, "storage component $(i) was not found in network $(nw_1) while building constraint_maximum_absorption_ne between networks $(nw_1) and $(nw_2). Applying the initial maximum absorption constraint in network $(nw_2) instead")
constraint_maximum_absorption_initial_ne(pm, nw_2, i, time_elapsed)
end
end
function constraint_ne_storage_activation(pm::_PM.AbstractPowerModel, i::Int, prev_nws::Vector{Int}, nw::Int)
investment_horizon = [nw]
lifetime = _PM.ref(pm, nw, :ne_storage, i, "lifetime")
for n in Iterators.reverse(prev_nws[max(end-lifetime+2,1):end])
i in _PM.ids(pm, n, :ne_storage) ? push!(investment_horizon, n) : break
end
constraint_ne_storage_activation(pm, nw, i, investment_horizon)
end
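# Worked example (hypothetical ids): with prev_nws = [1, 2, 3], nw = 4 and
# lifetime = 3, the slice prev_nws[max(end-lifetime+2,1):end] is [2, 3], so,
# if candidate i exists in networks 3 and 2, investment_horizon becomes
# [4, 3, 2]. The implementation of constraint_ne_storage_activation then
# forces the build indicator in nw 4 to equal the sum of the investment
# decisions taken within the asset lifetime.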
####################################################
############### Constraints
###################################################
function _PM.constraint_storage_thermal_limit(pm::BFARadPowerModel, n::Int, i, rating)
ps = _PM.var(pm, n, :ps, i)
qs = _PM.var(pm, n, :qs, i)
c_perp = cos(π/8) # ~0.92
c_diag = sin(π/8) + cos(π/8) # == cos(π/8) * sqrt(2), ~1.31
JuMP.@constraint(pm.model, ps >= -c_perp*rating)
JuMP.@constraint(pm.model, ps <= c_perp*rating)
JuMP.@constraint(pm.model, qs >= -c_perp*rating)
JuMP.@constraint(pm.model, qs <= c_perp*rating)
JuMP.@constraint(pm.model, ps + qs >= -c_diag*rating)
JuMP.@constraint(pm.model, ps + qs <= c_diag*rating)
JuMP.@constraint(pm.model, ps - qs >= -c_diag*rating)
JuMP.@constraint(pm.model, ps - qs <= c_diag*rating)
end
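# Geometric note: the eight half-planes above carve a regular octagon
# inscribed in the circle ps^2 + qs^2 <= rating^2. E.g. the vertex at
# ps = cos(π/8)*rating, qs = sin(π/8)*rating satisfies ps + qs = c_diag*rating
# and lies exactly on the circle, so this linear approximation is
# conservative: it never admits an apparent power above `rating`.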
function constraint_storage_thermal_limit_ne(pm::_PM.AbstractActivePowerModel, n::Int, i, rating)
ps = _PM.var(pm, n, :ps_ne, i)
JuMP.lower_bound(ps) < -rating && JuMP.set_lower_bound(ps, -rating)
JuMP.upper_bound(ps) > rating && JuMP.set_upper_bound(ps, rating)
end
function constraint_storage_thermal_limit_ne(pm::BFARadPowerModel, n::Int, i, rating)
ps = _PM.var(pm, n, :ps_ne, i)
qs = _PM.var(pm, n, :qs_ne, i)
c_perp = cos(π/8) # ~0.92
c_diag = sin(π/8) + cos(π/8) # == cos(π/8) * sqrt(2), ~1.31
JuMP.@constraint(pm.model, ps >= -c_perp*rating)
JuMP.@constraint(pm.model, ps <= c_perp*rating)
JuMP.@constraint(pm.model, qs >= -c_perp*rating)
JuMP.@constraint(pm.model, qs <= c_perp*rating)
JuMP.@constraint(pm.model, ps + qs >= -c_diag*rating)
JuMP.@constraint(pm.model, ps + qs <= c_diag*rating)
JuMP.@constraint(pm.model, ps - qs >= -c_diag*rating)
JuMP.@constraint(pm.model, ps - qs <= c_diag*rating)
end
function constraint_storage_losses_ne(pm::_PM.AbstractAPLossLessModels, n::Int, i, bus, r, x, p_loss, q_loss)
ps = _PM.var(pm, n, :ps_ne, i)
sc = _PM.var(pm, n, :sc_ne, i)
sd = _PM.var(pm, n, :sd_ne, i)
JuMP.@constraint(pm.model, ps + (sd - sc) == p_loss)
end
"Neglects the active and reactive loss terms associated with the squared current magnitude."
function constraint_storage_losses_ne(pm::_PM.AbstractBFAModel, n::Int, i, bus, r, x, p_loss, q_loss)
ps = _PM.var(pm, n, :ps_ne, i)
qs = _PM.var(pm, n, :qs_ne, i)
sc = _PM.var(pm, n, :sc_ne, i)
sd = _PM.var(pm, n, :sd_ne, i)
qsc = _PM.var(pm, n, :qsc_ne, i)
JuMP.@constraint(pm.model, ps + (sd - sc) == p_loss)
JuMP.@constraint(pm.model, qs == qsc + q_loss)
end
function constraint_storage_state_initial(pm::_PM.AbstractPowerModel, n::Int, i::Int, energy, charge_eff, discharge_eff, inflow, outflow, self_discharge_rate, time_elapsed)
sc = _PM.var(pm, n, :sc, i)
sd = _PM.var(pm, n, :sd, i)
se = _PM.var(pm, n, :se, i)
JuMP.@constraint(pm.model, se == ((1-self_discharge_rate)^time_elapsed)*energy + time_elapsed*(charge_eff*sc - sd/discharge_eff + inflow - outflow))
end
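# In symbols (with Δt = time_elapsed, γ = self_discharge_rate and η_c, η_d the
# charge/discharge efficiencies), the constraint above is the first-period
# energy balance
#
#     se = (1-γ)^Δt * energy + Δt * (η_c*sc - sd/η_d + inflow - outflow),
#
# i.e. the stored energy at the end of the period equals the self-discharged
# initial energy plus the net energy charged during the period.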
function constraint_storage_state_initial_ne(pm::_PM.AbstractPowerModel, n::Int, i::Int, energy, charge_eff, discharge_eff, inflow, outflow, self_discharge_rate, time_elapsed)
sc = _PM.var(pm, n, :sc_ne, i)
sd = _PM.var(pm, n, :sd_ne, i)
se = _PM.var(pm, n, :se_ne, i)
z = _PM.var(pm, n, :z_strg_ne, i)
JuMP.@constraint(pm.model, se == ((1-self_discharge_rate)^time_elapsed)*energy*z + time_elapsed*(charge_eff*sc - sd/discharge_eff + inflow * z - outflow * z))
end
function constraint_storage_state(pm::_PM.AbstractPowerModel, n_1::Int, n_2::Int, i::Int, charge_eff, discharge_eff, inflow, outflow, self_discharge_rate, time_elapsed)
sc_2 = _PM.var(pm, n_2, :sc, i)
sd_2 = _PM.var(pm, n_2, :sd, i)
se_2 = _PM.var(pm, n_2, :se, i)
se_1 = _PM.var(pm, n_1, :se, i)
JuMP.@constraint(pm.model, se_2 == ((1-self_discharge_rate)^time_elapsed)*se_1 + time_elapsed*(charge_eff*sc_2 - sd_2/discharge_eff + inflow - outflow))
end
function constraint_storage_state_ne(pm::_PM.AbstractPowerModel, n_1::Int, n_2::Int, i::Int, charge_eff, discharge_eff, inflow, outflow, self_discharge_rate, time_elapsed)
sc_2 = _PM.var(pm, n_2, :sc_ne, i)
sd_2 = _PM.var(pm, n_2, :sd_ne, i)
se_2 = _PM.var(pm, n_2, :se_ne, i)
se_1 = _PM.var(pm, n_1, :se_ne, i)
z = _PM.var(pm, n_2, :z_strg_ne, i)
JuMP.@constraint(pm.model, se_2 == ((1-self_discharge_rate)^time_elapsed)*se_1 + time_elapsed*(charge_eff*sc_2 - sd_2/discharge_eff + inflow * z - outflow * z))
end
function constraint_storage_state_final(pm::_PM.AbstractPowerModel, n::Int, i::Int, energy)
se = _PM.var(pm, n, :se, i)
JuMP.@constraint(pm.model, se >= energy)
end
function constraint_storage_state_final_ne(pm::_PM.AbstractPowerModel, n::Int, i::Int, energy)
se = _PM.var(pm, n, :se_ne, i)
z = _PM.var(pm, n, :z_strg_ne, i)
JuMP.@constraint(pm.model, se >= energy * z)
end
function constraint_maximum_absorption_initial(pm::_PM.AbstractPowerModel, n::Int, i::Int, time_elapsed)
sc = _PM.var(pm, n, :sc, i)
e_abs = _PM.var(pm, n, :e_abs, i)
JuMP.@constraint(pm.model, e_abs == time_elapsed * sc)
end
function constraint_maximum_absorption_initial_ne(pm::_PM.AbstractPowerModel, n::Int, i::Int, time_elapsed)
sc = _PM.var(pm, n, :sc_ne, i)
e_abs = _PM.var(pm, n, :e_abs_ne, i)
JuMP.@constraint(pm.model, e_abs == time_elapsed * sc)
end
function constraint_maximum_absorption(pm::_PM.AbstractPowerModel, n_1::Int, n_2::Int, i::Int, time_elapsed)
sc_2 = _PM.var(pm, n_2, :sc, i)
e_abs_2 = _PM.var(pm, n_2, :e_abs, i)
e_abs_1 = _PM.var(pm, n_1, :e_abs, i)
JuMP.@constraint(pm.model, e_abs_2 - e_abs_1 == time_elapsed * sc_2)
end
function constraint_maximum_absorption_ne(pm::_PM.AbstractPowerModel, n_1::Int, n_2::Int, i::Int, time_elapsed)
sc_2 = _PM.var(pm, n_2, :sc_ne, i)
e_abs_2 = _PM.var(pm, n_2, :e_abs_ne, i)
e_abs_1 = _PM.var(pm, n_1, :e_abs_ne, i)
JuMP.@constraint(pm.model, e_abs_2 - e_abs_1 == time_elapsed * sc_2)
end
function constraint_storage_excl_slack(pm::_PM.AbstractPowerModel, n::Int, i::Int)
sc = _PM.var(pm, n, :sc, i)
sd = _PM.var(pm, n, :sd, i)
sc_max = JuMP.upper_bound(sc)
sd_max = JuMP.upper_bound(sd)
s_bound = max(sc_max, sd_max)
JuMP.@constraint(pm.model, sc + sd <= s_bound)
end
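# Note: since sc, sd >= 0 with individual upper bounds, the constraint
# sc + sd <= max(sc_max, sd_max) rules out simultaneous charging and
# discharging at full rate without introducing binary variables; it acts as a
# linear relaxation of the complementarity condition sc*sd == 0.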
function constraint_storage_excl_slack_ne(pm::_PM.AbstractPowerModel, n::Int, i::Int)
sc = _PM.var(pm, n, :sc_ne, i)
sd = _PM.var(pm, n, :sd_ne, i)
sc_max = JuMP.upper_bound(sc)
sd_max = JuMP.upper_bound(sd)
s_bound = max(sc_max, sd_max)
JuMP.@constraint(pm.model, sc + sd <= s_bound)
end
function constraint_storage_bounds_ne(pm::_PM.AbstractPowerModel, n::Int, i::Int)
se = _PM.var(pm, n, :se_ne, i)
sc = _PM.var(pm, n, :sc_ne, i)
sd = _PM.var(pm, n, :sd_ne, i)
ps = _PM.var(pm, n, :ps_ne, i)
qs = _PM.var(pm, n, :qs_ne, i)
z = _PM.var(pm, n, :z_strg_ne, i)
se_min = JuMP.lower_bound(se)
se_max = JuMP.upper_bound(se)
sc_min = JuMP.lower_bound(sc)
sc_max = JuMP.upper_bound(sc)
sd_min = JuMP.lower_bound(sd)
sd_max = JuMP.upper_bound(sd)
ps_min = JuMP.lower_bound(ps)
ps_max = JuMP.upper_bound(ps)
qs_min = JuMP.lower_bound(qs)
qs_max = JuMP.upper_bound(qs)
JuMP.@constraint(pm.model, se <= se_max * z)
JuMP.@constraint(pm.model, se >= se_min * z)
JuMP.@constraint(pm.model, sc <= sc_max * z)
JuMP.@constraint(pm.model, sc >= sc_min * z)
JuMP.@constraint(pm.model, sd <= sd_max * z)
JuMP.@constraint(pm.model, sd >= sd_min * z)
JuMP.@constraint(pm.model, ps <= ps_max * z)
JuMP.@constraint(pm.model, ps >= ps_min * z)
JuMP.@constraint(pm.model, qs <= qs_max * z)
JuMP.@constraint(pm.model, qs >= qs_min * z)
end
function constraint_storage_bounds_ne(pm::_PM.AbstractActivePowerModel, n::Int, i::Int)
se = _PM.var(pm, n, :se_ne, i)
sc = _PM.var(pm, n, :sc_ne, i)
sd = _PM.var(pm, n, :sd_ne, i)
ps = _PM.var(pm, n, :ps_ne, i)
z = _PM.var(pm, n, :z_strg_ne, i)
se_min = JuMP.lower_bound(se)
se_max = JuMP.upper_bound(se)
sc_min = JuMP.lower_bound(sc)
sc_max = JuMP.upper_bound(sc)
sd_min = JuMP.lower_bound(sd)
sd_max = JuMP.upper_bound(sd)
ps_min = JuMP.lower_bound(ps)
ps_max = JuMP.upper_bound(ps)
JuMP.@constraint(pm.model, se <= se_max * z)
JuMP.@constraint(pm.model, se >= se_min * z)
JuMP.@constraint(pm.model, sc <= sc_max * z)
JuMP.@constraint(pm.model, sc >= sc_min * z)
JuMP.@constraint(pm.model, sd <= sd_max * z)
JuMP.@constraint(pm.model, sd >= sd_min * z)
JuMP.@constraint(pm.model, ps <= ps_max * z)
JuMP.@constraint(pm.model, ps >= ps_min * z)
end
function constraint_ne_storage_activation(pm::_PM.AbstractPowerModel, n::Int, i::Int, horizon::Vector{Int})
indicator = _PM.var(pm, n, :z_strg_ne, i)
investments = _PM.var.(Ref(pm), horizon, :z_strg_ne_investment, i)
JuMP.@constraint(pm.model, indicator == sum(investments))
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 15939 | # Transmission & distribution coupling functions
## Functions replacing PowerModels or InfrastructureModels functions
"""
    solve_model(t_data, d_data, t_model_type, d_model_type, optimizer, build_method; <keyword arguments>)

Build and solve a combined transmission & distribution (T&D) network model.
"""
function solve_model(
t_data::Dict{String,Any},
d_data::Vector{Dict{String,Any}},
t_model_type::Type{<:_PM.AbstractPowerModel},
d_model_type::Type{<:_PM.AbstractPowerModel},
optimizer::Union{JuMP.MOI.AbstractOptimizer, JuMP.MOI.OptimizerWithAttributes},
build_method::Function;
t_ref_extensions::Vector{<:Function} = Function[],
d_ref_extensions::Vector{<:Function} = Function[],
t_solution_processors::Vector{<:Function} = Function[],
d_solution_processors::Vector{<:Function} = Function[],
t_setting::Dict{String,<:Any} = Dict{String,Any}(),
d_setting::Dict{String,<:Any} = Dict{String,Any}(),
direct_model = false,
kwargs...
)
start_time = time()
number_of_nws = dim_length(t_data)
number_of_dist_networks = length(d_data)
# Check that transmission and distribution network ids are the same
nw_id_set = Set(id for (id,nw) in t_data["nw"])
for data in d_data
if Set(id for (id,nw) in data["nw"]) ≠ nw_id_set
Memento.error(_LOGGER, "Networks in transmission and distribution data dictionaries must have the same IDs.")
end
end
t_data = deepcopy(t_data)
# Merge distribution networks
d_data_merged = deepcopy(first(d_data))
shift_nws!(d_data_merged)
add_dimension!(d_data_merged, :sub_nw, Dict(1 => Dict{String,Any}("t_bus"=>d_data_merged["t_bus"])))
delete!(d_data_merged, "t_bus")
for data in Iterators.drop(d_data, 1)
data = deepcopy(data)
shift_nws!(data, dim_length(d_data_merged)+number_of_nws)
add_dimension!(data, :sub_nw, Dict(1 => Dict{String,Any}("t_bus"=>data["t_bus"])))
delete!(data, "t_bus")
merge_multinetworks!(d_data_merged, data, :sub_nw)
end
t_gens = _add_td_coupling_generators!(t_data, d_data_merged)
# Instantiate models
start_time_instantiate = time()
if direct_model
t_pm, d_pm = instantiate_model(t_data, d_data_merged, t_model_type, d_model_type, build_method; t_ref_extensions, d_ref_extensions, t_setting, d_setting, jump_model=JuMP.direct_model(optimizer), kwargs...)
else
t_pm, d_pm = instantiate_model(t_data, d_data_merged, t_model_type, d_model_type, build_method; t_ref_extensions, d_ref_extensions, t_setting, d_setting, kwargs...)
end
Memento.debug(_LOGGER, "combined T&D model build time: $(time() - start_time_instantiate)")
start_time_optimize = time()
# Solve the optimization model and store the transmission result.
t_result = _IM.optimize_model!(t_pm; optimizer, solution_processors=t_solution_processors)
# Build the distribution result using the same model as above.
d_result = _IM.build_result(d_pm, t_result["solve_time"]; solution_processors=d_solution_processors)
# The asymmetric code above for building results produces inaccurate debugging messages;
# this behavior can be fixed by writing a custom optimize_model!() that takes 2 models.
# Remove coupling generators from transmission solution.
if haskey(t_result["solution"], "nw") # The "nw" key exists only if the problem has been solved to optimality.
for nw in values(t_result["solution"]["nw"])
for t_gen in t_gens
delete!(nw["gen"], string(t_gen))
end
end
end
# Subdivide distribution result.
if haskey(d_result["solution"], "nw") # The "nw" key exists only if the problem has been solved to optimality.
d_nw_merged = d_result["solution"]["nw"]
d_sol = Vector{Dict{String,Any}}(undef,number_of_dist_networks)
d_sol_template = filter(pair->pair.first≠"nw", d_result["solution"])
for s in 1:number_of_dist_networks
d_sol[s] = copy(d_sol_template)
nw = d_sol[s]["nw"] = Dict{String,Any}()
for n in nw_ids(t_data)
nw["$n"] = d_nw_merged["$(s*number_of_nws+n)"]
end
end
else
d_sol = Dict{String,Any}()
end
Memento.debug(_LOGGER, "combined T&D model solution time: $(time() - start_time_optimize)")
result = t_result
result["t_solution"] = t_result["solution"]
delete!(result, "solution")
result["d_solution"] = d_sol
result["solve_time"] = time()-start_time
return result
end
"""
    instantiate_model(t_data, d_data, t_model_type, d_model_type, build_method; <keyword arguments>)

Instantiate transmission and distribution PowerModels structs that share a single JuMP model, then build the combined model.
"""
function instantiate_model(
t_data::Dict{String,Any},
d_data::Dict{String,Any},
t_model_type::Type{<:_PM.AbstractPowerModel},
d_model_type::Type{<:_PM.AbstractPowerModel},
build_method::Function;
t_ref_extensions::Vector{<:Function} = Function[],
d_ref_extensions::Vector{<:Function} = Function[],
t_setting::Dict{String,<:Any} = Dict{String,Any}(),
d_setting::Dict{String,<:Any} = Dict{String,Any}(),
kwargs...
)
# Instantiate the transmission PowerModels struct, without building the model.
t_pm = _PM.instantiate_model(t_data, t_model_type, method->nothing; ref_extensions=t_ref_extensions, setting=t_setting, kwargs...)
# Instantiate the distribution PowerModels struct, without building the model.
# Distribution and transmission structs share the same JuMP model. The `jump_model` parameter is used by _IM.InitializeInfrastructureModel().
# `jump_model` comes after `kwargs...` to take precedence in cases where it is also defined in `kwargs...`.
d_pm = _PM.instantiate_model(d_data, d_model_type, method->nothing; ref_extensions=d_ref_extensions, setting=d_setting, kwargs..., jump_model=t_pm.model)
# Build the combined model.
build_method(t_pm, d_pm)
return t_pm, d_pm
end
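# Usage sketch (hypothetical data and optimizer, assuming `t_data` and each
# element of `d_data` are multinetwork dictionaries with matching network ids):
#
#     result = solve_model(
#         t_data, d_data,
#         _PM.DCPPowerModel,  # transmission formulation
#         BFARadPowerModel,   # distribution formulation
#         optimizer, build_method;
#         d_solution_processors = [sol_td_coupling!],
#     )
#
# Both structs share one JuMP model, so a single solver call optimizes the
# combined T&D problem; results are reported under "t_solution" and
# "d_solution".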
## Functions that manipulate data structures
"""
_add_td_coupling_generators!(t_data, d_data)
Add and set T&D coupling generators.
For each network `n` in `d_data`:
- set the cost of distribution coupling generator to 0;
- add the transmission coupling generator;
- add to `dim_prop(d_data, n, :sub_nw)` the following entries:
- `d_gen`: the id of the distribution coupling generator;
- `t_gen`: the id of the transmission coupling generator.
Return a vector containing the ids of `t_gen`.
# Prerequisites
- Networks in `d_data` have exactly 1 reference bus and 1 generator connected to it.
- A dimension `sub_nw` is defined for `d_data` and a property `t_bus` defines the id of the
transmission network bus where the distribution network is attached.
"""
function _add_td_coupling_generators!(t_data::Dict{String,Any}, d_data::Dict{String,Any})
t_nw_ids = nw_ids(t_data)
if !_check_constant_number_of_generators(t_data, t_nw_ids)
Memento.error(_LOGGER, "The number of generators in transmission network is not constant.")
end
number_of_distribution_networks = dim_length(d_data, :sub_nw)
t_gens = Vector{Int}(undef, number_of_distribution_networks)
for s in 1:number_of_distribution_networks
d_nw_ids = nw_ids(d_data; sub_nw=s)
if !_check_constant_number_of_generators(d_data, d_nw_ids)
Memento.error(_LOGGER, "The number of generators in distribution network $s is not constant.")
end
sub_nw = dim_prop(d_data, :sub_nw, s)
# Get distribution coupling generator id and store it in `sub_nw` properties
d_gen_idx = sub_nw["d_gen"] = _get_reference_gen(d_data, s)
t_bus = sub_nw["t_bus"]
# Compute transmission generator id
t_gen_idx = length(first(values(t_data["nw"]))["gen"]) + 1 # Assumes that generators have contiguous indices starting from 1, as they should
sub_nw["t_gen"] = t_gen_idx
t_gens[s] = t_gen_idx
for (t_n, d_n) in zip(t_nw_ids, d_nw_ids)
t_nw = t_data["nw"]["$t_n"]
d_nw = d_data["nw"]["$d_n"]
# Set distribution coupling generator parameters
d_gen = d_nw["gen"]["$d_gen_idx"]
d_gen["dispatchable"] = true
d_gen["model"] = 2 # Cost model (2 => polynomial cost)
d_gen["ncost"] = 0 # Number of cost coefficients
d_gen["cost"] = Any[]
# Check that t_bus exists
if !haskey(t_nw["bus"], "$t_bus")
Memento.error(_LOGGER, "Bus $t_bus does not exist in nw $t_n of transmission network data.")
end
# Add transmission coupling generator
mva_base_ratio = d_nw["baseMVA"] / t_nw["baseMVA"]
t_gen = t_nw["gen"]["$t_gen_idx"] = Dict{String,Any}(
"gen_bus" => t_bus,
"index" => t_gen_idx,
"dispatchable" => true,
"pmin" => -d_gen["pmax"] * mva_base_ratio,
"pmax" => -d_gen["pmin"] * mva_base_ratio,
"gen_status" => 1,
"model" => 2, # Cost model (2 => polynomial cost)
"ncost" => 0, # Number of cost coefficients
"cost" => Any[]
)
if haskey(d_gen, "qmax")
t_gen["qmin"] = -d_gen["qmax"] * mva_base_ratio
t_gen["qmax"] = -d_gen["qmin"] * mva_base_ratio
end
end
end
return t_gens
end
## Utility functions
function _check_constant_number_of_generators(data::Dict{String,Any}, nws::Vector{Int})
data_nw = data["nw"]
first_n, rest = Iterators.peel(nws)
first_n_gen_length = length(data_nw["$first_n"]["gen"])
for n in rest
if length(data_nw["$n"]["gen"]) ≠ first_n_gen_length
return false
end
end
return true
end
function _get_reference_gen(data::Dict{String,Any}, s::Int=1)
nws = nw_ids(data; sub_nw=s)
first_nw = data["nw"][ string(first(nws)) ]
# Get the id of the only reference bus
ref_buses = [b for (b,bus) in first_nw["bus"] if bus["bus_type"] == 3]
if length(ref_buses) != 1
Memento.error(_LOGGER, "Distribution network must have 1 reference bus, but $(length(ref_buses)) are present.")
end
ref_bus = parse(Int, first(ref_buses))
# Get the id of the only generator connected to the reference bus
ref_gens = [g for (g,gen) in first_nw["gen"] if gen["gen_bus"] == ref_bus]
if length(ref_gens) ≠ 1
Memento.error(_LOGGER, "Distribution network must have 1 generator connected to reference bus, but $(length(ref_gens)) are present.")
end
return parse(Int, first(ref_gens))
end
## Solution processors
"""
sol_td_coupling!(pm, solution)
Add T&D coupling data to `solution` and remove the fake generator from `solution`.
Report in `solution["td_coupling"]["p"]` and `solution["td_coupling"]["q"]` the active and
reactive power that distribution network `pm` exchanges with the transmission network
(positive if from transmission to distribution) in units of `baseMVA` of `pm`.
Delete from `solution` the generator representing the transmission network, so that only the
actual generators remain in `solution["gen"]`.
"""
function sol_td_coupling!(pm::_PM.AbstractBFModel, solution::Dict{String,Any})
solution = _PM.get_pm_data(solution)
if haskey(solution, "nw")
nws_sol = solution["nw"]
else
nws_sol = Dict("0" => solution)
end
for (nw, nw_sol) in nws_sol
n = parse(Int, nw)
if !(haskey(dim_prop(pm), :sub_nw) && haskey(dim_prop(pm, n, :sub_nw), "d_gen"))
Memento.error(_LOGGER, "T&D coupling data is missing from the model of nw $nw.")
end
d_gen = string(dim_prop(pm, n, :sub_nw, "d_gen"))
nw_sol["td_coupling"] = Dict{String,Any}()
nw_sol["td_coupling"]["p"] = nw_sol["gen"][d_gen]["pg"]
nw_sol["td_coupling"]["q"] = nw_sol["gen"][d_gen]["qg"]
delete!(nw_sol["gen"], d_gen)
end
end
## Functions that group constraint templates, provided for convenience
"""
Connect each distribution nw to the corresponding transmission nw and apply coupling constraints.
The coupling constraint is applied to the two generators that each distribution nw indicates.
"""
function constraint_td_coupling(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractBFModel)
t_nws = nw_ids(t_pm)
for s in keys(dim_prop(d_pm, :sub_nw))
d_nws = nw_ids(d_pm; sub_nw = s)
for i in eachindex(t_nws)
t_nw = t_nws[i]
d_nw = d_nws[i]
constraint_td_coupling_power_balance(t_pm, d_pm, t_nw, d_nw)
end
end
end
## Constraint templates
"""
State the power conservation between a distribution nw and the corresponding transmission nw.
"""
function constraint_td_coupling_power_balance(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractBFModel, t_nw::Int, d_nw::Int)
sub_nw = dim_prop(d_pm, d_nw, :sub_nw)
t_gen = sub_nw["t_gen"]
d_gen = sub_nw["d_gen"]
t_mva_base = _PM.ref(t_pm, t_nw, :baseMVA)
d_mva_base = _PM.ref(d_pm, d_nw, :baseMVA)
constraint_td_coupling_power_balance_active(t_pm, d_pm, t_nw, d_nw, t_gen, d_gen, t_mva_base, d_mva_base)
constraint_td_coupling_power_balance_reactive(t_pm, d_pm, t_nw, d_nw, t_gen, d_gen, t_mva_base, d_mva_base)
end
"""
Apply bounds on the reactive power exchange at the point of common coupling (PCC) of a distribution nw, expressed as an allowable fraction of the rated apparent power.
"""
function constraint_td_coupling_power_reactive_bounds(d_pm::_PM.AbstractBFModel, qs_ratio_bound::Real; nw::Int=_PM.nw_id_default)
sub_nw = dim_prop(d_pm, nw, :sub_nw)
d_gen = sub_nw["d_gen"]
constraint_td_coupling_power_reactive_bounds(d_pm, nw, d_gen, qs_ratio_bound)
end
## Constraint implementations
""
function constraint_td_coupling_power_balance_active(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractBFModel, t_nw::Int, d_nw::Int, t_gen::Int, d_gen::Int, t_mva_base::Real, d_mva_base::Real)
t_p_in = _PM.var(t_pm, t_nw, :pg, t_gen)
d_p_in = _PM.var(d_pm, d_nw, :pg, d_gen)
JuMP.@constraint(t_pm.model, t_mva_base*t_p_in + d_mva_base*d_p_in == 0.0) # t_pm.model == d_pm.model
end
""
function constraint_td_coupling_power_balance_reactive(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractBFModel, t_nw::Int, d_nw::Int, t_gen::Int, d_gen::Int, t_mva_base::Real, d_mva_base::Real)
t_q_in = _PM.var(t_pm, t_nw, :qg, t_gen)
d_q_in = _PM.var(d_pm, d_nw, :qg, d_gen)
JuMP.@constraint(t_pm.model, t_mva_base*t_q_in + d_mva_base*d_q_in == 0.0) # t_pm.model == d_pm.model
end
"Nothing to do because the transmission network model does not support reactive power."
function constraint_td_coupling_power_balance_reactive(t_pm::_PM.AbstractActivePowerModel, d_pm::_PM.AbstractBFModel, t_nw::Int, d_nw::Int, t_gen::Int, d_gen::Int, t_mva_base::Real, d_mva_base::Real)
end
""
function constraint_td_coupling_power_reactive_bounds(d_pm::_PM.AbstractBFModel, d_nw::Int, d_gen::Int, qs_ratio_bound::Real)
# Compute the rated apparent power of the distribution network, based on the rated power of
# existing and candidate branches connected to its reference bus. This value depends on the
# indicator variables of both existing (if applicable) and candidate branches (i.e. whether they
# are built or not).
if haskey(_PM.var(d_pm, d_nw), :z_branch) # Some `branch`es can be replaced by `ne_branch`es
z_branch = _PM.var(d_pm, d_nw, :z_branch)
s_rate = (
sum(branch["rate_a"] * get(z_branch, b, 1.0) for (b,branch) in _PM.ref(d_pm, d_nw, :frb_branch))
+ sum(branch["rate_a"] * _PM.var(d_pm, d_nw, :branch_ne, b) for (b,branch) in _PM.ref(d_pm, d_nw, :frb_ne_branch))
)
else # No `ne_branch`es at all
s_rate = sum(branch["rate_a"] for (b,branch) in _PM.ref(d_pm, d_nw, :frb_branch))
end
q = _PM.var(d_pm, d_nw, :qg, d_gen) # Exchanged reactive power (positive if from T to D)
JuMP.@constraint(d_pm.model, q <= qs_ratio_bound*s_rate)
JuMP.@constraint(d_pm.model, q >= -qs_ratio_bound*s_rate)
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 869 | # Extends PowerModels/src/core/types.jl
##### Linear Approximations #####
"""
Linearized AC branch flow model for radial networks.
Variables:
- squared voltage magnitude;
- branch active power;
- branch reactive power.
Properties:
- same voltage angle for all buses;
- lossless.
Differences with respect to `BFAPowerModel`:
- shunt admittances of the branches are neglected;
- the complex power in the thermal limit constraints of the branches is limited by an octagon
instead of a circle, so as to keep the model linear.
"""
# Using `@im_fields` instead of `@pm_fields` because the latter requires to be explicitly
# qualified (i.e. prepend `PowerModels.` instead of `_PM.`). The two macros are equal at the
# moment, but this may need to be changed if they will differ at some point.
mutable struct BFARadPowerModel <: _PM.AbstractBFAModel _PM.@im_fields end
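# Usage sketch (hypothetical): BFARadPowerModel can be used wherever a
# PowerModels formulation type is expected, e.g.
#
#     pm = _PM.instantiate_model(data, BFARadPowerModel, build_method)
#
# or as the distribution model type (`d_model_type`) in the combined T&D
# solve_model function.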
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 2246 | # To be used instead of _PM.variable_ne_branch_indicator() - supports deduplication of variables
function variable_ne_branch_indicator(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, relax::Bool=false, report::Bool=true)
first_n = first_id(pm, nw, :hour, :scenario)
if nw == first_n
if !relax
z_branch_ne = _PM.var(pm, nw)[:branch_ne] = JuMP.@variable(pm.model,
[l in _PM.ids(pm, nw, :ne_branch)], base_name="$(nw)_branch_ne",
binary = true,
start = _PM.comp_start_value(_PM.ref(pm, nw, :ne_branch, l), "branch_tnep_start", 1.0)
)
else
z_branch_ne = _PM.var(pm, nw)[:branch_ne] = JuMP.@variable(pm.model,
[l in _PM.ids(pm, nw, :ne_branch)], base_name="$(nw)_branch_ne",
lower_bound = 0.0,
upper_bound = 1.0,
start = _PM.comp_start_value(_PM.ref(pm, nw, :ne_branch, l), "branch_tnep_start", 1.0)
)
end
else
z_branch_ne = _PM.var(pm, nw)[:branch_ne] = _PM.var(pm, first_n)[:branch_ne]
end
report && _PM.sol_component_value(pm, nw, :ne_branch, :built, _PM.ids(pm, nw, :ne_branch), z_branch_ne)
end
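# Deduplication example: with 24 hourly networks per scenario, first_id
# returns the id of the first hour, so the JuMP variable is created once and
# the remaining 23 networks alias it. Each candidate branch thus carries a
# single build decision across all operating conditions (assuming dimensions
# :hour and :scenario are defined on the model).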
function variable_ne_branch_investment(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, relax::Bool=false, report::Bool=true)
first_n = first_id(pm, nw, :hour, :scenario)
if nw == first_n
if !relax
investment = _PM.var(pm, nw)[:branch_ne_investment] = JuMP.@variable(pm.model,
[l in _PM.ids(pm, nw, :ne_branch)], base_name="$(nw)_branch_ne_investment",
binary = true,
start = 0
)
else
investment = _PM.var(pm, nw)[:branch_ne_investment] = JuMP.@variable(pm.model,
[l in _PM.ids(pm, nw, :ne_branch)], base_name="$(nw)_branch_ne_investment",
lower_bound = 0.0,
upper_bound = 1.0,
start = 0
)
end
else
investment = _PM.var(pm, nw)[:branch_ne_investment] = _PM.var(pm, first_n)[:branch_ne_investment]
end
report && _PM.sol_component_value(pm, nw, :ne_branch, :investment, _PM.ids(pm, nw, :ne_branch), investment)
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 3310 | # To be used instead of _PMACDC.variable_converter_ne() - supports deduplication of variables
function variable_ne_converter_indicator(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, relax::Bool=false, report::Bool=true)
first_n = first_id(pm, nw, :hour, :scenario)
if nw == first_n
if !relax
Z_dc_conv_ne = _PM.var(pm, nw)[:conv_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :convdc_ne)], base_name="$(nw)_conv_ne",
binary = true,
start = _PM.comp_start_value(_PM.ref(pm, nw, :convdc_ne, i), "convdc_tnep_start", 1.0)
)
else
Z_dc_conv_ne = _PM.var(pm, nw)[:conv_ne] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :convdc_ne)], base_name="$(nw)_conv_ne",
lower_bound = 0,
upper_bound = 1,
start = _PM.comp_start_value(_PM.ref(pm, nw, :convdc_ne, i), "convdc_tnep_start", 1.0)
)
end
else
Z_dc_conv_ne = _PM.var(pm, nw)[:conv_ne] = _PM.var(pm, first_n)[:conv_ne]
end
report && _PM.sol_component_value(pm, nw, :convdc_ne, :isbuilt, _PM.ids(pm, nw, :convdc_ne), Z_dc_conv_ne)
end
function variable_ne_converter_investment(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, relax::Bool=false, report::Bool=true)
first_n = first_id(pm, nw, :hour, :scenario)
if nw == first_n
if !relax
investment = _PM.var(pm, nw)[:conv_ne_investment] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :convdc_ne)], base_name="$(nw)_conv_ne_investment",
binary = true,
start = 0
)
else
investment = _PM.var(pm, nw)[:conv_ne_investment] = JuMP.@variable(pm.model,
[i in _PM.ids(pm, nw, :convdc_ne)], base_name="$(nw)_conv_ne_investment",
lower_bound = 0,
upper_bound = 1,
start = 0
)
end
else
investment = _PM.var(pm, nw)[:conv_ne_investment] = _PM.var(pm, first_n)[:conv_ne_investment]
end
report && _PM.sol_component_value(pm, nw, :convdc_ne, :investment, _PM.ids(pm, nw, :convdc_ne), investment)
end
# To be used instead of _PMACDC.variable_dc_converter_ne() - supports deduplication of variables
function variable_dc_converter_ne(pm::_PM.AbstractPowerModel; investment::Bool=true, kwargs...)
_PMACDC.variable_conv_tranformer_flow_ne(pm; kwargs...)
_PMACDC.variable_conv_reactor_flow_ne(pm; kwargs...)
variable_ne_converter_indicator(pm; kwargs..., relax=true) # FlexPlan version: replaces _PMACDC.variable_converter_ne().
investment && variable_ne_converter_investment(pm; kwargs...)
_PMACDC.variable_converter_active_power_ne(pm; kwargs...)
_PMACDC.variable_converter_reactive_power_ne(pm; kwargs...)
_PMACDC.variable_acside_current_ne(pm; kwargs...)
_PMACDC.variable_dcside_power_ne(pm; kwargs...)
# _PMACDC.variable_converter_firing_angle_ne(pm; kwargs...)
_PMACDC.variable_converter_filter_voltage_ne(pm; kwargs...)
_PMACDC.variable_converter_internal_voltage_ne(pm; kwargs...)
#
_PMACDC.variable_converter_to_grid_active_power_ne(pm; kwargs...)
_PMACDC.variable_converter_to_grid_reactive_power_ne(pm; kwargs...)
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 2592 | # To be used instead of _PMACDC.variable_branch_ne() - supports deduplication of variables
function variable_ne_branchdc_indicator(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, relax::Bool=false, report::Bool=true)
first_n = first_id(pm, nw, :hour, :scenario)
if nw == first_n
if !relax
Z_dc_branch_ne = _PM.var(pm, nw)[:branchdc_ne] = JuMP.@variable(pm.model, # branchdc_ne denotes candidate DC branches; branch_ne is the analogous name in PowerModels
[l in _PM.ids(pm, nw, :branchdc_ne)], base_name="$(nw)_branchdc_ne",
binary = true,
start = _PM.comp_start_value(_PM.ref(pm, nw, :branchdc_ne, l), "branchdc_tnep_start", 0.0)
)
else
Z_dc_branch_ne = _PM.var(pm, nw)[:branchdc_ne] = JuMP.@variable(pm.model,
[l in _PM.ids(pm, nw, :branchdc_ne)], base_name="$(nw)_branchdc_ne",
lower_bound = 0,
upper_bound = 1,
start = _PM.comp_start_value(_PM.ref(pm, nw, :branchdc_ne, l), "branchdc_tnep_start", 0.0)
)
end
else
Z_dc_branch_ne = _PM.var(pm, nw)[:branchdc_ne] = _PM.var(pm, first_n)[:branchdc_ne]
end
report && _PM.sol_component_value(pm, nw, :branchdc_ne, :isbuilt, _PM.ids(pm, nw, :branchdc_ne), Z_dc_branch_ne)
end
function variable_ne_branchdc_investment(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default, relax::Bool=false, report::Bool=true)
first_n = first_id(pm, nw, :hour, :scenario)
if nw == first_n
if !relax
            investment = _PM.var(pm, nw)[:branchdc_ne_investment] = JuMP.@variable(pm.model, # branch_ne is also a name in PowerModels; branchdc_ne denotes candidate DC branches
[l in _PM.ids(pm, nw, :branchdc_ne)], base_name="$(nw)_branchdc_ne_investment",
binary = true,
start = 0
)
else
            investment = _PM.var(pm, nw)[:branchdc_ne_investment] = JuMP.@variable(pm.model, # branch_ne is also a name in PowerModels; branchdc_ne denotes candidate DC branches
[l in _PM.ids(pm, nw, :branchdc_ne)], base_name="$(nw)_branchdc_ne_investment",
lower_bound = 0,
upper_bound = 1,
start = 0
)
end
else
investment = _PM.var(pm, nw)[:branchdc_ne_investment] = _PM.var(pm, first_n)[:branchdc_ne_investment]
end
report && _PM.sol_component_value(pm, nw, :branchdc_ne, :investment, _PM.ids(pm, nw, :branchdc_ne), investment)
end
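The two functions above share a deduplication pattern: the JuMP variables are created only in the first network along the `:hour` and `:scenario` dimensions and aliased in every other network. A minimal sketch of that pattern on plain dictionaries — `first_id_of_group` and `get_or_alias!` are hypothetical stand-ins for the dimension-aware `first_id` logic used above, assuming a fixed number of networks per group:

```julia
# Hypothetical stand-in for first_id(pm, nw, :hour, :scenario): maps a network
# id to the first id of its group, assuming fixed-size groups.
first_id_of_group(nw, nws_per_group) = nws_per_group*div(nw - 1, nws_per_group) + 1

# Create a container once per group and alias it in the remaining networks,
# mimicking `_PM.var(pm, nw)[:x] = _PM.var(pm, first_n)[:x]` above.
function get_or_alias!(vars, nw; nws_per_group=3)
    first_n = first_id_of_group(nw, nws_per_group)
    if nw == first_n
        vars[nw] = Dict{Int,Float64}()  # fresh container for this group
    else
        vars[nw] = vars[first_n]        # alias: same object, no duplication
    end
    return vars[nw]
end

vars = Dict{Int,Dict{Int,Float64}}()
foreach(nw -> get_or_alias!(vars, nw), 1:6)
```

With six networks and three per group, networks 1–3 share one container and 4–6 another, so a single investment decision spans all operating periods of a group.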
# Extends PowerModels/src/form/bf.jl
## Variables
""
function variable_ne_branch_current(pm::_PM.AbstractBFModel; kwargs...)
variable_ne_buspair_current_magnitude_sqr(pm; kwargs...)
end
""
function variable_ne_buspair_current_magnitude_sqr(pm::_PM.AbstractBFAModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
end
## Constraint templates
"""
This constraint captures problem-agnostic constraints that are used to link
the model's current variables together, in addition to the standard problem
formulation constraints. The network expansion name (ne) indicates that the
currents in this constraint can be set to zero via an indicator variable.
"""
function constraint_ne_model_current(pm::_PM.AbstractPowerModel; nw::Int=_PM.nw_id_default)
constraint_ne_model_current(pm, nw)
end
""
function constraint_ne_power_losses(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :ne_branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
tm = branch["tap"]
g_sh_fr = branch["g_fr"]
g_sh_to = branch["g_to"]
b_sh_fr = branch["b_fr"]
b_sh_to = branch["b_to"]
vad_min = _PM.ref(pm, nw, :off_angmin)
vad_max = _PM.ref(pm, nw, :off_angmax)
constraint_ne_power_losses(pm, nw, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm, vad_min, vad_max)
end
"""
Defines voltage drop over a branch, linking from and to side voltage magnitude
"""
function constraint_ne_voltage_magnitude_difference(pm::_PM.AbstractPowerModel, i::Int; nw::Int=_PM.nw_id_default)
branch = _PM.ref(pm, nw, :ne_branch, i)
f_bus = branch["f_bus"]
t_bus = branch["t_bus"]
f_idx = (i, f_bus, t_bus)
t_idx = (i, t_bus, f_bus)
r = branch["br_r"]
x = branch["br_x"]
g_sh_fr = branch["g_fr"]
b_sh_fr = branch["b_fr"]
tm = branch["tap"]
constraint_ne_voltage_magnitude_difference(pm, nw, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm)
end
## Actual constraints
"""
do nothing, most models do not require any model-specific network expansion current constraints
"""
function constraint_ne_model_current(pm::_PM.AbstractPowerModel, n::Int)
end
""
function constraint_ne_voltage_magnitude_difference(pm::_PM.AbstractBFAModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm)
branch = _PM.ref(pm, n, :ne_branch, i)
fr_bus = _PM.ref(pm, n, :bus, f_bus)
to_bus = _PM.ref(pm, n, :bus, t_bus)
M_hi = fr_bus["vmax"]^2/tm^2 - to_bus["vmin"]^2
M_lo = -fr_bus["vmin"]^2/tm^2 + to_bus["vmax"]^2
p_fr = _PM.var(pm, n, :p_ne, f_idx)
q_fr = _PM.var(pm, n, :q_ne, f_idx)
w_fr = _PM.var(pm, n, :w, f_bus)
w_to = _PM.var(pm, n, :w, t_bus)
z = _PM.var(pm, n, :branch_ne, i)
JuMP.@constraint(pm.model, (w_fr/tm^2) - w_to <= 2*(r*p_fr + x*q_fr) + M_hi*(1-z) )
JuMP.@constraint(pm.model, (w_fr/tm^2) - w_to >= 2*(r*p_fr + x*q_fr) - M_lo*(1-z) )
end
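The pair of inequalities above is a standard big-M linking: with z = 1 they collapse to the BFA voltage-drop equation, while with z = 0 the bounds `M_hi` and `M_lo` make them non-binding for any feasible squared voltage magnitudes (the candidate branch then carries no flow, since its thermal limits are scaled by z). A small numeric check with assumed voltage limits, not taken from any case file:

```julia
# Toy check of the big-M bounds in constraint_ne_voltage_magnitude_difference.
vmax_fr, vmin_fr = 1.1, 0.9   # from-bus magnitude bounds (p.u., assumed)
vmax_to, vmin_to = 1.1, 0.9   # to-bus magnitude bounds (p.u., assumed)
tm = 1.0                      # tap ratio

M_hi =  vmax_fr^2/tm^2 - vmin_to^2
M_lo = -vmin_fr^2/tm^2 + vmax_to^2

# Extreme values of the voltage-drop expression w_fr/tm^2 - w_to over the
# feasible box of squared magnitudes: they never exceed [-M_lo, M_hi].
drop_max = vmax_fr^2/tm^2 - vmin_to^2
drop_min = vmin_fr^2/tm^2 - vmax_to^2
```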
# Linearized AC branch flow model for radial networks.
# Variables: squared voltage magnitude, active power, reactive power.
## Variables
""
function _PM.variable_bus_voltage(pm::BFARadPowerModel; kwargs...)
_PM.variable_bus_voltage_magnitude_sqr(pm; kwargs...)
_PM.variable_bus_voltage_angle(pm; kwargs...)
end
"Voltage angle of all buses is that of the reference bus"
function _PM.variable_bus_voltage_angle(pm::BFARadPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
report && _PM.sol_component_fixed(pm, nw, :bus, :va, _PM.ids(pm, nw, :bus), last(first(_PM.ref(pm,nw,:ref_buses)))["va"])
end
# Copied from _PM.variable_branch_power_real(pm::AbstractAPLossLessModels; nw::Int, bounded::Bool, report::Bool).
# Since this model is lossless, there is one active power variable per branch instead of two.
""
function _PM.variable_branch_power_real(pm::BFARadPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
p = _PM.var(pm, nw)[:p] = JuMP.@variable(pm.model,
[(l,i,j) in _PM.ref(pm, nw, :arcs_from)], base_name="$(nw)_p",
start = _PM.comp_start_value(_PM.ref(pm, nw, :branch, l), "p_start")
)
if bounded
flow_lb, flow_ub = _PM.ref_calc_branch_flow_bounds(_PM.ref(pm, nw, :branch), _PM.ref(pm, nw, :bus))
for arc in _PM.ref(pm, nw, :arcs_from)
l,i,j = arc
if !isinf(flow_lb[l])
JuMP.set_lower_bound(p[arc], flow_lb[l])
end
if !isinf(flow_ub[l])
JuMP.set_upper_bound(p[arc], flow_ub[l])
end
end
end
for (l,branch) in _PM.ref(pm, nw, :branch)
if haskey(branch, "pf_start")
f_idx = (l, branch["f_bus"], branch["t_bus"])
JuMP.set_start_value(p[f_idx], branch["pf_start"])
end
end
# this explicit type erasure is necessary
p_expr = Dict{Any,Any}( ((l,i,j), p[(l,i,j)]) for (l,i,j) in _PM.ref(pm, nw, :arcs_from) )
p_expr = merge(p_expr, Dict( ((l,j,i), -1.0*p[(l,i,j)]) for (l,i,j) in _PM.ref(pm, nw, :arcs_from)))
_PM.var(pm, nw)[:p] = p_expr
report && _PM.sol_component_value_edge(pm, nw, :branch, :pf, :pt, _PM.ref(pm, nw, :arcs_from), _PM.ref(pm, nw, :arcs_to), p_expr)
end
# Copied from _PM.variable_ne_branch_power_real(pm::AbstractAPLossLessModels; nw::Int, bounded::Bool, report::Bool)
# and improved by comparing with _PM.variable_branch_power_real(pm::AbstractAPLossLessModels; nw::Int, bounded::Bool, report::Bool).
# Since this model is lossless, there is one active power variable per branch instead of two.
""
function _PM.variable_ne_branch_power_real(pm::BFARadPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
p_ne = _PM.var(pm, nw)[:p_ne] = JuMP.@variable(pm.model,
[(l,i,j) in _PM.ref(pm, nw, :ne_arcs_from)], base_name="$(nw)_p_ne",
start = _PM.comp_start_value(_PM.ref(pm, nw, :ne_branch, l), "p_start")
)
if bounded
flow_lb, flow_ub = _PM.ref_calc_branch_flow_bounds(_PM.ref(pm, nw, :ne_branch), _PM.ref(pm, nw, :bus))
for arc in _PM.ref(pm, nw, :ne_arcs_from)
l,i,j = arc
if !isinf(flow_lb[l])
JuMP.set_lower_bound(p_ne[arc], flow_lb[l])
end
if !isinf(flow_ub[l])
JuMP.set_upper_bound(p_ne[arc], flow_ub[l])
end
end
end
# this explicit type erasure is necessary
p_ne_expr = Dict{Any,Any}( ((l,i,j), p_ne[(l,i,j)]) for (l,i,j) in _PM.ref(pm, nw, :ne_arcs_from) )
p_ne_expr = merge(p_ne_expr, Dict(((l,j,i), -1.0*p_ne[(l,i,j)]) for (l,i,j) in _PM.ref(pm, nw, :ne_arcs_from)))
_PM.var(pm, nw)[:p_ne] = p_ne_expr
report && _PM.sol_component_value_edge(pm, nw, :ne_branch, :pf, :pt, _PM.ref(pm, nw, :ne_arcs_from), _PM.ref(pm, nw, :ne_arcs_to), p_ne_expr)
end
# Adapted from variable_branch_power_real(pm::BFARadPowerModel; nw::Int, bounded::Bool, report::Bool)
""
function _PM.variable_branch_power_imaginary(pm::BFARadPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
q = _PM.var(pm, nw)[:q] = JuMP.@variable(pm.model,
[(l,i,j) in _PM.ref(pm, nw, :arcs_from)], base_name="$(nw)_q",
start = _PM.comp_start_value(_PM.ref(pm, nw, :branch, l), "q_start")
)
if bounded
flow_lb, flow_ub = _PM.ref_calc_branch_flow_bounds(_PM.ref(pm, nw, :branch), _PM.ref(pm, nw, :bus))
for arc in _PM.ref(pm, nw, :arcs_from)
l,i,j = arc
if !isinf(flow_lb[l])
JuMP.set_lower_bound(q[arc], flow_lb[l])
end
if !isinf(flow_ub[l])
JuMP.set_upper_bound(q[arc], flow_ub[l])
end
end
end
for (l,branch) in _PM.ref(pm, nw, :branch)
if haskey(branch, "qf_start")
f_idx = (l, branch["f_bus"], branch["t_bus"])
JuMP.set_start_value(q[f_idx], branch["qf_start"])
end
end
# this explicit type erasure is necessary
q_expr = Dict{Any,Any}( ((l,i,j), q[(l,i,j)]) for (l,i,j) in _PM.ref(pm, nw, :arcs_from) )
q_expr = merge(q_expr, Dict( ((l,j,i), -1.0*q[(l,i,j)]) for (l,i,j) in _PM.ref(pm, nw, :arcs_from)))
_PM.var(pm, nw)[:q] = q_expr
report && _PM.sol_component_value_edge(pm, nw, :branch, :qf, :qt, _PM.ref(pm, nw, :arcs_from), _PM.ref(pm, nw, :arcs_to), q_expr)
end
# Adapted from variable_ne_branch_power_real(pm::BFARadPowerModel; nw::Int, bounded::Bool, report::Bool)
""
function _PM.variable_ne_branch_power_imaginary(pm::BFARadPowerModel; nw::Int=_PM.nw_id_default, bounded::Bool=true, report::Bool=true)
q_ne = _PM.var(pm, nw)[:q_ne] = JuMP.@variable(pm.model,
[(l,i,j) in _PM.ref(pm, nw, :ne_arcs_from)], base_name="$(nw)_q_ne",
start = _PM.comp_start_value(_PM.ref(pm, nw, :ne_branch, l), "q_start")
)
if bounded
flow_lb, flow_ub = _PM.ref_calc_branch_flow_bounds(_PM.ref(pm, nw, :ne_branch), _PM.ref(pm, nw, :bus))
for arc in _PM.ref(pm, nw, :ne_arcs_from)
l,i,j = arc
if !isinf(flow_lb[l])
JuMP.set_lower_bound(q_ne[arc], flow_lb[l])
end
if !isinf(flow_ub[l])
JuMP.set_upper_bound(q_ne[arc], flow_ub[l])
end
end
end
# this explicit type erasure is necessary
q_ne_expr = Dict{Any,Any}( ((l,i,j), q_ne[(l,i,j)]) for (l,i,j) in _PM.ref(pm, nw, :ne_arcs_from) )
q_ne_expr = merge(q_ne_expr, Dict(((l,j,i), -1.0*q_ne[(l,i,j)]) for (l,i,j) in _PM.ref(pm, nw, :ne_arcs_from)))
_PM.var(pm, nw)[:q_ne] = q_ne_expr
report && _PM.sol_component_value_edge(pm, nw, :ne_branch, :qf, :qt, _PM.ref(pm, nw, :ne_arcs_from), _PM.ref(pm, nw, :ne_arcs_to), q_ne_expr)
end
## Constraints
"Nothing to do, this model is lossless"
function _PM.constraint_power_losses(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm)
end
"Nothing to do, this model is lossless"
function constraint_power_losses_on_off(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm, vad_min, vad_max)
end
"Nothing to do, this model is lossless"
function constraint_power_losses_frb(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm)
end
"Nothing to do, this model is lossless"
function constraint_power_losses_frb_on_off(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm, vad_min, vad_max)
end
"Nothing to do, this model is lossless"
function constraint_power_losses_oltc(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to)
end
"Nothing to do, this model is lossless"
function constraint_power_losses_oltc_on_off(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, vad_min, vad_max)
end
"Nothing to do, this model is lossless"
function constraint_ne_power_losses(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm, vad_min, vad_max)
end
"Nothing to do, this model is lossless"
function constraint_ne_power_losses_parallel(pm::BFARadPowerModel, n::Int, br_idx_e, br_idx_c, f_bus, t_bus, f_idx_c, t_idx_c, r_e, x_e, g_sh_fr_e, g_sh_to_e, b_sh_fr_e, b_sh_to_e, r_c, x_c, g_sh_fr_c, g_sh_to_c, b_sh_fr_c, b_sh_to_c, tm, vad_min, vad_max)
end
"Nothing to do, this model is lossless"
function constraint_ne_power_losses_frb(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, tm, vad_min, vad_max)
end
"Nothing to do, this model is lossless"
function constraint_ne_power_losses_frb_parallel(pm::BFARadPowerModel, n::Int, br_idx_e, br_idx_c, f_bus, t_bus, f_idx_c, t_idx_c, r_e, x_e, g_sh_fr_e, g_sh_to_e, b_sh_fr_e, b_sh_to_e, r_c, x_c, g_sh_fr_c, g_sh_to_c, b_sh_fr_c, b_sh_to_c, tm, vad_min, vad_max)
end
"Nothing to do, this model is lossless"
function constraint_ne_power_losses_oltc(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, g_sh_to, b_sh_fr, b_sh_to, vad_min, vad_max)
end
"Nothing to do, no voltage angle variables"
function _PM.constraint_voltage_angle_difference(pm::BFARadPowerModel, n::Int, f_idx, angmin, angmax)
end
"Nothing to do, no voltage angle variables"
function _PM.constraint_voltage_angle_difference_on_off(pm::BFARadPowerModel, n::Int, f_idx, angmin, angmax, vad_min, vad_max)
end
"Nothing to do, no voltage angle variables"
function _PM.constraint_ne_voltage_angle_difference(pm::BFARadPowerModel, n::Int, f_idx, angmin, angmax, vad_min, vad_max)
end
"Defines voltage drop over a branch whose `f_bus` is the reference bus"
function constraint_voltage_magnitude_difference_frb(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr, tm)
p_fr = _PM.var(pm, n, :p, f_idx)
q_fr = _PM.var(pm, n, :q, f_idx)
w_to = _PM.var(pm, n, :w, t_bus)
# w_fr is assumed equal to 1.0
JuMP.@constraint(pm.model, (1.0/tm^2) - w_to == 2*(r*p_fr + x*q_fr))
end
"Defines voltage drop over a transformer branch that has an OLTC"
function constraint_voltage_magnitude_difference_oltc(pm::BFARadPowerModel, n::Int, i, f_bus, t_bus, f_idx, t_idx, r, x, g_sh_fr, b_sh_fr)
p_fr = _PM.var(pm, n, :p, f_idx)
q_fr = _PM.var(pm, n, :q, f_idx)
ttmi = _PM.var(pm, n, :ttmi, i)
w_to = _PM.var(pm, n, :w, t_bus)
# w_fr is assumed equal to 1.0 to preserve the linearity of the model
JuMP.@constraint(pm.model, 1.0*ttmi - w_to == 2*(r*p_fr + x*q_fr))
end
"Complex power is limited by an octagon instead of a circle, so as to keep the model linear"
function _PM.constraint_thermal_limit_from(pm::BFARadPowerModel, n::Int, f_idx, rate_a)
p_fr = _PM.var(pm, n, :p, f_idx)
q_fr = _PM.var(pm, n, :q, f_idx)
c_perp = cos(π/8) # ~0.92
c_diag = sin(π/8) + cos(π/8) # == cos(π/8) * sqrt(2), ~1.31
JuMP.@constraint(pm.model, p_fr >= -c_perp*rate_a)
JuMP.@constraint(pm.model, p_fr <= c_perp*rate_a)
JuMP.@constraint(pm.model, q_fr >= -c_perp*rate_a)
JuMP.@constraint(pm.model, q_fr <= c_perp*rate_a)
JuMP.@constraint(pm.model, p_fr + q_fr >= -c_diag*rate_a)
JuMP.@constraint(pm.model, p_fr + q_fr <= c_diag*rate_a)
JuMP.@constraint(pm.model, p_fr - q_fr >= -c_diag*rate_a)
JuMP.@constraint(pm.model, p_fr - q_fr <= c_diag*rate_a)
end
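The eight inequalities above describe a regular octagon inscribed in the circle |S| = `rate_a`: its vertices lie exactly on the circle, and its faces under-approximate the circular limit by at most a factor cos(π/8) ≈ 0.92. A standalone geometric check of this claim (no JuMP model involved):

```julia
# Geometry-only check of the octagonal relaxation: c_perp bounds the
# axis-aligned faces, c_diag the diagonal ones.
c_perp = cos(π/8)             # ≈ 0.92
c_diag = sin(π/8) + cos(π/8)  # ≈ 1.31

in_octagon(p, q, rate) =
    abs(p) <= c_perp*rate && abs(q) <= c_perp*rate &&
    abs(p + q) <= c_diag*rate && abs(p - q) <= c_diag*rate

# A vertex -- the intersection of p = c_perp*rate with p + q = c_diag*rate --
# lies exactly on the circle of radius rate, since (cos(π/8), sin(π/8)) is a
# point of the unit circle:
rate = 1.0
vertex = (c_perp*rate, (c_diag - c_perp)*rate)
vertex_norm = hypot(vertex...)
```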
"Complex power is limited by an octagon instead of a circle, so as to keep the model linear"
function _PM.constraint_thermal_limit_from_on_off(pm::BFARadPowerModel, n::Int, i, f_idx, rate_a)
p_fr = _PM.var(pm, n, :p, f_idx)
q_fr = _PM.var(pm, n, :q, f_idx)
z = _PM.var(pm, n, :z_branch, i)
c_perp = cos(π/8) # ~0.92
c_diag = sin(π/8) + cos(π/8) # == cos(π/8) * sqrt(2), ~1.31
JuMP.@constraint(pm.model, p_fr >= -c_perp*rate_a*z)
JuMP.@constraint(pm.model, p_fr <= c_perp*rate_a*z)
JuMP.@constraint(pm.model, q_fr >= -c_perp*rate_a*z)
JuMP.@constraint(pm.model, q_fr <= c_perp*rate_a*z)
JuMP.@constraint(pm.model, p_fr + q_fr >= -c_diag*rate_a*z)
JuMP.@constraint(pm.model, p_fr + q_fr <= c_diag*rate_a*z)
JuMP.@constraint(pm.model, p_fr - q_fr >= -c_diag*rate_a*z)
JuMP.@constraint(pm.model, p_fr - q_fr <= c_diag*rate_a*z)
end
"Complex power is limited by an octagon instead of a circle, so as to keep the model linear"
function _PM.constraint_ne_thermal_limit_from(pm::BFARadPowerModel, n::Int, i, f_idx, rate_a)
p_fr = _PM.var(pm, n, :p_ne, f_idx)
q_fr = _PM.var(pm, n, :q_ne, f_idx)
z = _PM.var(pm, n, :branch_ne, i)
c_perp = cos(π/8) # ~0.92
c_diag = sin(π/8) + cos(π/8) # == cos(π/8) * sqrt(2), ~1.31
JuMP.@constraint(pm.model, p_fr >= -c_perp*rate_a*z)
JuMP.@constraint(pm.model, p_fr <= c_perp*rate_a*z)
JuMP.@constraint(pm.model, q_fr >= -c_perp*rate_a*z)
JuMP.@constraint(pm.model, q_fr <= c_perp*rate_a*z)
JuMP.@constraint(pm.model, p_fr + q_fr >= -c_diag*rate_a*z)
JuMP.@constraint(pm.model, p_fr + q_fr <= c_diag*rate_a*z)
JuMP.@constraint(pm.model, p_fr - q_fr >= -c_diag*rate_a*z)
JuMP.@constraint(pm.model, p_fr - q_fr <= c_diag*rate_a*z)
end
""
function constraint_ne_thermal_limit_from_parallel(pm::BFARadPowerModel, n::Int, br_idx_e, br_idx_c, f_idx_c, rate_a_e, rate_a_c)
# Suffixes: _e: existing branch; _c: candidate branch; _p: parallel equivalent
branch_e = _PM.ref(pm, n, :branch, br_idx_e)
branch_c = _PM.ref(pm, n, :ne_branch, br_idx_c)
r_e = branch_e["br_r"]
r_c = branch_c["br_r"]
x_e = branch_e["br_x"]
x_c = branch_c["br_x"]
r_p = (r_e*(r_c^2+x_c^2)+r_c*(r_e^2+x_e^2)) / ((r_e+r_c)^2+(x_e+x_c)^2)
x_p = (x_e*(r_c^2+x_c^2)+x_c*(r_e^2+x_e^2)) / ((r_e+r_c)^2+(x_e+x_c)^2)
rate_a_p = min(rate_a_e*sqrt(r_e^2+x_e^2),rate_a_c*sqrt(r_c^2+x_c^2)) / sqrt(r_p^2+x_p^2)
_PM.constraint_ne_thermal_limit_from(pm, n, br_idx_c, f_idx_c, rate_a_p)
end
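The expressions for `r_p` and `x_p` above are exactly the real and imaginary parts of the complex parallel impedance z_p = z_e·z_c/(z_e + z_c), while `rate_a_p` rescales the tighter of the two branches' voltage-drop limits to the equivalent impedance. A quick consistency check with assumed toy impedance values:

```julia
# Consistency check of the parallel-equivalent formulas against complex
# impedance arithmetic; branch impedances are assumed toy values (p.u.).
r_e, x_e = 0.02, 0.08  # existing branch
r_c, x_c = 0.03, 0.10  # candidate branch

D = (r_e + r_c)^2 + (x_e + x_c)^2
r_p = (r_e*(r_c^2 + x_c^2) + r_c*(r_e^2 + x_e^2)) / D
x_p = (x_e*(r_c^2 + x_c^2) + x_c*(r_e^2 + x_e^2)) / D

# Same quantity computed directly with complex numbers.
z_p = (r_e + im*x_e)*(r_c + im*x_c) / ((r_e + im*x_e) + (r_c + im*x_c))
```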
"Nothing to do, this model is symmetric"
function _PM.constraint_thermal_limit_to(pm::BFARadPowerModel, n::Int, t_idx, rate_a)
end
"Nothing to do, this model is symmetric"
function _PM.constraint_thermal_limit_to_on_off(pm::BFARadPowerModel, n::Int, i, t_idx, rate_a)
end
"Nothing to do, this model is symmetric"
function _PM.constraint_ne_thermal_limit_to(pm::BFARadPowerModel, n::Int, i, t_idx, rate_a)
end
"Nothing to do, this model is symmetric"
function constraint_ne_thermal_limit_to_parallel(pm::BFARadPowerModel, n::Int, br_idx_e, br_idx_c, f_idx_c, rate_a_e, rate_a_c)
end
## Other functions
"""
Converts the solution data into the data model's standard space, polar voltages and rectangular power.
Bus voltage magnitude `vm` is the square root of `w`.
Voltage magnitude of the reference bus is 1.0 p.u.
Branch OLTC tap ratio `tm` (if applicable) is the square root of the inverse of `ttmi`.
"""
function _PM.sol_data_model!(pm::BFARadPowerModel, solution::Dict)
if haskey(solution["it"]["pm"], "nw")
nws_sol = solution["it"]["pm"]["nw"]
else
nws_sol = Dict("0" => solution)
end
for (nw, nw_sol) in nws_sol
# Find reference bus id
n = parse(Int, nw)
ref_buses = _PM.ref(pm, n, :ref_buses)
if length(ref_buses) == 0
Memento.error(_LOGGER, "no reference bus found")
end
if length(ref_buses) > 1
Memento.error(_LOGGER, "networks with multiple reference buses are not supported")
end
ref_bus_id = first(keys(ref_buses))
# Bus voltage magnitude `vm` is the square root of `w`
for (i,bus) in nw_sol["bus"]
if haskey(bus, "w")
bus["vm"] = sqrt(bus["w"])
delete!(bus, "w")
end
end
# The voltage magnitude of the reference bus is 1.0 p.u.
nw_sol["bus"]["$ref_bus_id"]["vm"] = 1.0
        # OLTC tap ratio `tm` of `branch`es (if applicable) is the square root of the inverse of `ttmi`
for (i,br) in nw_sol["branch"]
if haskey(br, "ttmi")
if haskey(nw_sol,"ne_branch") && any([nw_sol["ne_branch"]["$ne_br_id"]["built"] for ne_br_id in ne_branch_ids(pm, parse(Int,i), nw = n)] .== 1.0)
                    # a built candidate branch replaces this existing branch, which is therefore out of service
br["tm"] = 0.0
else
br["tm"] = sqrt(1.0/br["ttmi"])
end
delete!(br, "ttmi")
end
end
        # OLTC tap ratio `tm` of `ne_branch`es (if applicable) is the square root of the inverse of `ttmi`
if haskey(nw_sol,"ne_branch")
for (i,br) in nw_sol["ne_branch"]
if haskey(br, "ttmi")
if br["built"] == 0.0
br["tm"] = 0.0
else
br["tm"] = sqrt(1.0/br["ttmi"])
end
delete!(br, "ttmi")
end
end
end
end
end
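As a toy illustration of the conversions described in the docstring, with hypothetical solver output and plain dictionaries instead of a full solution structure:

```julia
# vm is recovered as sqrt(w); tm (when an OLTC is present) as sqrt(1/ttmi),
# with the source variable deleted after conversion, as done above.
bus_sol = Dict("w" => 1.0609)                          # squared magnitude
bus_sol["vm"] = sqrt(pop!(bus_sol, "w"))               # -> 1.03 p.u.

branch_sol = Dict("ttmi" => 1.1025)                    # squared inverse tap ratio
branch_sol["tm"] = sqrt(1.0/pop!(branch_sol, "ttmi"))  # -> 1/1.05 ≈ 0.952
```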
# To be used instead of _PMACDC.variable_dc_converter_ne() - supports deduplication of variables
function variable_dc_converter_ne(pm::_PM.AbstractDCPModel; investment::Bool=true, kwargs...)
variable_ne_converter_indicator(pm; kwargs..., relax=true) # FlexPlan version: replaces _PMACDC.variable_converter_ne().
investment && variable_ne_converter_investment(pm; kwargs...)
_PMACDC.variable_converter_active_power_ne(pm; kwargs...)
_PMACDC.variable_dcside_power_ne(pm; kwargs...)
_PMACDC.variable_converter_filter_voltage_ne(pm; kwargs...)
_PMACDC.variable_converter_internal_voltage_ne(pm; kwargs...)
_PMACDC.variable_converter_to_grid_active_power_ne(pm; kwargs...)
_PMACDC.variable_conv_transformer_active_power_to_ne(pm; kwargs...)
_PMACDC.variable_conv_reactor_active_power_from_ne(pm; kwargs...)
end
"""
make_multinetwork(sn_data, time_series; <keyword arguments>)
Generate a multinetwork data structure from a single network and a time series.
# Arguments
- `sn_data`: single-network data structure to be replicated.
- `time_series`: data structure containing the time series.
- `global_keys`: keys that are stored once per multinetwork (they are not repeated in each
`nw`).
- `number_of_nws`: number of networks to be created from `sn_data` and `time_series`;
default: read from `dim`.
- `nw_id_offset`: optional value to be added to `time_series` ids to shift `nw` ids in
multinetwork data structure; default: read from `dim`.
- `share_data`: whether constant data is shared across networks (default, faster) or
duplicated (uses more memory, but ensures networks are independent; useful if further
transformations will be applied).
- `check_dim`: whether to check for `dim` in `sn_data`; default: `true`.
"""
function make_multinetwork(
sn_data::Dict{String,Any},
time_series::Dict{String,Any};
global_keys = ["dim","multinetwork","name","per_unit","source_type","source_version"],
number_of_nws::Int = length(dim(sn_data)[:li]),
nw_id_offset::Int = dim(sn_data)[:offset],
share_data::Bool = true,
check_dim::Bool = true
)
if _IM.ismultinetwork(sn_data)
Memento.error(_LOGGER, "`sn_data` argument must be a single network.")
end
if check_dim && !haskey(sn_data, "dim")
Memento.error(_LOGGER, "Missing `dim` dict in `sn_data` argument. The function `add_dimension!` must be called before `make_multinetwork`.")
end
mn_data = Dict{String,Any}("nw"=>Dict{String,Any}())
_add_mn_global_values!(mn_data, sn_data, global_keys)
_add_time_series!(mn_data, sn_data, global_keys, time_series, number_of_nws, nw_id_offset; share_data)
return mn_data
end
"""
make_multinetwork(sn_data; global_keys)
Generate a multinetwork data structure - having only one `nw` - from a single network.
# Arguments
- `sn_data`: single-network data structure to be replicated.
- `global_keys`: keys that are stored once per multinetwork (they are not repeated in each
`nw`).
- `check_dim`: whether to check for `dim` in `sn_data`; default: `true`.
"""
function make_multinetwork(
sn_data::Dict{String,Any};
global_keys = ["dim","name","per_unit","source_type","source_version"],
check_dim::Bool = true
)
if _IM.ismultinetwork(sn_data)
Memento.error(_LOGGER, "`sn_data` argument must be a single network.")
end
if check_dim && !haskey(sn_data, "dim")
Memento.error(_LOGGER, "Missing `dim` dict in `sn_data` argument. The function `add_dimension!` must be called before `make_multinetwork`.")
end
mn_data = Dict{String,Any}("nw"=>Dict{String,Any}())
_add_mn_global_values!(mn_data, sn_data, global_keys)
template_nw = _make_template_nw(sn_data, global_keys)
mn_data["nw"]["1"] = copy(template_nw)
return mn_data
end
"""
shift_nws!(mn_data, offset=dim_length(mn_data))
Shift by `offset` the networks in `mn_data`.
The `offset` argument is added to the existing offset.
Return the updated `mn_data` variable.
`mn_data` must be a multinetwork `data` dictionary.
If possible, use `shift_ids!` instead.
See also: `shift_ids!`.
"""
function shift_nws!(mn_data::Dict{String,Any}, offset::Int=dim_length(mn_data))
if !_IM.ismultinetwork(mn_data)
Memento.error(_LOGGER, "`mn_data` argument must be a multinetwork.")
end
shift_ids!(mn_data["dim"], offset)
shifted_nw = Dict{String,Any}()
for (n,nw) in mn_data["nw"]
shifted_nw["$(parse(Int,n)+offset)"] = nw
end
mn_data["nw"] = shifted_nw
return mn_data
end
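The key-shifting step in `shift_nws!` can be sketched on a plain dictionary (toy network names, no `dim` bookkeeping):

```julia
# Every nw key is parsed to an Int, offset, and converted back to a String.
nws = Dict("1" => "net A", "2" => "net B")
offset = 10
shifted = Dict("$(parse(Int, n) + offset)" => nw for (n, nw) in nws)
```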
"""
import_nws!(mn_data, others...)
Import into `mn_data["nw"]` the `nw`s contained in `others`.
`nw` ids of the two multinetworks must be contiguous (an error is raised otherwise).
See also: `merge_multinetworks!`.
"""
function import_nws!(mn_data::Dict{String,Any}, others::Dict{String,Any}...)
if !_IM.ismultinetwork(mn_data)
Memento.error(_LOGGER, "`import_nws!` can only be applied to multinetwork data dictionaries.")
end
for other in others
if !isempty(keys(mn_data["nw"]) ∩ keys(other["nw"]))
Memento.error(_LOGGER, "Attempting to import multinetworks having overlapping `nw` ids.")
end
merge!(mn_data["nw"], other["nw"])
end
first_id, last_id = extrema(parse.(Int,keys(mn_data["nw"])))
if length(mn_data["nw"]) != last_id - first_id + 1
Memento.error(_LOGGER, "The ids of the imported `nw`s must be contiguous.")
end
return mn_data
end
"""
merge_multinetworks!(mn_data_1, mn_data_2, dimension)
Merge `mn_data_2` into `mn_data_1` along `dimension`.
`nw` ids of the two multinetworks must be contiguous (an error is raised otherwise).
Fields present in `mn_data_1` but not in `mn_data_2` are shallow-copied into `mn_data_1`.
Fields present in both multinetworks must be equal, except for `dim`, `nw` and possibly for
`name`.
See also: `import_nws!`.
"""
function merge_multinetworks!(mn_data_1::Dict{String,Any}, mn_data_2::Dict{String,Any}, dimension::Symbol)
for k in ("dim", "nw")
for data in (mn_data_1, mn_data_2)
if k ∉ keys(data)
Memento.error(_LOGGER, "Missing field $k from input data dictionary.")
end
end
end
mn_data_1["dim"] = merge_dim!(mn_data_1["dim"], mn_data_2["dim"], dimension)
import_nws!(mn_data_1, mn_data_2)
keys1 = setdiff(keys(mn_data_1), ("dim", "nw"))
    keys2 = setdiff(keys(mn_data_2), ("dim", "nw"))
for k in keys1 ∩ keys2
if mn_data_1[k] == mn_data_2[k]
continue
elseif k == "name" # Applied only if names are different
mn_data_1["name"] = "Merged multinetwork"
else
Memento.error(_LOGGER, "Attempting to merge multinetworks that differ on the value of \"$k\".")
end
end
for k in setdiff(keys2, keys1)
mn_data_1[k] = mn_data_2[k]
end
return mn_data_1
end
"""
slice = slice_multinetwork(data::Dict{String,Any}; kwargs...)
Slice a multinetwork keeping the networks that have the coordinates specified by `kwargs`.
`kwargs` must be of the form `name = <value>`, where `name` is the name of a dimension of
`dim` and `<value>` is an `Int` coordinate of that dimension.
Return a sliced multinetwork that shares its data with `data`.
The coordinates of the dimensions at which the original multinetwork is sliced are
accessible with `dim_meta(slice, <name>, "orig_id")` where `<name>` is the name of one of
those dimensions.
Forward and backward lookup dicts containing the network ids of `data` and `slice` are
accessible with `slice["slice"]["slice_orig_nw_lookup"]` and
`slice["slice"]["orig_slice_nw_lookup"]`.
"""
function slice_multinetwork(data::Dict{String,Any}; kwargs...)
slice = Dict{String,Any}()
for k in setdiff(keys(data), ("dim", "nw"))
slice[k] = data[k]
end
dim_dict, ids = slice_dim(dim(data); kwargs...)
slice["dim"] = dim_dict
slice["nw"] = Dict{String,Any}()
for (new_id, old_id) in enumerate(ids)
slice["nw"]["$new_id"] = data["nw"]["$old_id"]
end
slice["slice"] = Dict{String,Any}()
slice["slice"]["slice_orig_nw_lookup"] = Dict(enumerate(ids))
slice["slice"]["orig_slice_nw_lookup"] = Dict((o,s) for (s,o) in slice["slice"]["slice_orig_nw_lookup"])
return slice
end
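The two lookup dictionaries built at the end of `slice_multinetwork` are simple inverses of each other; a sketch with hypothetical original ids:

```julia
ids = [4, 5, 6]  # original nw ids retained by the slice (hypothetical)
slice_orig_nw_lookup = Dict(enumerate(ids))              # slice id => original id
orig_slice_nw_lookup = Dict(o => s for (s, o) in slice_orig_nw_lookup)
```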
# Copy global values from sn_data to mn_data handling special cases
function _add_mn_global_values!(mn_data, sn_data, global_keys)
# Insert global values into mn_data by copying from sn_data
for k in global_keys
if haskey(sn_data, k)
mn_data[k] = sn_data[k]
end
end
# Special cases are handled below
mn_data["multinetwork"] = true
get!(mn_data, "name", "multinetwork")
end
# Make a copy of `data` and remove global keys
function _make_template_nw(sn_data, global_keys)
template_nw = copy(sn_data)
for k in global_keys
delete!(template_nw, k)
end
return template_nw
end
# Build multinetwork data structure: for each network, replicate the template and replace with data from time_series
function _add_time_series!(mn_data, sn_data, global_keys, time_series, number_of_nws, offset; share_data)
template_nw = _make_template_nw(sn_data, global_keys)
for time_series_idx in 1:number_of_nws
n = time_series_idx + offset
mn_data["nw"]["$n"] = _build_nw(template_nw, time_series, time_series_idx; share_data)
end
end
# Build the nw by copying the template and substituting data from time_series.
function _build_nw(template_nw, time_series, idx; share_data)
copy_function = share_data ? copy : deepcopy
nw = copy_function(template_nw)
for (key, element) in time_series
if haskey(nw, key)
nw[key] = copy_function(template_nw[key])
for (l, element) in time_series[key]
if haskey(nw[key], l)
nw[key][l] = copy_function(template_nw[key][l])
for (m, property) in time_series[key][l]
nw[key][l][m] = property[idx]
end
else
Memento.warn(_LOGGER, "Key $l not found, will be ignored.")
end
end
else
Memento.warn(_LOGGER, "Key $key not found, will be ignored.")
end
end
return nw
end
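The nesting in `_build_nw` can be illustrated on toy data: each level copies the template container before overwriting, so the template itself is never mutated. A simplified, self-contained sketch without the `haskey` guards and logging:

```julia
# Toy template and time series: load "1" has a pd profile over three periods.
template = Dict("load" => Dict("1" => Dict("pd" => 0.0, "qd" => 0.0)))
series   = Dict("load" => Dict("1" => Dict("pd" => [0.8, 0.9, 1.1])))

function build_period(template, series, idx)
    nw = copy(template)
    for (key, comps) in series
        nw[key] = copy(template[key])          # copy before overwriting
        for (l, props) in comps
            nw[key][l] = copy(template[key][l])
            for (m, values) in props
                nw[key][l][m] = values[idx]    # substitute the period's value
            end
        end
    end
    return nw
end

nw2 = build_period(template, series, 2)
```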
"""
parse_file(file; flex_load=true, <keyword arguments>)
Parse a Matpower .m `file` or PTI (PSS(R)E-v33) .raw `file` into a FlexPlan data structure,
including non-dispatchable generators, DC components, storage and flexible loads.
`flex_load` specifies whether to process flexible load data.
Other keyword arguments, if any, are forwarded to `PowerModels.parse_file`.
Mandatory tables: `bus`, `gen`, `branch` (and `load_extra` if `flex_load==true`).
Optional tables: `gencost`, `ndgen`, `branch_oltc`, `storage`, `storage_extra`,
`ne_storage`, and tables used by PowerModelsACDC.
Other tables can be added as well: they will be made available in the returned object.
"""
function parse_file(file::String; flex_load=true, kwargs...)
data = _PM.parse_file(file; kwargs...)
_add_gen_data!(data)
if !haskey(data, "ne_branch")
data["ne_branch"] = Dict{String,Any}()
end
if haskey(data, "busdc") || haskey(data, "busdc_ne")
_PMACDC.process_additional_data!(data)
end
_add_storage_data!(data)
if flex_load
if !haskey(data, "load_extra")
Memento.error(_LOGGER, "No `load_extra` table found in input file.")
end
_add_flexible_demand_data!(data)
end
return data
end
"Add a `dispatchable` bool field to all generators; add non-dispatchable generators to `data[\"gen\"]`."
function _add_gen_data!(data::Dict{String,Any})
for dgen in values(data["gen"])
dgen["dispatchable"] = true
end
if haskey(data, "ndgen")
offset = length(data["gen"])
rescale = x -> x/data["baseMVA"]
rescale_dual = x -> x*data["baseMVA"]
for ndgen in values(data["ndgen"])
ndgen["dispatchable"] = false
# Convert to p.u.
_PM._apply_func!(ndgen, "pref", rescale)
_PM._apply_func!(ndgen, "qmax", rescale)
_PM._apply_func!(ndgen, "qmin", rescale)
_PM._apply_func!(ndgen, "cost_gen", rescale_dual)
_PM._apply_func!(ndgen, "cost_curt", rescale_dual)
# Define active power bounds using the same names used by dispatchable
# generators.
ndgen["pmin"] = 0.0
ndgen["pmax"] = ndgen["pref"]
delete!(ndgen, "pref")
# Convert the cost of power produced by non-dispatchable generators into
# polynomial form (the same used by dispatchable generators).
ndgen["model"] = 2 # Cost model (2 => polynomial cost)
ndgen["cost"] = [ndgen["cost_gen"], 0.0]
delete!(ndgen, "cost_gen")
# Assign to non-dispatchable generators ids contiguous to dispatchable
            # generators so that each generator has a unique id.
new_id = ndgen["index"] += offset
data["gen"]["$new_id"] = ndgen
end
delete!(data, "ndgen")
end
return data
end
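A toy run of the non-dispatchable generator conversion above, on a single hypothetical record whose values are already in per unit (as the function ensures earlier via `rescale` and `rescale_dual`):

```julia
baseMVA = 100.0
# Hypothetical ndgen record: pref in p.u., cost in currency per p.u. power.
ndgen = Dict{String,Any}("index" => 1, "pref" => 50.0/baseMVA,
                         "cost_gen" => 30.0*baseMVA)

ndgen["dispatchable"] = false
ndgen["pmin"] = 0.0
ndgen["pmax"] = pop!(ndgen, "pref")            # reference power becomes the upper bound
ndgen["model"] = 2                             # polynomial cost model
ndgen["cost"] = [pop!(ndgen, "cost_gen"), 0.0] # linear and constant coefficients
```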
function _add_storage_data!(data)
    rescale_power = x -> x/data["baseMVA"]
    if haskey(data, "storage")
        for (s, storage) in data["storage"]
            _PM._apply_func!(storage, "max_energy_absorption", rescale_power)
            _PM._apply_func!(storage, "stationary_energy_outflow", rescale_power)
            _PM._apply_func!(storage, "stationary_energy_inflow", rescale_power)
        end
    else
        data["storage"] = Dict{String,Any}()
    end
    if haskey(data, "ne_storage")
        for (s, storage) in data["ne_storage"]
_PM._apply_func!(storage, "energy_rating", rescale_power)
_PM._apply_func!(storage, "thermal_rating", rescale_power)
_PM._apply_func!(storage, "discharge_rating", rescale_power)
_PM._apply_func!(storage, "charge_rating", rescale_power)
_PM._apply_func!(storage, "energy", rescale_power)
_PM._apply_func!(storage, "ps", rescale_power)
_PM._apply_func!(storage, "qs", rescale_power)
_PM._apply_func!(storage, "q_loss", rescale_power)
_PM._apply_func!(storage, "p_loss", rescale_power)
_PM._apply_func!(storage, "qmax", rescale_power)
_PM._apply_func!(storage, "qmin", rescale_power)
_PM._apply_func!(storage, "max_energy_absorption", rescale_power)
_PM._apply_func!(storage, "stationary_energy_outflow", rescale_power)
_PM._apply_func!(storage, "stationary_energy_inflow", rescale_power)
end
else
data["ne_storage"] = Dict{String,Any}()
end
return data
end
function _add_flexible_demand_data!(data)
for (le, load_extra) in data["load_extra"]
# ID of load point
idx = load_extra["load_id"]
        # Upper bound on voluntary load reduction (power not consumed) as a fraction of the total reference demand (0 ≤ pred_rel_max ≤ 1)
data["load"]["$idx"]["pred_rel_max"] = load_extra["pred_rel_max"]
        # Upper bound on demand shifted upward as a fraction of the total reference demand (0 ≤ pshift_up_rel_max ≤ 1)
data["load"]["$idx"]["pshift_up_rel_max"] = load_extra["pshift_up_rel_max"]
        # Upper bound on demand shifted downward as a fraction of the total reference demand (0 ≤ pshift_down_rel_max ≤ 1)
data["load"]["$idx"]["pshift_down_rel_max"] = load_extra["pshift_down_rel_max"]
        # Upper bound on shifted energy as a fraction of the total reference demand (0 ≤ eshift_rel_max ≤ 1)
if haskey(load_extra, "eshift_rel_max")
data["load"]["$idx"]["eshift_rel_max"] = load_extra["eshift_rel_max"]
end
# Compensation for consuming less (i.e. voluntary demand reduction) (€/MWh)
data["load"]["$idx"]["cost_red"] = load_extra["cost_red"]
# Recovery period for upward demand shifting (h)
if haskey(load_extra, "tshift_up")
data["load"]["$idx"]["tshift_up"] = load_extra["tshift_up"]
end
# Recovery period for downward demand shifting (h)
if haskey(load_extra, "tshift_down")
data["load"]["$idx"]["tshift_down"] = load_extra["tshift_down"]
end
# Compensation for demand shifting (€/MWh), applied half to the power shifted upward and half to the power shifted downward
data["load"]["$idx"]["cost_shift"] = load_extra["cost_shift"]
# Compensation for load curtailment (i.e. involuntary demand reduction) (€/MWh)
data["load"]["$idx"]["cost_curt"] = load_extra["cost_curt"]
# Investment costs for enabling flexible demand (€)
data["load"]["$idx"]["cost_inv"] = load_extra["cost_inv"]
# Whether load is flexible (boolean)
data["load"]["$idx"]["flex"] = load_extra["flex"]
        # Upper bound on voluntary energy reduction as a fraction of the total reference demand (0 ≤ ered_rel_max ≤ 1)
if haskey(load_extra, "ered_rel_max")
data["load"]["$idx"]["ered_rel_max"] = load_extra["ered_rel_max"]
end
# Expected lifetime of flexibility-enabling equipment (years)
data["load"]["$idx"]["lifetime"] = load_extra["lifetime"]
# CO2 costs for enabling flexible demand (€)
if haskey(load_extra, "co2_cost")
data["load"]["$idx"]["co2_cost"] = load_extra["co2_cost"]
end
# Power factor angle θ, giving the reactive power as Q = P ⨉ tan(θ)
if haskey(load_extra, "pf_angle")
data["load"]["$idx"]["pf_angle"] = load_extra["pf_angle"]
end
# Rescale cost and power input values to the p.u. values used internally in the model
rescale_cost = x -> x*data["baseMVA"]
rescale_power = x -> x/data["baseMVA"]
_PM._apply_func!(data["load"]["$idx"], "cost_red", rescale_cost)
_PM._apply_func!(data["load"]["$idx"], "cost_shift", rescale_cost)
_PM._apply_func!(data["load"]["$idx"], "cost_curt", rescale_cost)
end
delete!(data, "load_extra")
return data
end
function _add_generation_emission_data!(data)
rescale_emission = x -> x * data["baseMVA"]
for (g, gen) in data["gen"]
_PM._apply_func!(gen, "emission_factor", rescale_emission)
end
return data
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 11267 |
function plot_geo_data(data_in, filename, settings; solution = Dict())
io = open(filename, "w")
println(io, string("<?xml version=",raw"\"","1.0",raw"\""," encoding=",raw"\"","UTF-8",raw"\"","?>")) #
println(io, string("<kml xmlns=",raw"\"","http://earth.google.com/kml/2.1",raw"\"",">")) #
println(io, string("<Document>"))
draw_order = 1
if haskey(solution, "solution")
if haskey(solution["solution"],"nw")
sol = solution["solution"]["nw"]["1"]
else
sol = solution["solution"]
end
end
if haskey(data_in, "solution")
        if haskey(data_in["solution"], "nw")
data = data_in["solution"]["nw"]["1"]
else
data = data_in["solution"]
end
else
if haskey(data_in, "nw")
data = data_in["nw"]["1"]
else
data = data_in
end
end
if settings["add_nodes"] == true
for (b, bus) in data["bus"]
_plot_bus(io, bus, b)
end
end
for (b, branch) in data["branch"]
_plot_branch(io, branch, b, data; color_in = "blue")
end
if haskey(data, "ne_branch")
for (b, branch) in data["ne_branch"]
if haskey(settings, "plot_solution_only")
if sol["ne_branch"]["$b"]["built"] == 1
_plot_branch(io, branch, b, data; color_in = "blue", name = "Candidate Line")
end
else
_plot_branch(io, branch, b, data; color_in = "green", name = "Candidate Line")
end
end
end
if haskey(data, "branchdc")
for (bdc, branchdc) in data["branchdc"]
_plot_dc_branch(io, branchdc, bdc, data; color_in = "yellow")
end
end
if haskey(data, "branchdc_ne")
for (bdc, branchdc) in data["branchdc_ne"]
if haskey(settings, "plot_solution_only")
if sol["branchdc_ne"]["$bdc"]["isbuilt"] == 1
_plot_dc_branch(io, branchdc, bdc, data; color_in = "yellow", name = "Candidate DC Line")
end
else
_plot_dc_branch(io, branchdc, bdc, data; color_in = "red", name = "Candidate DC Line")
end
end
end
if haskey(data, "convdc")
for (cdc, convdc) in data["convdc"]
_plot_dc_conv(io, convdc, cdc, data; color_in = "yellow")
end
end
if haskey(data, "convdc_ne")
for (cdc, convdc) in data["convdc_ne"]
if haskey(settings, "plot_solution_only")
if sol["convdc_ne"]["$cdc"]["isbuilt"] == 1
_plot_dc_conv(io, convdc, cdc, data; color_in = "yellow", name = "Candidate DC converter")
end
else
_plot_dc_conv(io, convdc, cdc, data; color_in = "red", name = "Candidate DC converter")
end
end
end
if haskey(data, "storage")
for (s, storage) in data["storage"]
_plot_storage(io, storage, s, data; color_in = "yellow")
end
end
if haskey(data, "ne_storage")
for (s, storage) in data["ne_storage"]
if haskey(settings, "plot_solution_only")
if sol["ne_storage"]["$s"]["isbuilt"] == 1
_plot_storage(io, storage, s, data; color_in = "yellow", name = "Candidate storage")
end
else
_plot_storage(io, storage, s, data; color_in = "red", name = "Candidate storage")
end
end
end
println(io, string("</Document>"))
println(io, string("</kml>"))
close(io)
end
function _plot_bus(io, bus, b)
lat = bus["lat"]
lon = bus["lon"]
println(io, string("<Placemark> "));
println(io, string("<name>Node","$b","</name> "));
#println(io, string("<description>drawOrder=","$draw_order","</description>"));
println(io, string("<ExtendedData> "));
    println(io, string("<SimpleData name=",raw"\"","Name",raw"\"",">Node ","$b","</SimpleData>"));
println(io, string("<SimpleData name=",raw"\"","Description",raw"\"","></SimpleData> "));
println(io, string("<SimpleData name=",raw"\"","Latitude",raw"\"",">","$lat","</SimpleData>"));
println(io, string("<SimpleData name=",raw"\"","Longitude",raw"\"",">","$lon","</SimpleData>"));
println(io, string("<SimpleData name=",raw"\"","Icon",raw"\"","></SimpleData> "));
println(io, string("</ExtendedData>"));
println(io, string("<Point> "));
println(io, string("<coordinates>","$lon",",","$lat",",","0","</coordinates>"));
println(io, string("</Point> "));
println(io, string("<Style id=",raw"\"","downArrowIcon",raw"\"",">"));
println(io, string("<IconStyle> "));
println(io, string("<Icon> "));
println(io, string("<href>http://maps.google.com/mapfiles/kml/shapes/placemark_circle_highlight.png</href>"));
println(io, string("</Icon> "));
println(io, string("</IconStyle> "));
println(io, string("</Style> "));
println(io, string("</Placemark> "));
return io
end
function _plot_branch(io, branch, b, data; color_in = "blue", name = "Line")
println(io, string("<Placemark> "));
println(io, string("<name>",name,"$b","</name> "))
println(io, string("<LineString>"))
println(io, string("<tessellate>1</tessellate>"))
println(io, string("<coordinates>"))
fbus = branch["f_bus"]
tbus = branch["t_bus"]
fbus_lat = data["bus"]["$fbus"]["lat"]
fbus_lon = data["bus"]["$fbus"]["lon"]
tbus_lat = data["bus"]["$tbus"]["lat"]
tbus_lon = data["bus"]["$tbus"]["lon"]
    println(io, string("$fbus_lon",",","$fbus_lat",",","0"))
    println(io, string("$tbus_lon",",","$tbus_lat",",","0"))
println(io, string("</coordinates>"))
println(io, string("</LineString>"))
println(io, string("<Style>"))
println(io, string("<LineStyle>"))
if color_in == "green"
color = "#FF14F000"
else
color = "#FFF00014"
end
println(io, string("<color>",color,"</color>"))
#println(io, string("<description>drawOrder=","$draw_order","</description>"))
println(io, string("<width>3</width>"))
println(io, string("</LineStyle>"))
println(io, string("</Style>"))
println(io, string("</Placemark>"))
return io
end
function _plot_dc_branch(io, branch, b, data; color_in = "yellow", name = "DC Line")
println(io, string("<Placemark> "));
println(io, string("<name>",name,"$b","</name> "))
println(io, string("<LineString>"))
println(io, string("<tessellate>1</tessellate>"))
println(io, string("<coordinates>"))
fbusdc = branch["fbusdc"]
tbusdc = branch["tbusdc"]
fbus = 0
tbus = 0
if haskey(data, "convdc")
for (c, conv) in data["convdc"]
if conv["busdc_i"] == fbusdc
fbus = conv["busac_i"]
end
if conv["busdc_i"] == tbusdc
tbus = conv["busac_i"]
end
end
end
if haskey(data, "convdc_ne")
for (c, conv) in data["convdc_ne"]
if conv["busdc_i"] == fbusdc
fbus = conv["busac_i"]
end
if conv["busdc_i"] == tbusdc
tbus = conv["busac_i"]
end
end
end
fbus_lat = data["bus"]["$fbus"]["lat"]
fbus_lon = data["bus"]["$fbus"]["lon"]
tbus_lat = data["bus"]["$tbus"]["lat"]
tbus_lon = data["bus"]["$tbus"]["lon"]
    println(io, string("$fbus_lon",",","$fbus_lat",",","0"))
    println(io, string("$tbus_lon",",","$tbus_lat",",","0"))
println(io, string("</coordinates>"))
println(io, string("</LineString>"))
println(io, string("<Style>"))
println(io, string("<LineStyle>"))
if color_in == "red"
color = "#FF1400FF"
else
color = "#FF14F0FF"
end
println(io, string("<color>",color,"</color>"))
#println(io, string("<description>drawOrder=","$draw_order","</description>"))
println(io, string("<width>3</width>"))
println(io, string("</LineStyle>"))
println(io, string("</Style>"))
println(io, string("</Placemark>"))
return io
end
function _plot_dc_conv(io, conv, c, data; color_in = "yellow", name = "DC Converter")
println(io, string("<Placemark> "));
println(io, string("<name>",name,"$c","</name> "))
println(io, string("<LineString>"))
println(io, string("<tessellate>1</tessellate>"))
println(io, string("<coordinates>"))
bus = conv["busac_i"]
bus_lat = data["bus"]["$bus"]["lat"]
bus_lon = data["bus"]["$bus"]["lon"]
    bus_lon1 = bus_lon + 0.05
    bus_lat1 = bus_lat + 0.05
    println(io, string("$bus_lon1",",","$bus_lat1",",","0"))
    bus_lon1 = bus_lon + 0.05
    bus_lat1 = bus_lat - 0.05
    println(io, string("$bus_lon1",",","$bus_lat1",",","0"))
    bus_lon1 = bus_lon - 0.05
    bus_lat1 = bus_lat - 0.05
    println(io, string("$bus_lon1",",","$bus_lat1",",","0"))
    bus_lon1 = bus_lon - 0.05
    bus_lat1 = bus_lat + 0.05
    println(io, string("$bus_lon1",",","$bus_lat1",",","0"))
    bus_lon1 = bus_lon + 0.05
    bus_lat1 = bus_lat + 0.05
    println(io, string("$bus_lon1",",","$bus_lat1",",","0"))
println(io, string("</coordinates>"))
println(io, string("</LineString>"))
println(io, string("<Style>"))
println(io, string("<LineStyle>"))
if color_in == "red"
color = "#FF1400FF"
else
color = "#FF14F0FF"
end
println(io, string("<color>",color,"</color>"))
#println(io, string("<description>drawOrder=","$draw_order","</description>"))
println(io, string("<width>3</width>"))
println(io, string("</LineStyle>"))
println(io, string("</Style>"))
println(io, string("</Placemark>"))
return io
end
function _plot_storage(io, storage, s, data; color_in = "yellow", name = "Storage")
println(io, string("<Placemark> "));
println(io, string("<name>",name,"$s","</name> "))
println(io, string("<LineString>"))
println(io, string("<tessellate>1</tessellate>"))
println(io, string("<coordinates>"))
bus = storage["storage_bus"]
bus_lat = data["bus"]["$bus"]["lat"]
bus_lon = data["bus"]["$bus"]["lon"]
    bus_lon1 = bus_lon + 0.05
    bus_lat1 = bus_lat + 0.05
    println(io, string("$bus_lon1",",","$bus_lat1",",","0"))
    bus_lon1 = bus_lon + 0.05
    bus_lat1 = bus_lat - 0.05
    println(io, string("$bus_lon1",",","$bus_lat1",",","0"))
    bus_lon1 = bus_lon - 0.025
    bus_lat1 = bus_lat - 0.025
    println(io, string("$bus_lon1",",","$bus_lat1",",","0"))
    bus_lon1 = bus_lon + 0.05
    bus_lat1 = bus_lat + 0.05
    # println(io, string("$bus_lon1",",","$bus_lat1",",","0"))
    # bus_lon1 = bus_lon + 0.03
    # bus_lat1 = bus_lat + 0.03
    println(io, string("$bus_lon1",",","$bus_lat1",",","0"))
println(io, string("</coordinates>"))
println(io, string("</LineString>"))
println(io, string("<Style>"))
println(io, string("<LineStyle>"))
if color_in == "red"
color = "#FF1400FF"
else
color = "#FF14F0FF"
end
println(io, string("<color>",color,"</color>"))
#println(io, string("<description>drawOrder=","$draw_order","</description>"))
println(io, string("<width>3</width>"))
println(io, string("</LineStyle>"))
println(io, string("</Style>"))
println(io, string("</Placemark>"))
return io
end
"""
scale_data!(data; <keyword arguments>)
Scale lifetime and cost data.
See `_scale_time_data!`, `_scale_operational_cost_data!` and `_scale_investment_cost_data!`.
# Arguments
- `data`: a single-network data dictionary.
- `number_of_hours`: number of optimization periods (default: `dim_length(data, :hour)`).
- `year_scale_factor`: how many years a representative year should represent (default: `dim_meta(data, :year, "scale_factor")`).
- `number_of_years`: number of representative years (default: `dim_length(data, :year)`).
- `year_idx`: id of the representative year (default: `1`).
- `cost_scale_factor`: scale factor for all costs (default: `1.0`).
"""
function scale_data!(
data::Dict{String,Any};
number_of_hours::Int = haskey(data, "dim") ? dim_length(data, :hour) : 1,
year_scale_factor::Int = haskey(data, "dim") ? dim_meta(data, :year, "scale_factor") : 1,
number_of_years::Int = haskey(data, "dim") ? dim_length(data, :year) : 1,
year_idx::Int = 1,
cost_scale_factor::Real = 1.0
)
if _IM.ismultinetwork(data)
Memento.error(_LOGGER, "`scale_data!` can only be applied to single-network data dictionaries.")
end
_scale_time_data!(data, year_scale_factor)
_scale_operational_cost_data!(data, number_of_hours, year_scale_factor, cost_scale_factor)
_scale_investment_cost_data!(data, number_of_years, year_idx, cost_scale_factor) # Must be called after `_scale_time_data!`
end
"""
_scale_time_data!(data, year_scale_factor)
Scale lifetime data from years to periods of `year_scale_factor` years.
After applying this function, the step between consecutive years takes the value 1: in this
way it is easier to write the constraints that link variables belonging to different years.
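# Examples

A sketch of the intended arithmetic with made-up numbers (illustrative only, not
taken from any input file): a 30-year lifetime becomes 3 periods when each
representative year stands for 10 years.

```julia
year_scale_factor = 10   # one modeled year represents 10 years
lifetime_years = 30      # as read from the input data
lifetime_periods = lifetime_years ÷ year_scale_factor
# lifetime_periods == 3
```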
"""
function _scale_time_data!(data, year_scale_factor)
rescale = x -> x ÷ year_scale_factor
for component in ("ne_branch", "branchdc_ne", "ne_storage", "convdc_ne", "load")
for (key, val) in get(data, component, Dict{String,Any}())
if !haskey(val, "lifetime")
if component == "load"
continue # "lifetime" field is not used in OPF
else
Memento.error(_LOGGER, "Missing `lifetime` key in `$component` $key.")
end
end
if val["lifetime"] % year_scale_factor != 0
Memento.error(_LOGGER, "Lifetime of $component $key ($(val["lifetime"])) must be a multiple of the year scale factor ($year_scale_factor).")
end
_PM._apply_func!(val, "lifetime", rescale)
end
end
end
"""
_scale_operational_cost_data!(data, number_of_hours, year_scale_factor, cost_scale_factor)
Scale hourly costs to the planning horizon.
Scale hourly costs so that the sum of the costs over all optimization periods
(`number_of_hours` hours) represents the cost over the entire planning horizon
(`year_scale_factor` years). In this way it is possible to perform the optimization using a
reduced number of hours and still obtain a cost that approximates the cost that would be
obtained if 8760 hours were used for each year.
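# Examples

The scaling factor applied to each hourly cost, computed with made-up numbers
(illustrative only):

```julia
number_of_hours = 144    # optimization periods actually modeled
year_scale_factor = 10   # each representative year stands for 10 years
cost_scale_factor = 1.0
factor = (8760 * year_scale_factor / number_of_hours) * cost_scale_factor
# The 144 modeled hours are weighted so as to represent 87600 hours:
# factor ≈ 608.33
```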
"""
function _scale_operational_cost_data!(data, number_of_hours, year_scale_factor, cost_scale_factor)
rescale = x -> (8760*year_scale_factor / number_of_hours) * cost_scale_factor * x # scale hourly costs to the planning horizon
for (g, gen) in data["gen"]
_PM._apply_func!(gen, "cost", rescale)
_PM._apply_func!(gen, "cost_curt", rescale)
end
for (l, load) in data["load"]
_PM._apply_func!(load, "cost_shift", rescale) # Compensation for demand shifting
_PM._apply_func!(load, "cost_curt", rescale) # Compensation for load curtailment (i.e. involuntary demand reduction)
_PM._apply_func!(load, "cost_red", rescale) # Compensation for not consumed energy (i.e. voluntary demand reduction)
end
_PM._apply_func!(data, "co2_emission_cost", rescale)
end
"""
_scale_investment_cost_data!(data, number_of_years, year_idx, cost_scale_factor)
Correct investment costs considering the residual value at the end of the planning horizon.
Linear depreciation is assumed.
This function _must_ be called after `_scale_time_data!`.
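# Examples

A sketch of the residual-value correction with made-up numbers (illustrative
only). With 5 representative years, an investment made in year 4 by a candidate
whose (already scaled) lifetime is 3 periods uses only 2/3 of its life within
the planning horizon, so only that fraction of its cost is charged:

```julia
number_of_years = 5
year_idx = 4
lifetime = 3                                       # periods, after `_scale_time_data!`
remaining_years = number_of_years - year_idx + 1   # 2
cost_fraction = min(remaining_years / lifetime, 1.0)
# cost_fraction ≈ 0.6667
```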
"""
function _scale_investment_cost_data!(data, number_of_years, year_idx, cost_scale_factor)
# Assumption: the `lifetime` parameter of investment candidates has already been scaled
# using `_scale_time_data!`.
remaining_years = number_of_years - year_idx + 1
for (b, branch) in get(data, "ne_branch", Dict{String,Any}())
rescale = x -> min(remaining_years/branch["lifetime"], 1.0) * cost_scale_factor * x
_PM._apply_func!(branch, "construction_cost", rescale)
_PM._apply_func!(branch, "co2_cost", rescale)
end
for (b, branch) in get(data, "branchdc_ne", Dict{String,Any}())
rescale = x -> min(remaining_years/branch["lifetime"], 1.0) * cost_scale_factor * x
_PM._apply_func!(branch, "cost", rescale)
_PM._apply_func!(branch, "co2_cost", rescale)
end
for (c, conv) in get(data, "convdc_ne", Dict{String,Any}())
rescale = x -> min(remaining_years/conv["lifetime"], 1.0) * cost_scale_factor * x
_PM._apply_func!(conv, "cost", rescale)
_PM._apply_func!(conv, "co2_cost", rescale)
end
for (s, strg) in get(data, "ne_storage", Dict{String,Any}())
rescale = x -> min(remaining_years/strg["lifetime"], 1.0) * cost_scale_factor * x
_PM._apply_func!(strg, "eq_cost", rescale)
_PM._apply_func!(strg, "inst_cost", rescale)
_PM._apply_func!(strg, "co2_cost", rescale)
end
for (l, load) in data["load"]
rescale = x -> min(remaining_years/load["lifetime"], 1.0) * cost_scale_factor * x
_PM._apply_func!(load, "cost_inv", rescale)
_PM._apply_func!(load, "co2_cost", rescale)
end
end
"""
    convert_mva_base!(data, mva_base)
Convert a data or solution Dict to a different per-unit system MVA base value.
`data` can be single-network or multinetwork, but must already be in p.u.
!!! danger
In case of multinetworks, make sure that variables from different networks are not bound
to the same value in memory (i.e., it must not happen that
`data["nw"][n1][...][key] === data["nw"][n2][...][key]`), otherwise the conversion of
those variables may be applied multiple times.
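# Examples

The underlying per-unit arithmetic, with made-up numbers (illustrative only): a
power of 0.5 p.u. on a 100 MVA base is the same physical 50 MW as 0.05 p.u. on a
1000 MVA base, while impedances scale the opposite way.

```julia
old_base, new_base = 100.0, 1000.0
mva_base_ratio = new_base / old_base
p_old = 0.5                      # 50 MW on the 100 MVA base
p_new = p_old / mva_base_ratio   # power-like quantities are divided ("rescale")
r_old = 0.02
r_new = r_old * mva_base_ratio   # impedance-like quantities are multiplied ("rescale_inverse")
# p_new == 0.05; r_new ≈ 0.2
```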
"""
function convert_mva_base!(data::Dict{String,<:Any}, mva_base::Real)
if haskey(data, "nw")
nws = data["nw"]
else
nws = Dict("0" => data)
end
for data_nw in values(nws)
if data_nw["baseMVA"] ≠ mva_base
mva_base_ratio = mva_base / data_nw["baseMVA"]
rescale = x -> x / mva_base_ratio
rescale_inverse = x -> x * mva_base_ratio
_PM._apply_func!(data_nw, "baseMVA", rescale_inverse)
if haskey(data_nw, "bus")
for (i, bus) in data_nw["bus"]
_PM._apply_func!(bus, "lam_kcl_i", rescale_inverse)
_PM._apply_func!(bus, "lam_kcl_r", rescale_inverse)
end
end
for comp in ["branch", "ne_branch"]
if haskey(data_nw, comp)
for (i, branch) in data_nw[comp]
_PM._apply_func!(branch, "b_fr", rescale)
_PM._apply_func!(branch, "b_to", rescale)
_PM._apply_func!(branch, "br_r", rescale_inverse)
_PM._apply_func!(branch, "br_x", rescale_inverse)
_PM._apply_func!(branch, "c_rating_a", rescale)
_PM._apply_func!(branch, "c_rating_b", rescale)
_PM._apply_func!(branch, "c_rating_c", rescale)
_PM._apply_func!(branch, "g_fr", rescale)
_PM._apply_func!(branch, "g_to", rescale)
_PM._apply_func!(branch, "rate_a", rescale)
_PM._apply_func!(branch, "rate_b", rescale)
_PM._apply_func!(branch, "rate_c", rescale)
_PM._apply_func!(branch, "mu_sm_fr", rescale_inverse)
_PM._apply_func!(branch, "mu_sm_to", rescale_inverse)
_PM._apply_func!(branch, "pf", rescale)
_PM._apply_func!(branch, "pt", rescale)
_PM._apply_func!(branch, "qf", rescale)
_PM._apply_func!(branch, "qt", rescale)
end
end
end
if haskey(data_nw, "switch")
for (i, switch) in data_nw["switch"]
_PM._apply_func!(switch, "current_rating", rescale)
_PM._apply_func!(switch, "psw", rescale)
_PM._apply_func!(switch, "qsw", rescale)
_PM._apply_func!(switch, "thermal_rating", rescale)
end
end
for comp in ["busdc", "busdc_ne"]
if haskey(data_nw, comp)
for (i, bus) in data_nw[comp]
_PM._apply_func!(bus, "Pdc", rescale)
end
end
end
for comp in ["branchdc", "branchdc_ne"]
if haskey(data_nw, comp)
for (i, branch) in data_nw[comp]
_PM._apply_func!(branch, "l", rescale_inverse)
_PM._apply_func!(branch, "r", rescale_inverse)
_PM._apply_func!(branch, "rateA", rescale)
_PM._apply_func!(branch, "rateB", rescale)
_PM._apply_func!(branch, "rateC", rescale)
_PM._apply_func!(branch, "pf", rescale)
_PM._apply_func!(branch, "pt", rescale)
end
end
end
for comp in ["convdc", "convdc_ne"]
if haskey(data_nw, comp)
for (i, conv) in data_nw[comp]
_PM._apply_func!(conv, "bf", rescale)
_PM._apply_func!(conv, "droop", rescale)
_PM._apply_func!(conv, "Imax", rescale)
_PM._apply_func!(conv, "LossA", rescale)
_PM._apply_func!(conv, "LossCinv", rescale_inverse)
_PM._apply_func!(conv, "LossCrec", rescale_inverse)
_PM._apply_func!(conv, "P_g", rescale)
_PM._apply_func!(conv, "Pacmax", rescale)
_PM._apply_func!(conv, "Pacmin", rescale)
_PM._apply_func!(conv, "Pacrated", rescale)
_PM._apply_func!(conv, "Pdcset", rescale)
_PM._apply_func!(conv, "Q_g", rescale)
_PM._apply_func!(conv, "Qacmax", rescale)
_PM._apply_func!(conv, "Qacmin", rescale)
_PM._apply_func!(conv, "Qacrated", rescale)
_PM._apply_func!(conv, "rc", rescale_inverse)
_PM._apply_func!(conv, "rtf", rescale_inverse)
_PM._apply_func!(conv, "xc", rescale_inverse)
_PM._apply_func!(conv, "xtf", rescale_inverse)
_PM._apply_func!(conv, "pconv", rescale)
_PM._apply_func!(conv, "pdc", rescale)
_PM._apply_func!(conv, "pgrid", rescale)
_PM._apply_func!(conv, "ppr_fr", rescale)
_PM._apply_func!(conv, "ptf_to", rescale)
end
end
end
if haskey(data_nw, "gen")
for (i, gen) in data_nw["gen"]
_PM._rescale_cost_model!(gen, mva_base_ratio)
_PM._apply_func!(gen, "cost_curt", rescale_inverse)
_PM._apply_func!(gen, "mbase", rescale_inverse)
_PM._apply_func!(gen, "pmax", rescale)
_PM._apply_func!(gen, "pmin", rescale)
_PM._apply_func!(gen, "qmax", rescale)
_PM._apply_func!(gen, "qmin", rescale)
_PM._apply_func!(gen, "ramp_10", rescale)
_PM._apply_func!(gen, "ramp_30", rescale)
_PM._apply_func!(gen, "ramp_agc", rescale)
_PM._apply_func!(gen, "ramp_q", rescale)
_PM._apply_func!(gen, "pg", rescale)
_PM._apply_func!(gen, "pgcurt", rescale)
_PM._apply_func!(gen, "qg", rescale)
end
end
for comp in ["storage", "ne_storage"]
if haskey(data_nw, comp)
for (i, strg) in data_nw[comp]
_PM._apply_func!(strg, "charge_rating", rescale)
_PM._apply_func!(strg, "current_rating", rescale)
_PM._apply_func!(strg, "discharge_rating", rescale)
_PM._apply_func!(strg, "energy_rating", rescale)
_PM._apply_func!(strg, "energy", rescale)
_PM._apply_func!(strg, "p_loss", rescale)
_PM._apply_func!(strg, "q_loss", rescale)
_PM._apply_func!(strg, "qmax", rescale)
_PM._apply_func!(strg, "qmin", rescale)
_PM._apply_func!(strg, "r", rescale_inverse)
_PM._apply_func!(strg, "stationary_energy_inflow", rescale)
_PM._apply_func!(strg, "stationary_energy_outflow", rescale)
_PM._apply_func!(strg, "thermal_rating", rescale)
_PM._apply_func!(strg, "x", rescale_inverse)
_PM._apply_func!(strg, "ps", rescale_inverse)
_PM._apply_func!(strg, "qs", rescale_inverse)
_PM._apply_func!(strg, "qsc", rescale_inverse)
_PM._apply_func!(strg, "sc", rescale_inverse)
_PM._apply_func!(strg, "sd", rescale_inverse)
_PM._apply_func!(strg, "se", rescale_inverse)
end
end
end
if haskey(data_nw, "load")
for (i, load) in data_nw["load"]
_PM._apply_func!(load, "cost_curt", rescale_inverse)
_PM._apply_func!(load, "cost_red", rescale_inverse)
_PM._apply_func!(load, "cost_shift", rescale_inverse)
_PM._apply_func!(load, "ed", rescale)
_PM._apply_func!(load, "pd", rescale)
_PM._apply_func!(load, "qd", rescale)
_PM._apply_func!(load, "pcurt", rescale)
_PM._apply_func!(load, "pflex", rescale)
_PM._apply_func!(load, "pred", rescale)
_PM._apply_func!(load, "pshift_down", rescale)
_PM._apply_func!(load, "pshift_up", rescale)
end
end
if haskey(data_nw, "shunt")
for (i, shunt) in data_nw["shunt"]
_PM._apply_func!(shunt, "bs", rescale)
_PM._apply_func!(shunt, "gs", rescale)
end
end
if haskey(data_nw, "td_coupling")
td_coupling = data_nw["td_coupling"]
_PM._apply_func!(td_coupling, "p", rescale)
_PM._apply_func!(td_coupling, "q", rescale)
end
end
end
end
"""
make_time_series(data, number_of_periods; loadprofile, genprofile)
Make a time series dict from profile matrices, to be used in `make_multinetwork`.
`number_of_periods` is by default the number of networks specified in `data`.
Profile matrices must have `number_of_periods` rows and one column for each component (load
or generator).
# Arguments
- `data`: a multinetwork data dictionary;
- `number_of_periods = dim_length(data)`;
- `loadprofile = ones(number_of_periods,length(data["load"]))`;
- `genprofile = ones(number_of_periods,length(data["gen"])))`.
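# Examples

A minimal sketch with a made-up one-load, one-generator system (illustrative
only; assumes this package is loaded):

```julia
data = Dict{String,Any}(
    "load" => Dict{String,Any}("1" => Dict{String,Any}("pd" => 1.0)),
    "gen"  => Dict{String,Any}("1" => Dict{String,Any}("pmax" => 2.0)),
)
loadprofile = reshape([0.8, 1.0, 1.2], 3, 1)   # 3 periods × 1 load
genprofile  = reshape([1.0, 0.5, 0.0], 3, 1)   # 3 periods × 1 generator
ts = make_time_series(data, 3; loadprofile, genprofile)
# ts["load"]["1"]["pd"]  == [0.8, 1.0, 1.2]
# ts["gen"]["1"]["pmax"] == [2.0, 1.0, 0.0]
```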
"""
function make_time_series(
    data::Dict{String,Any},
    number_of_periods::Int = dim_length(data);
    loadprofile = ones(number_of_periods, length(data["load"])),
    genprofile = ones(number_of_periods, length(data["gen"]))
)
if size(loadprofile) ≠ (number_of_periods, length(data["load"]))
right_size = (number_of_periods, length(data["load"]))
Memento.error(_LOGGER, "Size of loadprofile matrix must be $right_size, found $(size(loadprofile)) instead.")
end
if size(genprofile) ≠ (number_of_periods, length(data["gen"]))
right_size = (number_of_periods, length(data["gen"]))
Memento.error(_LOGGER, "Size of genprofile matrix must be $right_size, found $(size(genprofile)) instead.")
end
return Dict{String,Any}(
"load" => Dict{String,Any}(l => Dict("pd" => load["pd"] .* loadprofile[:, parse(Int, l)]) for (l,load) in data["load"]),
"gen" => Dict{String,Any}(g => Dict("pmax" => gen["pmax"] .* genprofile[:, parse(Int, g)]) for (g,gen) in data["gen"]),
)
end
const dict_candidate_lookup = Dict{String,String}(
"acBuses" => "acBus",
"dcBuses" => "dcBus",
"acBranches" => "acBranch",
"dcBranches" => "dcBranch",
"converters" => "converter",
"transformers" => "acBranch",
"storage" => "storage",
"generators" => "generator",
"flexibleLoads" => "load",
"psts" => "pst",
)
function cand_name_from_dict(dict::String)
dict_candidate_lookup[dict]
end
"""
convert_JSON(file; <keyword arguments>)
convert_JSON(dict; <keyword arguments>)
Convert a JSON `file` or a `dict` conforming to the FlexPlan WP3 API into a FlexPlan.jl dict.
# Arguments
- `oltc::Bool=true`: in distribution networks, whether to add OLTCs with ±10% voltage
regulation to existing and candidate transformers.
- `scale_gen::Real=1.0`: scale factor of all generators.
- `scale_load::Real=1.0`: scale factor of loads.
- `number_of_hours::Union{Int,Nothing}=nothing`: parse only the first hours of the
file/dict.
- `number_of_scenarios::Union{Int,Nothing}=nothing`: parse only the first scenarios of the
file/dict.
- `number_of_years::Union{Int,Nothing}=nothing`: parse only the first years of the
file/dict.
- `cost_scale_factor::Real=1.0`: scale factor for all costs.
- `hour_scale_factor::Real=1.0`: how many hours an optimization period should represent.
- `year_scale_factor::Union{Real,Nothing}=nothing`: how many years a representative year
should represent (default: read from JSON).
- `init_data_extensions::Vector{<:Function}=Function[]`: functions to be applied to the
target dict after its initialization. They must have exactly one argument (the target
dict) and can modify it; the return value is unused.
- `sn_data_extensions::Vector{<:Function}=Function[]`: functions to be applied to the
single-network dictionaries containing data for each single year, just before
`_FP.scale_data!` is called. They must have exactly one argument (the single-network dict)
and can modify it; the return value is unused.
- `share_data::Bool=true`: whether constant data is shared across networks (faster) or
duplicated (uses more memory, but ensures networks are independent; useful if further
transformations will be applied).
# Extended help
Features of FlexPlan WP3 API not supported in FlexPlan.jl:
- scenario probabilities depending on year (only constant probabilities are allowed);
- number of hours depending on scenario (all scenarios must have the same number of hours);
- PSTs;
- `gridModelInputFile.converters.ratedActivePowerDC`;
- `gridModelInputFile.storage.minEnergy`;
- `gridModelInputFile.storage.maxAbsRamp`;
- `gridModelInputFile.storage.maxInjRamp`;
- uniqueness of candidate components (each candidate can be reinvested at the end of its
lifetime).
"""
function convert_JSON end
function convert_JSON(file::String; kwargs...)
source_dict = JSON.parsefile(file)
convert_JSON(source_dict; kwargs...)
end
function convert_JSON(source::AbstractDict;
oltc::Bool = true,
scale_gen::Real = 1.0,
scale_load::Real = 1.0,
number_of_hours::Union{Int,Nothing} = nothing,
number_of_scenarios::Union{Int,Nothing} = nothing,
number_of_years::Union{Int,Nothing} = nothing,
cost_scale_factor::Real = 1.0,
hour_scale_factor::Real = 1.0,
year_scale_factor::Union{Real,Nothing} = nothing,
init_data_extensions::Vector{<:Function} = Function[],
sn_data_extensions::Vector{<:Function} = Function[],
share_data::Bool = true,
)
# Define target dict
target = Dict{String, Any}(
"nw" => Dict{String,Any}(),
"multinetwork" => true,
"per_unit" => true,
)
# Add dimensions
if length(unique(vcat(source["genericParameters"]["nbHours"]...))) > 1
Memento.error(_LOGGER, "All scenarios must have the same number of hours.") # Dimensions are implemented as a multidimensional array, so the length of one dimension cannot depend on the id along another dimension.
end
if isnothing(number_of_hours)
number_of_hours = first(first(source["genericParameters"]["nbHours"]))
elseif number_of_hours > first(first(source["genericParameters"]["nbHours"]))
Memento.error(_LOGGER, "$number_of_hours hours requested, but only " * string(first(first(source["genericParameters"]["nbHours"]))) * " found in input dict.")
end
_FP.add_dimension!(target, :hour, number_of_hours)
if isnothing(number_of_scenarios)
number_of_scenarios = source["genericParameters"]["nbScenarios"]
elseif number_of_scenarios > source["genericParameters"]["nbScenarios"]
Memento.error(_LOGGER, "$number_of_scenarios scenarios requested, but only " * string(source["genericParameters"]["nbScenarios"]) * " found in input dict.")
end
if haskey(source["genericParameters"], "scenarioProbabilities")
if maximum(length.(unique.(source["genericParameters"]["scenarioProbabilities"]))) > 1
Memento.warn(_LOGGER, "Only constant probabilities are supported for scenarios. Using first-year probabilities for every year.")
end
scenario_probabilities = first(first.(source["genericParameters"]["scenarioProbabilities"]), number_of_scenarios) # The outermost `first` is needed if the user has requested fewer scenarios than are available.
scenario_probabilities = scenario_probabilities ./ (hour_scale_factor*sum(scenario_probabilities)) # For `_FP.scale_data!` to work properly when applied later, the sum of scenario probabilities must be 1/hour_scale_factor.
else
scenario_probabilities = fill(1/(hour_scale_factor*number_of_scenarios),number_of_scenarios)
end
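# Illustration (hypothetical probabilities, not from any real input file): with
# `hour_scale_factor = 1` and raw probabilities [3, 7], the normalized
# probabilities are [0.3, 0.7], summing to 1 as required by `_FP.scale_data!`.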
scenario_properties = Dict(id => Dict{String,Any}("probability"=>prob) for (id,prob) in enumerate(scenario_probabilities))
_FP.add_dimension!(target, :scenario, scenario_properties)
if isnothing(number_of_years)
number_of_years = length(source["genericParameters"]["years"])
elseif number_of_years > length(source["genericParameters"]["years"])
Memento.error(_LOGGER, "$number_of_years years requested, but only " * string(length(source["genericParameters"]["years"])) * " found in input dict.")
end
if isnothing(year_scale_factor)
if haskey(source["genericParameters"], "nbRepresentedYears")
year_scale_factor = source["genericParameters"]["nbRepresentedYears"]
else
Memento.error(_LOGGER, "At least one of JSON attribute `genericParameters.nbRepresentedYears` and function keyword argument `year_scale_factor` must be specified.")
end
end
_FP.add_dimension!(target, :year, number_of_years; metadata = Dict{String,Any}("scale_factor"=>year_scale_factor))
# Generate ID lookup dict
lookup_acBranches = id_lookup(source["gridModelInputFile"]["acBranches"])
lookup = Dict(
"acBuses" => id_lookup(source["gridModelInputFile"]["acBuses"]),
"acBranches" => lookup_acBranches,
"transformers" => id_lookup(source["gridModelInputFile"]["transformers"]; offset=length(lookup_acBranches)), # AC branches are split between `acBranches` and `transformers` dicts in JSON files
"generators" => id_lookup(source["gridModelInputFile"]["generators"]),
"loads" => id_lookup(source["gridModelInputFile"]["loads"]),
"storage" => id_lookup(source["gridModelInputFile"]["storage"]),
"dcBuses" => id_lookup(source["gridModelInputFile"]["dcBuses"]),
"dcBranches" => id_lookup(source["gridModelInputFile"]["dcBranches"]),
"converters" => id_lookup(source["gridModelInputFile"]["converters"]),
)
if haskey(source, "candidatesInputFile")
lookup_cand_acBranches = id_lookup(source["candidatesInputFile"]["acBranches"], "acBranch")
lookup["cand_acBranches"] = lookup_cand_acBranches
lookup["cand_transformers"] = id_lookup(source["candidatesInputFile"]["transformers"], "acBranch"; offset=length(lookup_cand_acBranches)) # AC branches are split between `acBranches` and `transformers` dicts in JSON files
lookup["cand_storage"] = id_lookup(source["candidatesInputFile"]["storage"], "storage")
lookup["cand_dcBranches"] = id_lookup(source["candidatesInputFile"]["dcBranches"], "dcBranch")
lookup["cand_converters"] = id_lookup(source["candidatesInputFile"]["converters"], "converter")
end
# Compute availability of candidates
if haskey(source, "candidatesInputFile")
year_scale_factor = _FP.dim_meta(target, :year, "scale_factor")
year_lookup = Dict{Int,Int}((year,y) for (y,year) in enumerate(source["genericParameters"]["years"]))
cand_availability = Dict{String,Any}(
"acBranches" => availability(source, "acBranches", "acBranch", year_lookup, year_scale_factor, number_of_years),
"transformers" => availability(source, "transformers", "acBranch", year_lookup, year_scale_factor, number_of_years),
"loads" => availability(source, "flexibleLoads", "load", year_lookup, year_scale_factor, number_of_years),
"storage" => availability(source, "storage", "storage", year_lookup, year_scale_factor, number_of_years),
"dcBranches" => availability(source, "dcBranches", "dcBranch", year_lookup, year_scale_factor, number_of_years),
"converters" => availability(source, "converters", "converter", year_lookup, year_scale_factor, number_of_years),
)
end
# Apply init data extensions
for f! in init_data_extensions
f!(target)
end
# Build data year by year
for y in 1:number_of_years
sn_data = haskey(source, "candidatesInputFile") ? nw(source, lookup, cand_availability, y; oltc, scale_gen) : nw(source, lookup, y; oltc, scale_gen)
sn_data["dim"] = target["dim"]
# Apply single network data extensions
for f! in sn_data_extensions
f!(sn_data)
end
_FP.scale_data!(sn_data; year_idx=y, cost_scale_factor)
time_series = make_time_series(source, lookup, y, sn_data; number_of_hours, number_of_scenarios, scale_load)
year_data = _FP.make_multinetwork(sn_data, time_series; number_of_nws=number_of_hours*number_of_scenarios, nw_id_offset=number_of_hours*number_of_scenarios*(y-1), share_data)
add_singular_data!(year_data, source, lookup, y)
_FP.import_nws!(target, year_data)
end
return target
end
# Define a bijective map from existing JSON String ids to FlexPlan Int ids (generated in _id_lookup)
function id_lookup(component_vector::Vector; offset::Int=0)
json_ids = [d["id"] for d in component_vector]
return _id_lookup(json_ids, offset)
end
function id_lookup(component_vector::Vector, sub_key::String; offset::Int=0)
json_ids = [d[sub_key]["id"] for d in component_vector]
return _id_lookup(json_ids, offset)
end
function _id_lookup(json_ids::Vector, offset::Int)
sort!(json_ids) # Sorting makes the lookup independent of the order of elements in JSON files
int_ids = range(1+offset; length=length(json_ids))
lookup = Dict{String,Int}(zip(json_ids, int_ids))
if length(lookup) < length(json_ids)
Memento.error(_LOGGER, "IDs must be unique (found only $(length(lookup)) unique IDs, should be $(length(json_ids))).")
end
return lookup
end
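# Illustration (hypothetical ids, not from any real input file): sorting makes the
# resulting map independent of element order in the JSON file, e.g.
#   _id_lookup(["branch2", "branch1"], 0) == Dict("branch1"=>1, "branch2"=>2)
#   _id_lookup(["t1"], 3) == Dict("t1"=>4) # `offset` shifts the generated Int ids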
function availability(source::AbstractDict, comp_name::String, sub_key::String, year_lookup::AbstractDict, year_scale_factor::Int, number_of_years::Int)
target = Dict{String,Vector{Bool}}()
for comp in source["candidatesInputFile"][comp_name]
id = comp[sub_key]["id"]
investment_horizon = [year_lookup[year] for year in comp["horizons"]]
if last(investment_horizon) - first(investment_horizon) ≥ length(investment_horizon)
Memento.warn(_LOGGER, "Horizon of $comp_name $id is not a contiguous set.")
end
raw_lifetime = comp["lifetime"] ÷ year_scale_factor
availability_horizon_start = first(investment_horizon)
availability_horizon_end = min(last(investment_horizon)+raw_lifetime-1, number_of_years)
target[id] = [availability_horizon_start ≤ y ≤ availability_horizon_end for y in 1:number_of_years]
end
return target
end
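# Illustration (hypothetical candidate, not from any real input file): with
# `number_of_years = 4`, a candidate whose investment horizon maps to year 2 and
# whose `lifetime ÷ year_scale_factor` is 2 is available in years 2 and 3, i.e.
# `target[id] == [false, true, true, false]`.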
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 266 | module JSONConverter
export convert_JSON, convert_JSON_td
using ..FlexPlan
const _FP = FlexPlan
import ..FlexPlan: _LOGGER
import JSON
import Memento
include("base.jl")
include("nw.jl")
include("time_series.jl")
include("singular_data.jl")
include("td.jl")
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 21871 | # Single-network containing only fixed data (i.e. data that does not depend on year), method without candidates
function nw(source::AbstractDict, lookup::AbstractDict, y::Int; oltc::Bool, scale_gen::Real)
target = Dict{String,Any}(
"branch" => Dict{String,Any}(),
"branchdc" => Dict{String,Any}(),
"bus" => Dict{String,Any}(),
"busdc" => Dict{String,Any}(),
"convdc" => Dict{String,Any}(),
"dcline" => Dict{String,Any}(),
"gen" => Dict{String,Any}(),
"load" => Dict{String,Any}(),
"shunt" => Dict{String,Any}(),
"storage" => Dict{String,Any}(),
"switch" => Dict{String,Any}(),
"dcpol" => 2, # Assumption: DC grid has 2 poles.
"per_unit" => true,
"time_elapsed" => 1.0, # Assumption: each period lasts 1 hour.
)
if haskey(source["genericParameters"], "basePower")
target["baseMVA"] = source["genericParameters"]["basePower"]
else
Memento.error(_LOGGER, "\"genericParameters.basePower\" is a required parameter.")
end
# AC branches are split between `acBranches` and `transformers` dicts in JSON files
branch_path = ["gridModelInputFile", "acBranches"]
for comp in walkpath(source, branch_path)
index = lookup["acBranches"][comp["id"]]
source_id = push!(copy(branch_path), comp["id"])
f_bus = lookup["acBuses"][comp["acBusOrigin"]]
t_bus = lookup["acBuses"][comp["acBusExtremity"]]
target["branch"]["$index"] = make_branch(comp, index, source_id, f_bus, t_bus, y; transformer=false)
end
branch_path = ["gridModelInputFile", "transformers"]
for comp in walkpath(source, branch_path)
index = lookup["transformers"][comp["id"]]
source_id = push!(copy(branch_path), comp["id"])
f_bus = lookup["acBuses"][comp["acBusOrigin"]]
t_bus = lookup["acBuses"][comp["acBusExtremity"]]
target["branch"]["$index"] = make_branch(comp, index, source_id, f_bus, t_bus, y; transformer=true, oltc)
end
branchdc_path = ["gridModelInputFile", "dcBranches"]
for comp in walkpath(source, branchdc_path)
index = lookup["dcBranches"][comp["id"]]
source_id = push!(copy(branchdc_path), comp["id"])
fbusdc = lookup["dcBuses"][comp["dcBusOrigin"]]
tbusdc = lookup["dcBuses"][comp["dcBusExtremity"]]
target["branchdc"]["$index"] = make_branchdc(comp, index, source_id, fbusdc, tbusdc, y)
end
bus_path = ["gridModelInputFile", "acBuses"]
for comp in walkpath(source, bus_path)
index = lookup["acBuses"][comp["id"]]
source_id = push!(copy(bus_path), comp["id"])
target["bus"]["$index"] = make_bus(comp, index, source_id)
end
busdc_path = ["gridModelInputFile", "dcBuses"]
for comp in walkpath(source, busdc_path)
index = lookup["dcBuses"][comp["id"]]
source_id = push!(copy(busdc_path), comp["id"])
target["busdc"]["$index"] = make_busdc(comp, index, source_id)
end
convdc_path = ["gridModelInputFile", "converters"]
for comp in walkpath(source, convdc_path)
index = lookup["converters"][comp["id"]]
source_id = push!(copy(convdc_path), comp["id"])
busac = lookup["acBuses"][comp["acBusConnected"]]
busdc = lookup["dcBuses"][comp["dcBusConnected"]]
typeac = walkpath(source, bus_path)[busac]["busType"]
target["convdc"]["$index"] = make_convdc(comp, index, source_id, busac, busdc, typeac, y)
end
gen_path = ["gridModelInputFile", "generators"]
for comp in walkpath(source, gen_path)
index = lookup["generators"][comp["id"]]
source_id = push!(copy(gen_path), comp["id"])
gen_bus = lookup["acBuses"][comp["acBusConnected"]]
target["gen"]["$index"] = make_gen(comp, index, source_id, gen_bus, y; scale_gen)
end
load_path = ["gridModelInputFile", "loads"]
for comp in walkpath(source, load_path)
index = lookup["loads"][comp["id"]]
source_id = push!(copy(load_path), comp["id"])
load_bus = lookup["acBuses"][comp["acBusConnected"]]
target["load"]["$index"] = make_load(comp, index, source_id, load_bus, y)
end
storage_path = ["gridModelInputFile", "storage"]
for comp in walkpath(source, storage_path)
index = lookup["storage"][comp["id"]]
source_id = push!(copy(storage_path), comp["id"])
storage_bus = lookup["acBuses"][comp["acBusConnected"]]
target["storage"]["$index"] = make_storage(comp, index, source_id, storage_bus, y)
end
return target
end
# Single-network containing only fixed data (i.e. data that does not depend on year), method with candidates
function nw(source::AbstractDict, lookup::AbstractDict, cand_availability::AbstractDict, y::Int; oltc::Bool, scale_gen::Real)
target = nw(source, lookup, y; oltc, scale_gen)
target["branchdc_ne"] = Dict{String,Any}()
target["busdc_ne"] = Dict{String,Any}()
target["convdc_ne"] = Dict{String,Any}()
target["ne_branch"] = Dict{String,Any}()
target["ne_storage"] = Dict{String,Any}()
bus_path = ["gridModelInputFile", "acBuses"] # Needed by convdc_ne
ne_branch_path = ["candidatesInputFile", "acBranches"]
for cand in walkpath(source, ne_branch_path)
comp = cand["acBranch"]
if cand_availability["acBranches"][comp["id"]][y]
index = lookup["cand_acBranches"][comp["id"]]
source_id = push!(copy(ne_branch_path), comp["id"])
f_bus = lookup["acBuses"][comp["acBusOrigin"]]
t_bus = lookup["acBuses"][comp["acBusExtremity"]]
t = make_branch(comp, index, source_id, f_bus, t_bus, y; transformer=false)
t["construction_cost"] = cand["invCost"][y]
t["lifetime"] = cand["lifetime"]
t["replace"] = !comp["isTransmission"] # Transmission AC branches are added in parallel to existing branches, if any; distribution AC branches replace existing ones.
target["ne_branch"]["$index"] = t
end
end
ne_branch_path = ["candidatesInputFile", "transformers"]
for cand in walkpath(source, ne_branch_path)
comp = cand["acBranch"]
if cand_availability["transformers"][comp["id"]][y]
index = lookup["cand_transformers"][comp["id"]]
source_id = push!(copy(ne_branch_path), comp["id"])
f_bus = lookup["acBuses"][comp["acBusOrigin"]]
t_bus = lookup["acBuses"][comp["acBusExtremity"]]
t = make_branch(comp, index, source_id, f_bus, t_bus, y; transformer=true, oltc)
t["construction_cost"] = cand["invCost"][y]
t["lifetime"] = cand["lifetime"]
t["replace"] = !comp["isTransmission"] # Transmission transformers are added in parallel to existing transformers, if any; distribution transformers replace existing ones.
target["ne_branch"]["$index"] = t
end
end
branchdc_ne_path = ["candidatesInputFile", "dcBranches"]
for cand in walkpath(source, branchdc_ne_path)
comp = cand["dcBranch"]
if cand_availability["dcBranches"][comp["id"]][y]
index = lookup["cand_dcBranches"][comp["id"]]
source_id = push!(copy(branchdc_ne_path), comp["id"])
fbusdc = lookup["dcBuses"][comp["dcBusOrigin"]]
tbusdc = lookup["dcBuses"][comp["dcBusExtremity"]]
t = make_branchdc(comp, index, source_id, fbusdc, tbusdc, y)
t["cost"] = cand["invCost"][y]
t["lifetime"] = cand["lifetime"]
target["branchdc_ne"]["$index"] = t
end
end
convdc_ne_path = ["candidatesInputFile", "converters"]
for cand in walkpath(source, convdc_ne_path)
comp = cand["converter"]
if cand_availability["converters"][comp["id"]][y]
index = lookup["cand_converters"][comp["id"]]
source_id = push!(copy(convdc_ne_path), comp["id"])
busac = lookup["acBuses"][comp["acBusConnected"]]
busdc = lookup["dcBuses"][comp["dcBusConnected"]]
typeac = walkpath(source, bus_path)[busac]["busType"]
t = make_convdc(comp, index, source_id, busac, busdc, typeac, y)
t["cost"] = cand["invCost"][y]
t["lifetime"] = cand["lifetime"]
target["convdc_ne"]["$index"] = t
end
end
load_path = ["candidatesInputFile", "flexibleLoads"]
for cand in walkpath(source, load_path)
comp = cand["load"]
if cand_availability["loads"][comp["id"]][y]
index = lookup["loads"][comp["id"]] # Candidate loads have the same ids as existing loads
source_id = push!(copy(load_path), comp["id"])
load_bus = lookup["acBuses"][comp["acBusConnected"]]
t = make_load(comp, index, source_id, load_bus, y)
t["cost_inv"] = cand["invCost"][y]
t["lifetime"] = cand["lifetime"]
target["load"]["$index"] = t # The candidate load replaces the existing load that has the same id and is assumed to have the same parameters as that existing load.
end
end
ne_storage_path = ["candidatesInputFile", "storage"]
for cand in walkpath(source, ne_storage_path)
comp = cand["storage"]
if cand_availability["storage"][comp["id"]][y]
index = lookup["cand_storage"][comp["id"]]
source_id = push!(copy(ne_storage_path), comp["id"])
storage_bus = lookup["acBuses"][comp["acBusConnected"]]
t = make_storage(comp, index, source_id, storage_bus, y)
t["eq_cost"] = cand["invCost"][y]
t["inst_cost"] = 0.0
t["lifetime"] = cand["lifetime"]
target["ne_storage"]["$index"] = t
end
end
return target
end
function walkpath(node::AbstractDict, path::Vector{String})
for key in path
node = node[key]
end
return node
end
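# Illustration: `walkpath` descends nested dicts along a key path, e.g.
#   walkpath(Dict("a"=>Dict("b"=>1)), ["a","b"]) == 1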
function optional_value(target::AbstractDict, target_key::String, source::AbstractDict, source_key::String)
if haskey(source, source_key)
target[target_key] = source[source_key]
end
end
function optional_value(target::AbstractDict, target_key::String, source::AbstractDict, source_key::String, y::Int)
if haskey(source, source_key) && !isempty(source[source_key])
target[target_key] = source[source_key][y]
end
end
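# Illustration (hypothetical keys, not from the API): `optional_value` copies a
# value only if the source key exists (and, in the per-year method, is non-empty):
#   d = Dict{String,Any}()
#   optional_value(d, "vmax", Dict("maxV"=>1.1), "maxV") # now d["vmax"] == 1.1
#   optional_value(d, "vmin", Dict("maxV"=>1.1), "minV") # missing key: d unchanged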
function make_branch(source::AbstractDict, index::Int, source_id::Vector{String}, f_bus::Int, t_bus::Int, y::Int; transformer::Bool, oltc::Bool=false)
target = Dict{String,Any}(
"index" => index,
"source_id" => source_id,
"f_bus" => f_bus,
"t_bus" => t_bus,
"br_status" => 1, # Assumption: all branches defined in JSON file are in service.
"transformer" => transformer,
"b_fr" => 0.0, # Assumption: all branches defined in JSON file have zero shunt susceptance.
"b_to" => 0.0, # Assumption: all branches defined in JSON file have zero shunt susceptance.
"g_fr" => 0.0, # Assumption: all branches defined in JSON file have zero shunt conductance.
"g_to" => 0.0, # Assumption: all branches defined in JSON file have zero shunt conductance.
"rate_a" => source["ratedApparentPower"][y],
"rate_c" => source["emergencyRating"],
"tap" => source["voltageTapRatio"],
"shift" => 0.0, # Assumption: all branches defined in JSON file have zero shift.
"angmin" => source["minAngleDifference"],
"angmax" => source["maxAngleDifference"],
)
if source["isTransmission"]
target["br_r"] = 0.0
target["br_x"] = 1/source["susceptance"]
else
target["br_r"] = source["resistance"]
target["br_x"] = source["reactance"]
if transformer && oltc
target["tm_max"] = 1.1
target["tm_min"] = 0.9
end
end
return target
end
function make_branchdc(source::AbstractDict, index::Int, source_id::Vector{String}, fbusdc::Int, tbusdc::Int, y::Int)
target = Dict{String,Any}(
"index" => index,
"source_id" => source_id,
"fbusdc" => fbusdc,
"tbusdc" => tbusdc,
"status" => 1, # Assumption: all branches defined in JSON file are in service.
"rateA" => source["ratedActivePower"][y],
"rateC" => source["emergencyRating"],
"r" => 0.0, # Assumption: zero resistance (the parameter is required by PowerModelsACDC but unused in lossless models).
)
return target
end
function make_bus(source::AbstractDict, index::Int, source_id::Vector{String})
target = Dict{String,Any}(
"index" => index,
"bus_i" => index,
"source_id" => source_id,
"vm" => source["nominalVoltageMagnitude"],
)
optional_value(target, "bus_type", source, "busType")
if haskey(target, "bus_type") && target["bus_type"] == 3
target["va"] = 0.0 # Set voltage angle of reference bus to 0.0
end
optional_value(target, "base_kv", source, "baseVoltage")
optional_value(target, "vmax", source, "maxVoltageMagnitude")
optional_value(target, "vmin", source, "minVoltageMagnitude")
if haskey(source, "location")
target["lat"] = source["location"][1]
target["lon"] = source["location"][2]
end
return target
end
function make_busdc(source::AbstractDict, index::Int, source_id::Vector{String})
target = Dict{String,Any}(
"index" => index,
"busdc_i" => index,
"source_id" => source_id,
"Vdc" => source["nominalVoltageMagnitude"],
"Vdcmin" => 0.9, # Assumption: minimum DC voltage is 0.9 p.u. for every DC bus
"Vdcmax" => 1.1, # Assumption: maximum DC voltage is 1.1 p.u. for every DC bus
"Pdc" => 0.0, # Assumption: power withdrawn from DC bus is 0.0 p.u.
)
optional_value(target, "basekVdc", source, "baseVoltage")
return target
end
function make_convdc(source::AbstractDict, index::Int, source_id::Vector{String}, busac::Int, busdc::Int, typeac::Int, y::Int)
target = Dict{String,Any}(
"index" => index,
"source_id" => source_id,
"status" => 1, # Assumption: all converters defined in JSON file are in service.
"busac_i" => busac,
"busdc_i" => busdc,
"type_ac" => typeac,
"type_dc" => 3, # Assumption: all converters defined in JSON file have DC droop.
"Vmmin" => 0.9, # Required by PowerModelsACDC, but not relevant, since we use an approximation where voltage magnitude is 1.0 p.u. at each AC transmission network bus
"Vmmax" => 1.1, # Required by PowerModelsACDC, but not relevant, since we use an approximation where voltage magnitude is 1.0 p.u. at each AC transmission network bus
"Pacrated" => source["ratedActivePowerAC"][y],
"Pacmin" => -source["ratedActivePowerAC"][y],
"Pacmax" => source["ratedActivePowerAC"][y],
"Qacrated" => 0.0, # Required by PowerModelsACDC, but unused in active power only models.
"LossA" => source["auxiliaryLosses"][y],
"LossB" => source["linearLosses"][y],
"LossCinv" => 0.0,
"Imax" => 0.0, # Required by PowerModelsACDC, but unused in lossless models.
"transformer" => false, # Assumption: the converter is not a transformer.
"tm" => 0.0, # Required by PowerModelsACDC, but unused, provided that the converter is not a transformer.
"rtf" => 0.0, # Required by PowerModelsACDC, but unused, provided that the converter is not a transformer.
"xtf" => 0.0, # Required by PowerModelsACDC, but unused, provided that the converter is not a transformer.
"reactor" => false, # Assumption: the converter is not a reactor.
"rc" => 0.0, # Required by PowerModelsACDC, but unused, provided that the converter is not a reactor.
"xc" => 0.0, # Required by PowerModelsACDC, but unused, provided that the converter is not a reactor.
"filter" => false, # Required by PowerModelsACDC, but unused, provided that the model is lossless.
"bf" => 0.0, # Required by PowerModelsACDC, but unused, provided that the model is lossless.
"islcc" => 0.0, # Required by PowerModelsACDC, but unused, provided that the model is DC.
)
return target
end
function make_gen(source::AbstractDict, index::Int, source_id::Vector{String}, gen_bus::Int, y::Int; scale_gen::Real)
target = Dict{String,Any}(
"index" => index,
"source_id" => source_id,
"gen_status" => 1, # Assumption: all generators defined in JSON file are in service.
"gen_bus" => gen_bus,
"qmin" => source["minReactivePower"][y],
"qmax" => source["maxReactivePower"][y],
"vg" => 1.0, # Assumption: 1.0 p.u. voltage magnitude setpoint.
"model" => 2, # Polynomial cost model
"ncost" => 2, # 2 cost coefficients: c1 and c0
"cost" => [source["generationCosts"][y], 0.0], # [c1, c0]
)
pmin = source["minActivePower"][y]
pmax = source["maxActivePower"][y]
target["pmax"] = scale_gen * pmax
if pmin == pmax # Non-dispatchable generators are characterized in JSON file by having coincident power bounds
target["dispatchable"] = false
target["pmin"] = 0.0 # Must be zero to allow for curtailment
target["cost_curt"] = source["curtailmentCosts"][y]
else
target["dispatchable"] = true
target["pmin"] = scale_gen * pmin
end
return target
end
function make_load(source::AbstractDict, index::Int, source_id::Vector{String}, load_bus::Int, y::Int)
target = Dict{String,Any}(
"index" => index,
"source_id" => source_id,
"status" => 1, # Assumption: all loads defined in JSON file are in service.
"load_bus" => load_bus,
"flex" => get(source, "isFlexible", false),
)
if haskey(source, "powerFactor")
target["pf_angle"] = acos(source["powerFactor"])
end
optional_value(target, "pred_rel_max", source, "superiorBoundNCP", y)
optional_value(target, "ered_rel_max", source, "maxEnergyNotConsumed", y)
optional_value(target, "pshift_up_rel_max", source, "superiorBoundUDS", y)
optional_value(target, "pshift_down_rel_max", source, "superiorBoundDDS", y)
optional_value(target, "tshift_up", source, "gracePeriodUDS", y)
optional_value(target, "tshift_down", source, "gracePeriodDDS", y)
optional_value(target, "eshift_rel_max", source, "maxEnergyShifted", y)
optional_value(target, "cost_curt", source, "valueOfLossLoad", y)
optional_value(target, "cost_red", source, "compensationConsumeLess", y)
optional_value(target, "cost_shift", source, "compensationDemandShift", y)
return target
end
function make_storage(source::AbstractDict, index::Int, source_id::Vector{String}, storage_bus::Int, y::Int)
target = Dict{String,Any}(
"index" => index,
"source_id" => source_id,
"status" => 1, # Assumption: all storage devices defined in JSON file are in service.
"storage_bus" => storage_bus,
"energy_rating" => source["maxEnergy"][y],
"charge_rating" => source["maxAbsActivePower"][y],
"discharge_rating" => source["maxInjActivePower"][y],
"charge_efficiency" => source["absEfficiency"][y],
"discharge_efficiency" => source["injEfficiency"][y],
"qmin" => source["minReactivePowerExchange"][y],
"qmax" => source["maxReactivePowerExchange"][y],
"self_discharge_rate" => source["selfDischargeRate"][y],
"r" => 0.0, # The JSON API does not support `r`, and neither does FlexPlan.jl (in lossless models); however, a value is required by the constraint templates `_PM.constraint_storage_losses` and `_FP.constraint_storage_losses_ne`.
"x" => 0.0, # The JSON API does not support `x`, and neither does FlexPlan.jl (in lossless models); however, a value is required by the constraint templates `_PM.constraint_storage_losses` and `_FP.constraint_storage_losses_ne`.
"p_loss" => 0.0, # The JSON API does not support `p_loss`, and neither does FlexPlan.jl; however, a value is required by the constraint templates `_PM.constraint_storage_losses` and `_FP.constraint_storage_losses_ne`.
"q_loss" => 0.0, # The JSON API does not support `q_loss`, and neither does FlexPlan.jl; however, a value is required by the constraint templates `_PM.constraint_storage_losses` and `_FP.constraint_storage_losses_ne`.
)
# JSON API does not support storage thermal rating, but a value is required by
# `FlexPlan.constraint_storage_thermal_limit`. The following expression prevents it from
# limiting active or reactive power, even in the case of octagonal approximation of
# apparent power.
target["thermal_rating"] = 2 * max(target["charge_rating"], target["discharge_rating"], target["qmax"], -target["qmin"])
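# Illustration (hypothetical ratings, not from any real input file): with charge
# and discharge ratings of 1.0 and qmax = -qmin = 0.5, `thermal_rating` is 2.0,
# larger than any feasible apparent power, so the limit never binds.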
optional_value(target, "max_energy_absorption", source, "maxEnergyYear", y)
return target
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 1197 | # Data to be added to specific nws only
function add_singular_data!(target::AbstractDict, source::AbstractDict, lookup::AbstractDict, y::Int)
first_hour_nws = string.(_FP.nw_ids(target, hour=1, year=y))
last_hour_nws = string.(_FP.nw_ids(target, hour=_FP.dim_length(target, :hour), year=y))
for comp in source["scenarioDataInputFile"]["storage"]
index = lookup["storage"][comp["id"]]
for s in 1:_FP.dim_length(target, :scenario)
target["nw"][first_hour_nws[s]]["storage"]["$index"]["energy"] = comp["initEnergy"][s][y]
target["nw"][last_hour_nws[s]]["storage"]["$index"]["energy"] = comp["finalEnergy"][s][y]
end
end
if haskey(source, "candidatesInputFile")
for cand in source["candidatesInputFile"]["storage"]
comp = cand["storageData"]
index = lookup["cand_storage"][comp["id"]]
for s in 1:_FP.dim_length(target, :scenario)
target["nw"][first_hour_nws[s]]["ne_storage"]["$index"]["energy"] = comp["initEnergy"][s][y]
target["nw"][last_hour_nws[s]]["ne_storage"]["$index"]["energy"] = comp["finalEnergy"][s][y]
end
end
end
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 9425 | """
convert_JSON_td(file; <keyword arguments>)
convert_JSON_td(dict; <keyword arguments>)
Convert a JSON `file` or a `dict` conforming to the FlexPlan WP3 API and containing both transmission and distribution networks.
Output:
- one dict containing data related to the transmission network;
- one vector of dicts containing data related to distribution networks.
# Arguments
Refer to `convert_JSON` documentation.
"""
function convert_JSON_td end
function convert_JSON_td(file::String; kwargs...)
source_dict = JSON.parsefile(file)
convert_JSON_td(source_dict; kwargs...)
end
function convert_JSON_td(source::AbstractDict; kwargs...)
source_transmission, source_distribution = split_td(source)
target_transmission = convert_JSON(source_transmission; kwargs...)
target_distribution = Vector{typeof(target_transmission)}(undef,length(source_distribution))
Threads.@threads for i in eachindex(target_distribution)
target_distribution[i] = convert_JSON(source_distribution[i]; kwargs...)
target_distribution[i]["t_bus"] = find_t_bus(source_distribution[i], target_transmission)
end
return target_transmission, target_distribution
end
function find_t_bus(source_distribution, target_transmission)
dist_id = source_distribution["genericParameters"]["thisDistributionNetwork"]
pcc_bus_name = source_distribution["genericParameters"]["allDistributionNetworks"][dist_id]
pcc_bus_id = findfirst(b->last(b["source_id"])==pcc_bus_name, target_transmission["nw"]["1"]["bus"])
return parse(Int,pcc_bus_id)
end
function split_td(source::AbstractDict)
transmission_task = Threads.@spawn extract_transmission(source)
distribution = Vector{Dict{String,Any}}(undef, length(source["genericParameters"]["allDistributionNetworks"]))
dist_info = collect(source["genericParameters"]["allDistributionNetworks"])
Threads.@threads for i in eachindex(dist_info)
dist_id, pcc_bus_id = dist_info[i]
distribution[i] = extract_distribution(source, dist_id, pcc_bus_id)
end
transmission = fetch(transmission_task)
return transmission, distribution
end
function extract_transmission(source::AbstractDict)
transmission = Dict{String,Any}()
transmission["genericParameters"] = source["genericParameters"]
transmission["gridModelInputFile"] = Dict{String,Any}()
transmission["scenarioDataInputFile"] = Dict{String,Any}()
if haskey(source, "candidatesInputFile")
transmission["candidatesInputFile"] = Dict{String,Any}()
end
# Transmission components
for comp in ["dcBuses", "dcBranches", "converters", "psts"]
if haskey(source["gridModelInputFile"], comp)
transmission["gridModelInputFile"][comp] = source["gridModelInputFile"][comp]
end
if haskey(source, "candidatesInputFile") && haskey(source["candidatesInputFile"], comp)
transmission["candidatesInputFile"][comp] = source["candidatesInputFile"][comp]
end
end
# T&D components having `isTransmission` key
for comp in ["acBuses", "acBranches", "transformers"]
if haskey(source["gridModelInputFile"], comp)
transmission["gridModelInputFile"][comp] = filter(device -> device["isTransmission"], source["gridModelInputFile"][comp])
end
if haskey(source, "candidatesInputFile") && haskey(source["candidatesInputFile"], comp)
transmission["candidatesInputFile"][comp] = filter(cand -> cand[cand_name_from_dict(comp)]["isTransmission"], source["candidatesInputFile"][comp])
end
end
# T&D components not having `isTransmission` key
transmission_acBuses = Set(bus["id"] for bus in transmission["gridModelInputFile"]["acBuses"])
for comp in ["storage", "generators", "loads", "flexibleLoads"]
if haskey(source["gridModelInputFile"], comp)
transmission["gridModelInputFile"][comp] = filter(device -> device["acBusConnected"]∈transmission_acBuses, source["gridModelInputFile"][comp])
end
if haskey(source, "candidatesInputFile") && haskey(source["candidatesInputFile"], comp)
transmission["candidatesInputFile"][comp] = filter(cand -> cand[cand_name_from_dict(comp)]["acBusConnected"]∈transmission_acBuses, source["candidatesInputFile"][comp])
end
if haskey(source["scenarioDataInputFile"], comp)
transmission_comp = Set(c["id"] for c in transmission["gridModelInputFile"][comp])
transmission["scenarioDataInputFile"][comp] = filter(device -> device["id"]∈transmission_comp, source["scenarioDataInputFile"][comp])
end
end
return transmission
end
function extract_distribution(source::AbstractDict, dist_id, pcc_bus_id)
dist = Dict{String,Any}()
dist["genericParameters"] = copy(source["genericParameters"])
dist["genericParameters"]["thisDistributionNetwork"] = dist_id # Not in API, but useful.
dist["gridModelInputFile"] = Dict{String,Any}()
dist["scenarioDataInputFile"] = Dict{String,Any}()
if haskey(source, "candidatesInputFile")
dist["candidatesInputFile"] = Dict{String,Any}()
end
# Transmission components
for comp in ["dcBuses", "dcBranches", "converters", "psts"]
dist["gridModelInputFile"][comp] = Vector{Dict{String,Any}}()
if haskey(source, "candidatesInputFile")
dist["candidatesInputFile"][comp] = Vector{Dict{String,Any}}()
end
end
# T&D components having `isTransmission` key
for comp in ["acBuses", "acBranches", "transformers"]
if haskey(source["gridModelInputFile"], comp)
dist["gridModelInputFile"][comp] = filter(device -> !device["isTransmission"]&&device["distributionNetworkId"]==dist_id, source["gridModelInputFile"][comp])
end
if haskey(source, "candidatesInputFile") && haskey(source["candidatesInputFile"], comp)
dist["candidatesInputFile"][comp] = filter(source["candidatesInputFile"][comp]) do cand
device = cand[cand_name_from_dict(comp)]
!device["isTransmission"] && device["distributionNetworkId"]==dist_id
end
end
end
# T&D components not having `isTransmission` key
dist_acBuses = Set(bus["id"] for bus in dist["gridModelInputFile"]["acBuses"])
for comp in ["storage", "generators", "loads", "flexibleLoads"]
if haskey(source["gridModelInputFile"], comp)
dist["gridModelInputFile"][comp] = filter(device -> device["acBusConnected"]∈dist_acBuses, source["gridModelInputFile"][comp])
end
if haskey(source, "candidatesInputFile") && haskey(source["candidatesInputFile"], comp)
dist["candidatesInputFile"][comp] = filter(cand -> cand[cand_name_from_dict(comp)]["acBusConnected"]∈dist_acBuses, source["candidatesInputFile"][comp])
end
if haskey(source["scenarioDataInputFile"], comp)
dist_comp = Set(c["id"] for c in dist["gridModelInputFile"][comp])
dist["scenarioDataInputFile"][comp] = filter(device -> device["id"]∈dist_comp, source["scenarioDataInputFile"][comp])
end
end
# Add reference bus
pos = findfirst(bus -> bus["id"]==pcc_bus_id, source["gridModelInputFile"]["acBuses"])
pcc_bus = copy(source["gridModelInputFile"]["acBuses"][pos]) # Original in transmission must not be modified
pcc_bus["busType"] = 3 # Slack bus
push!(dist["gridModelInputFile"]["acBuses"], pcc_bus)
# Add reference generator
number_of_years = length(source["genericParameters"]["years"])
init = zeros(number_of_years)
rated_power = (
sum(t["ratedApparentPower"] for t in dist["gridModelInputFile"]["transformers"] if !t["isTransmission"] && (t["acBusOrigin"]==pcc_bus_id || t["acBusExtremity"]==pcc_bus_id); init)
+ sum(t["acBranch"]["ratedApparentPower"] for t in dist["candidatesInputFile"]["transformers"] if !t["acBranch"]["isTransmission"] && (t["acBranch"]["acBusOrigin"]==pcc_bus_id || t["acBranch"]["acBusExtremity"]==pcc_bus_id); init)
+ sum(b["ratedApparentPower"] for b in dist["gridModelInputFile"]["acBranches"] if !b["isTransmission"] && (b["acBusOrigin"]==pcc_bus_id || b["acBusExtremity"]==pcc_bus_id); init)
+ sum(b["acBranch"]["ratedApparentPower"] for b in dist["candidatesInputFile"]["acBranches"] if !b["acBranch"]["isTransmission"] && (b["acBranch"]["acBusOrigin"]==pcc_bus_id || b["acBranch"]["acBusExtremity"]==pcc_bus_id); init)
)
estimated_cost = source["genericParameters"]["estimateCostTdExchange"]
pcc_gen = Dict{String,Any}(
"id" => "PCC",
"acBusConnected" => pcc_bus["id"],
"maxActivePower" => rated_power, # Will be limited by distribution network constraints, no need for a tight bound here.
"minActivePower" => -rated_power, # Will be limited by distribution network constraints, no need for a tight bound here.
"maxReactivePower" => rated_power, # Will be limited by distribution network constraints, no need for a tight bound here.
"minReactivePower" => -rated_power, # Will be limited by distribution network constraints, no need for a tight bound here.
"generationCosts" => repeat([estimated_cost], number_of_years),
"curtailmentCosts" => zeros(number_of_years)
)
push!(dist["gridModelInputFile"]["generators"], pcc_gen)
return dist
end
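# The two `extract_*` functions above partition a single source dict by repeatedly
# filtering component lists. A minimal standalone sketch of the bus-membership
# filter idiom (toy data; the dict keys mirror the ones used above):

```julia
# Toy illustration of the `acBusConnected ∈ bus_set` filter used in
# extract_transmission/extract_distribution (not real network data).
devices = [
    Dict("id" => "g1", "acBusConnected" => 1),
    Dict("id" => "g2", "acBusConnected" => 2),
    Dict("id" => "g3", "acBusConnected" => 1),
]
kept_buses = Set([1])  # buses belonging to the subnetwork being extracted
kept = filter(d -> d["acBusConnected"] ∈ kept_buses, devices)
@assert [d["id"] for d in kept] == ["g1", "g3"]
```

# Using a `Set` for the membership test keeps each lookup constant-time, which
# matters because the same filter runs once per component list.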
# Time series having length == number_of_hours * number_of_scenarios
function make_time_series(source::AbstractDict, lookup::AbstractDict, y::Int, sn_data::AbstractDict; number_of_hours::Int, number_of_scenarios::Int, scale_load::Real)
target = Dict{String,Any}(
"gen" => Dict{String,Any}(),
"load" => Dict{String,Any}(),
"storage" => Dict{String,Any}(),
)
for comp in source["scenarioDataInputFile"]["generators"]
index = lookup["generators"][comp["id"]]
if haskey(comp, "capacityFactor")
p = ts_vector(comp, "capacityFactor", y; number_of_hours, number_of_scenarios) .* sn_data["gen"]["$index"]["pmax"]
target["gen"]["$index"] = Dict{String,Any}("pmax" => p)
end
end
for comp in source["scenarioDataInputFile"]["loads"]
index = lookup["loads"][comp["id"]]
pd = ts_vector(comp, "demandReference", y; number_of_hours, number_of_scenarios)
target["load"]["$index"] = Dict{String,Any}("pd" => scale_load*pd)
end
for comp in source["scenarioDataInputFile"]["storage"]
index = lookup["storage"][comp["id"]]
target["storage"]["$index"] = Dict{String,Any}()
if haskey(comp, "powerExternalProcess")
p = ts_vector(comp, "powerExternalProcess", y; number_of_hours, number_of_scenarios)
target["storage"]["$index"]["stationary_energy_inflow"] = max.(p,0.0)
target["storage"]["$index"]["stationary_energy_outflow"] = -min.(p,0.0)
else
target["storage"]["$index"]["stationary_energy_inflow"] = zeros(number_of_hours*number_of_scenarios)
target["storage"]["$index"]["stationary_energy_outflow"] = zeros(number_of_hours*number_of_scenarios)
end
if haskey(comp, "maxAbsActivePower")
target["storage"]["$index"]["charge_rating"] = ts_vector(comp, "maxAbsActivePower", y; number_of_hours, number_of_scenarios) * sn_data["storage"]["$index"]["charge_rating"]
end
if haskey(comp, "maxInjActivePower")
target["storage"]["$index"]["discharge_rating"] = ts_vector(comp, "maxInjActivePower", y; number_of_hours, number_of_scenarios) * sn_data["storage"]["$index"]["discharge_rating"]
end
end
if haskey(source, "candidatesInputFile")
target["ne_storage"] = Dict{String,Any}()
for cand in source["candidatesInputFile"]["storage"]
comp = cand["storageData"]
index = lookup["cand_storage"][comp["id"]]
target["ne_storage"]["$index"] = Dict{String,Any}()
if haskey(comp, "powerExternalProcess")
p = ts_vector(comp, "powerExternalProcess", y; number_of_hours, number_of_scenarios)
target["ne_storage"]["$index"]["stationary_energy_inflow"] = max.(p,0.0)
target["ne_storage"]["$index"]["stationary_energy_outflow"] = -min.(p,0.0)
else
target["ne_storage"]["$index"]["stationary_energy_inflow"] = zeros(number_of_hours*number_of_scenarios)
target["ne_storage"]["$index"]["stationary_energy_outflow"] = zeros(number_of_hours*number_of_scenarios)
end
if haskey(comp, "maxAbsActivePower")
target["ne_storage"]["$index"]["charge_rating"] = ts_vector(comp, "maxAbsActivePower", y; number_of_hours, number_of_scenarios) * sn_data["ne_storage"]["$index"]["charge_rating"]
end
if haskey(comp, "maxInjActivePower")
target["ne_storage"]["$index"]["discharge_rating"] = ts_vector(comp, "maxInjActivePower", y; number_of_hours, number_of_scenarios) * sn_data["ne_storage"]["$index"]["discharge_rating"]
end
end
end
return target
end
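# The `max.(p,0.0)` / `-min.(p,0.0)` pair used above decomposes the signed
# `powerExternalProcess` profile into two nonnegative series; a self-contained
# sketch on toy numbers:

```julia
# Split a signed power profile into nonnegative inflow/outflow parts,
# mirroring the stationary_energy_inflow/outflow assignments above.
p = [3.0, -2.0, 0.0, 1.5]
inflow = max.(p, 0.0)    # positive part
outflow = -min.(p, 0.0)  # magnitude of the negative part
@assert all(inflow .>= 0.0) && all(outflow .>= 0.0)
@assert inflow .- outflow == p  # the decomposition is lossless
```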
# Time series data vector from JSON component dict
function ts_vector(comp::AbstractDict, key::String, y::Int; number_of_hours::Int, number_of_scenarios::Int)
# `comp[key]` is a Vector{Vector{Vector{Any}}} containing data of each scenario, year and hour
# Returned value is a Vector{Float64} containing data of year y and each scenario and hour
return mapreduce(Vector{Float64}, vcat, comp[key][s][y][1:number_of_hours] for s in 1:number_of_scenarios)
end
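# To make the flattening concrete, here is `ts_vector`'s `mapreduce` pattern
# applied to hypothetical toy data with the same scenario → year → hour nesting
# (values invented for illustration):

```julia
# 2 scenarios × 1 year × 3 hours, Any-typed as it would be after JSON parsing.
data = Any[Any[Any[1, 2, 3]], Any[Any[4, 5, 6]]]
number_of_scenarios = 2
number_of_hours = 2
y = 1
# Convert each scenario's first `number_of_hours` values to Float64 and concatenate.
v = mapreduce(Vector{Float64}, vcat, data[s][y][1:number_of_hours] for s in 1:number_of_scenarios)
@assert v == [1.0, 2.0, 4.0, 5.0]
@assert length(v) == number_of_hours * number_of_scenarios
```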
|
"TNEP with flexible loads and storage, for transmission networks"
function flex_tnep(data::Dict{String,Any}, model_type::Type, optimizer; kwargs...)
require_dim(data, :hour, :year)
return _PM.solve_model(
data, model_type, optimizer, build_flex_tnep;
ref_extensions = [_PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
solution_processors = [_PM.sol_data_model!],
multinetwork = true,
kwargs...
)
end
"TNEP with flexible loads and storage, for distribution networks"
function flex_tnep(data::Dict{String,Any}, model_type::Type{<:_PM.AbstractBFModel}, optimizer; kwargs...)
require_dim(data, :hour, :year)
return _PM.solve_model(
data, model_type, optimizer, build_flex_tnep;
ref_extensions = [_PM.ref_add_on_off_va_bounds!, ref_add_ne_branch_allbranches!, ref_add_frb_branch!, ref_add_oltc_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
solution_processors = [_PM.sol_data_model!],
multinetwork = true,
kwargs...
)
end
"TNEP with flexible loads and storage, combines transmission and distribution networks"
function flex_tnep(t_data::Dict{String,Any}, d_data::Vector{Dict{String,Any}}, t_model_type::Type{<:_PM.AbstractPowerModel}, d_model_type::Type{<:_PM.AbstractPowerModel}, optimizer; kwargs...)
require_dim(t_data, :hour, :year)
for data in d_data
require_dim(data, :hour, :year)
end
return solve_model(
t_data, d_data, t_model_type, d_model_type, optimizer, build_flex_tnep;
t_ref_extensions = [_PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
d_ref_extensions = [_PM.ref_add_on_off_va_bounds!, ref_add_ne_branch_allbranches!, ref_add_frb_branch!, ref_add_oltc_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
t_solution_processors = [_PM.sol_data_model!],
d_solution_processors = [_PM.sol_data_model!, sol_td_coupling!],
kwargs...
)
end
# Problem definition: a declaration of the variables and constraints of the
# optimization model, which is then passed to the solver.
"Builds transmission model."
function build_flex_tnep(pm::_PM.AbstractPowerModel; objective::Bool=true)
    # VARIABLES: variables already defined in PowerModels(ACDC) can be used directly; the others are defined in the corresponding sections of this package
for n in nw_ids(pm)
# AC Bus
_PM.variable_bus_voltage(pm; nw = n)
# AC branch
_PM.variable_branch_power(pm; nw = n)
# DC bus
_PMACDC.variable_dcgrid_voltage_magnitude(pm; nw = n)
# DC branch
_PMACDC.variable_active_dcbranch_flow(pm; nw = n)
_PMACDC.variable_dcbranch_current(pm; nw = n)
# AC-DC converter
_PMACDC.variable_dc_converter(pm; nw = n)
# Generator
_PM.variable_gen_power(pm; nw = n)
expression_gen_curtailment(pm; nw = n)
# Storage
_PM.variable_storage_power(pm; nw = n)
variable_absorbed_energy(pm; nw = n)
# Candidate AC branch
variable_ne_branch_investment(pm; nw = n)
variable_ne_branch_indicator(pm; nw = n, relax=true) # FlexPlan version: replaces _PM.variable_ne_branch_indicator().
_PM.variable_ne_branch_power(pm; nw = n)
_PM.variable_ne_branch_voltage(pm; nw = n)
# Candidate DC bus
_PMACDC.variable_dcgrid_voltage_magnitude_ne(pm; nw = n)
# Candidate DC branch
variable_ne_branchdc_investment(pm; nw = n)
variable_ne_branchdc_indicator(pm; nw = n, relax=true) # FlexPlan version: replaces _PMACDC.variable_branch_ne().
_PMACDC.variable_active_dcbranch_flow_ne(pm; nw = n)
_PMACDC.variable_dcbranch_current_ne(pm; nw = n)
# Candidate AC-DC converter
variable_dc_converter_ne(pm; nw = n) # FlexPlan version: replaces _PMACDC.variable_dc_converter_ne().
_PMACDC.variable_voltage_slack(pm; nw = n)
# Candidate storage
variable_storage_power_ne(pm; nw = n)
variable_absorbed_energy_ne(pm; nw = n)
# Flexible demand
variable_flexible_demand(pm; nw = n)
variable_energy_not_consumed(pm; nw = n)
variable_total_demand_shifting_upwards(pm; nw = n)
variable_total_demand_shifting_downwards(pm; nw = n)
end
# OBJECTIVE: see objective.jl
if objective
objective_min_cost_flex(pm)
end
    # CONSTRAINTS: constraints already defined in PowerModels(ACDC) can be used directly; the others are defined in the corresponding sections of this package
for n in nw_ids(pm)
_PM.constraint_model_voltage(pm; nw = n)
_PM.constraint_ne_model_voltage(pm; nw = n)
_PMACDC.constraint_voltage_dc(pm; nw = n)
_PMACDC.constraint_voltage_dc_ne(pm; nw = n)
for i in _PM.ids(pm, n, :ref_buses)
_PM.constraint_theta_ref(pm, i, nw = n)
end
for i in _PM.ids(pm, n, :bus)
constraint_power_balance_acne_dcne_flex(pm, i; nw = n)
end
if haskey(pm.setting, "allow_line_replacement") && pm.setting["allow_line_replacement"] == true
for i in _PM.ids(pm, n, :branch)
constraint_ohms_yt_from_repl(pm, i; nw = n)
constraint_ohms_yt_to_repl(pm, i; nw = n)
constraint_voltage_angle_difference_repl(pm, i; nw = n)
constraint_thermal_limit_from_repl(pm, i; nw = n)
constraint_thermal_limit_to_repl(pm, i; nw = n)
end
else
for i in _PM.ids(pm, n, :branch)
_PM.constraint_ohms_yt_from(pm, i; nw = n)
_PM.constraint_ohms_yt_to(pm, i; nw = n)
_PM.constraint_voltage_angle_difference(pm, i; nw = n)
_PM.constraint_thermal_limit_from(pm, i; nw = n)
_PM.constraint_thermal_limit_to(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :ne_branch)
_PM.constraint_ne_ohms_yt_from(pm, i; nw = n)
_PM.constraint_ne_ohms_yt_to(pm, i; nw = n)
_PM.constraint_ne_voltage_angle_difference(pm, i; nw = n)
_PM.constraint_ne_thermal_limit_from(pm, i; nw = n)
_PM.constraint_ne_thermal_limit_to(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :busdc)
_PMACDC.constraint_power_balance_dc_dcne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :busdc_ne)
_PMACDC.constraint_power_balance_dcne_dcne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branchdc)
_PMACDC.constraint_ohms_dc_branch(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branchdc_ne)
_PMACDC.constraint_ohms_dc_branch_ne(pm, i; nw = n)
_PMACDC.constraint_branch_limit_on_off(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :convdc)
_PMACDC.constraint_converter_losses(pm, i; nw = n)
_PMACDC.constraint_converter_current(pm, i; nw = n)
_PMACDC.constraint_conv_transformer(pm, i; nw = n)
_PMACDC.constraint_conv_reactor(pm, i; nw = n)
_PMACDC.constraint_conv_filter(pm, i; nw = n)
if _PM.ref(pm,n,:convdc,i,"islcc") == 1
_PMACDC.constraint_conv_firing_angle(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :convdc_ne)
_PMACDC.constraint_converter_losses_ne(pm, i; nw = n)
_PMACDC.constraint_converter_current_ne(pm, i; nw = n)
_PMACDC.constraint_converter_limit_on_off(pm, i; nw = n)
_PMACDC.constraint_conv_transformer_ne(pm, i; nw = n)
_PMACDC.constraint_conv_reactor_ne(pm, i; nw = n)
_PMACDC.constraint_conv_filter_ne(pm, i; nw = n)
if _PM.ref(pm,n,:convdc_ne,i,"islcc") == 1
_PMACDC.constraint_conv_firing_angle_ne(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :flex_load)
constraint_total_flexible_demand(pm, i; nw = n)
constraint_flex_bounds_ne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :fixed_load)
constraint_total_fixed_demand(pm, i; nw = n)
end
for i in _PM.ids(pm, :storage, nw=n)
constraint_storage_excl_slack(pm, i, nw = n)
_PM.constraint_storage_thermal_limit(pm, i, nw = n)
_PM.constraint_storage_losses(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw=n)
constraint_storage_excl_slack_ne(pm, i, nw = n)
constraint_storage_thermal_limit_ne(pm, i, nw = n)
constraint_storage_losses_ne(pm, i, nw = n)
constraint_storage_bounds_ne(pm, i, nw = n)
end
if is_first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, nw = n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_red_state(pm, i, nw = n)
constraint_shift_up_state(pm, i, nw = n)
constraint_shift_down_state(pm, i, nw = n)
end
else
if is_last_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state_final(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_final_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_shift_state_final(pm, i, nw = n)
end
end
# From second hour to last hour
prev_n = prev_id(pm, n, :hour)
first_n = first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_red_state(pm, i, prev_n, n)
constraint_shift_up_state(pm, i, prev_n, n)
constraint_shift_down_state(pm, i, prev_n, n)
constraint_shift_duration(pm, i, first_n, n)
end
end
# Constraints on investments
if is_first_id(pm, n, :hour)
# Inter-year constraints
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :branchdc_ne; nw = n)
constraint_ne_branchdc_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :convdc_ne; nw = n)
constraint_ne_converter_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :flex_load; nw = n)
constraint_flexible_demand_activation(pm, i, prev_nws, n)
end
end
end
end
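# The hour-indexed branches above (first hour, consecutive pairs, last hour)
# implement a standard intertemporal recursion. The sketch below shows the
# simplest storage-state version of it on plain numbers, detached from JuMP;
# the single efficiency `η` is an assumption for illustration only — the actual
# constraints distinguish charge and discharge efficiencies.

```julia
# Illustration only: the storage-state recursion that the
# constraint_storage_state calls link across consecutive hours.
η  = 0.9               # efficiency placeholder (assumed)
Δt = 1.0               # time step in hours
p  = [2.0, 1.0, -1.5]  # net charging power per hour (negative = discharge)
E0 = 5.0               # initial stored energy
E  = similar(p)
for n in eachindex(p)
    prev = n == 1 ? E0 : E[n-1]  # first hour links to the initial state
    E[n] = prev + η * p[n] * Δt
end
@assert E[end] ≈ E0 + η * sum(p) * Δt  # the recursion telescopes over the horizon
```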
"Builds distribution model."
function build_flex_tnep(pm::_PM.AbstractBFModel; objective::Bool=true, intertemporal_constraints::Bool=true)
for n in nw_ids(pm)
# AC Bus
_PM.variable_bus_voltage(pm; nw = n)
# AC branch
_PM.variable_branch_power(pm; nw = n)
_PM.variable_branch_current(pm; nw = n)
variable_oltc_branch_transform(pm; nw = n)
# Generator
_PM.variable_gen_power(pm; nw = n)
expression_gen_curtailment(pm; nw = n)
# Storage
_PM.variable_storage_power(pm; nw = n)
variable_absorbed_energy(pm; nw = n)
# Candidate AC branch
variable_ne_branch_investment(pm; nw = n)
variable_ne_branch_indicator(pm; nw = n, relax=true) # FlexPlan version: replaces _PM.variable_ne_branch_indicator().
_PM.variable_ne_branch_power(pm; nw = n, bounded = false) # Bounds computed here would be too limiting in the case of ne_branches added in parallel
variable_ne_branch_current(pm; nw = n)
variable_oltc_ne_branch_transform(pm; nw = n)
# Candidate storage
variable_storage_power_ne(pm; nw = n)
variable_absorbed_energy_ne(pm; nw = n)
# Flexible demand
variable_flexible_demand(pm; nw = n)
variable_energy_not_consumed(pm; nw = n)
variable_total_demand_shifting_upwards(pm; nw = n)
variable_total_demand_shifting_downwards(pm; nw = n)
end
if objective
objective_min_cost_flex(pm)
end
for n in nw_ids(pm)
_PM.constraint_model_current(pm; nw = n)
constraint_ne_model_current(pm; nw = n)
if haskey(dim_prop(pm), :sub_nw)
constraint_td_coupling_power_reactive_bounds(pm, get(pm.setting, "qs_ratio_bound", 0.48); nw = n)
end
for i in _PM.ids(pm, n, :ref_buses)
_PM.constraint_theta_ref(pm, i, nw = n)
end
for i in _PM.ids(pm, n, :bus)
constraint_power_balance_acne_flex(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branch)
constraint_dist_branch_tnep(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :ne_branch)
constraint_dist_ne_branch_tnep(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :flex_load)
constraint_total_flexible_demand(pm, i; nw = n)
constraint_flex_bounds_ne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :fixed_load)
constraint_total_fixed_demand(pm, i; nw = n)
end
for i in _PM.ids(pm, :storage, nw=n)
constraint_storage_excl_slack(pm, i, nw = n)
_PM.constraint_storage_thermal_limit(pm, i, nw = n)
_PM.constraint_storage_losses(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw=n)
constraint_storage_excl_slack_ne(pm, i, nw = n)
constraint_storage_thermal_limit_ne(pm, i, nw = n)
constraint_storage_losses_ne(pm, i, nw = n)
constraint_storage_bounds_ne(pm, i, nw = n)
end
if intertemporal_constraints
if is_first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, nw = n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_red_state(pm, i, nw = n)
constraint_shift_up_state(pm, i, nw = n)
constraint_shift_down_state(pm, i, nw = n)
end
else
if is_last_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state_final(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_final_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_shift_state_final(pm, i, nw = n)
end
end
# From second hour to last hour
prev_n = prev_id(pm, n, :hour)
first_n = first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_red_state(pm, i, prev_n, n)
constraint_shift_up_state(pm, i, prev_n, n)
constraint_shift_down_state(pm, i, prev_n, n)
constraint_shift_duration(pm, i, first_n, n)
end
end
end
# Constraints on investments
if is_first_id(pm, n, :hour)
for i in _PM.ids(pm, n, :branch)
if !isempty(ne_branch_ids(pm, i; nw = n))
constraint_branch_complementarity(pm, i; nw = n)
end
end
# Inter-year constraints
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :flex_load; nw = n)
constraint_flexible_demand_activation(pm, i, prev_nws, n)
end
end
end
end
"Builds combined transmission and distribution model."
function build_flex_tnep(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractBFModel)
# Transmission variables and constraints
build_flex_tnep(t_pm; objective = false)
# Distribution variables and constraints
build_flex_tnep(d_pm; objective = false)
# Variables related to the combined model
# (No new variables are needed here.)
# Constraints related to the combined model
constraint_td_coupling(t_pm, d_pm)
# Objective function of the combined model
objective_min_cost_flex(t_pm, d_pm)
end
|
"Multi-scenario TNEP with flexible loads and storage, for transmission networks"
function simple_stoch_flex_tnep(data::Dict{String,Any}, model_type::Type, optimizer; kwargs...)
require_dim(data, :hour, :scenario, :year)
return _PM.solve_model(
data, model_type, optimizer, build_simple_stoch_flex_tnep;
ref_extensions = [_PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
solution_processors = [_PM.sol_data_model!],
multinetwork = true,
kwargs...
)
end
"Multi-scenario TNEP with flexible loads and storage, for distribution networks"
function simple_stoch_flex_tnep(data::Dict{String,Any}, model_type::Type{<:_PM.AbstractBFModel}, optimizer; kwargs...)
require_dim(data, :hour, :scenario, :year)
return _PM.solve_model(
data, model_type, optimizer, build_simple_stoch_flex_tnep;
ref_extensions = [_PM.ref_add_on_off_va_bounds!, ref_add_ne_branch_allbranches!, ref_add_frb_branch!, ref_add_oltc_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
solution_processors = [_PM.sol_data_model!],
multinetwork = true,
kwargs...
)
end
"Multi-scenario TNEP with flexible loads and storage, combines transmission and distribution networks"
function simple_stoch_flex_tnep(t_data::Dict{String,Any}, d_data::Vector{Dict{String,Any}}, t_model_type::Type{<:_PM.AbstractPowerModel}, d_model_type::Type{<:_PM.AbstractPowerModel}, optimizer; kwargs...)
require_dim(t_data, :hour, :scenario, :year)
for data in d_data
require_dim(data, :hour, :scenario, :year)
end
return solve_model(
t_data, d_data, t_model_type, d_model_type, optimizer, build_simple_stoch_flex_tnep;
t_ref_extensions = [_PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
d_ref_extensions = [_PM.ref_add_on_off_va_bounds!, ref_add_ne_branch_allbranches!, ref_add_frb_branch!, ref_add_oltc_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
t_solution_processors = [_PM.sol_data_model!],
d_solution_processors = [_PM.sol_data_model!, sol_td_coupling!],
kwargs...
)
end
# Problem definition: a declaration of the variables and constraints of the
# optimization model, which is then passed to the solver.
"Builds transmission model."
function build_simple_stoch_flex_tnep(pm::_PM.AbstractPowerModel; objective::Bool=true, investment::Bool=true)
    # VARIABLES: variables already defined in PowerModels(ACDC) can be used directly; the others are defined in the corresponding sections of this package
for n in nw_ids(pm)
# AC Bus
_PM.variable_bus_voltage(pm; nw = n)
# AC branch
_PM.variable_branch_power(pm; nw = n)
# DC bus
_PMACDC.variable_dcgrid_voltage_magnitude(pm; nw = n)
# DC branch
_PMACDC.variable_active_dcbranch_flow(pm; nw = n)
_PMACDC.variable_dcbranch_current(pm; nw = n)
# AC-DC converter
_PMACDC.variable_dc_converter(pm; nw = n)
# Generator
_PM.variable_gen_power(pm; nw = n)
expression_gen_curtailment(pm; nw = n)
# Storage
_PM.variable_storage_power(pm; nw = n)
variable_absorbed_energy(pm; nw = n)
# Candidate AC branch
investment && variable_ne_branch_investment(pm; nw = n)
variable_ne_branch_indicator(pm; nw = n, relax=true) # FlexPlan version: replaces _PM.variable_ne_branch_indicator().
_PM.variable_ne_branch_power(pm; nw = n)
_PM.variable_ne_branch_voltage(pm; nw = n)
# Candidate DC bus
_PMACDC.variable_dcgrid_voltage_magnitude_ne(pm; nw = n)
# Candidate DC branch
investment && variable_ne_branchdc_investment(pm; nw = n)
variable_ne_branchdc_indicator(pm; nw = n, relax=true) # FlexPlan version: replaces _PMACDC.variable_branch_ne().
_PMACDC.variable_active_dcbranch_flow_ne(pm; nw = n)
_PMACDC.variable_dcbranch_current_ne(pm; nw = n)
# Candidate AC-DC converter
variable_dc_converter_ne(pm; nw = n, investment) # FlexPlan version: replaces _PMACDC.variable_dc_converter_ne().
_PMACDC.variable_voltage_slack(pm; nw = n)
# Candidate storage
variable_storage_power_ne(pm; nw = n, investment)
variable_absorbed_energy_ne(pm; nw = n)
# Flexible demand
variable_flexible_demand(pm; nw = n, investment)
end
# OBJECTIVE: see objective.jl
if objective
objective_stoch_flex(pm; investment, operation=true)
end
    # CONSTRAINTS: constraints already defined in PowerModels(ACDC) can be used directly; the others are defined in the corresponding sections of this package
for n in nw_ids(pm)
_PM.constraint_model_voltage(pm; nw = n)
_PM.constraint_ne_model_voltage(pm; nw = n)
_PMACDC.constraint_voltage_dc(pm; nw = n)
_PMACDC.constraint_voltage_dc_ne(pm; nw = n)
for i in _PM.ids(pm, n, :ref_buses)
_PM.constraint_theta_ref(pm, i, nw = n)
end
for i in _PM.ids(pm, n, :bus)
constraint_power_balance_acne_dcne_flex(pm, i; nw = n)
end
if haskey(pm.setting, "allow_line_replacement") && pm.setting["allow_line_replacement"] == true
for i in _PM.ids(pm, n, :branch)
constraint_ohms_yt_from_repl(pm, i; nw = n)
constraint_ohms_yt_to_repl(pm, i; nw = n)
constraint_voltage_angle_difference_repl(pm, i; nw = n)
constraint_thermal_limit_from_repl(pm, i; nw = n)
constraint_thermal_limit_to_repl(pm, i; nw = n)
end
else
for i in _PM.ids(pm, n, :branch)
_PM.constraint_ohms_yt_from(pm, i; nw = n)
_PM.constraint_ohms_yt_to(pm, i; nw = n)
_PM.constraint_voltage_angle_difference(pm, i; nw = n)
_PM.constraint_thermal_limit_from(pm, i; nw = n)
_PM.constraint_thermal_limit_to(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :ne_branch)
_PM.constraint_ne_ohms_yt_from(pm, i; nw = n)
_PM.constraint_ne_ohms_yt_to(pm, i; nw = n)
_PM.constraint_ne_voltage_angle_difference(pm, i; nw = n)
_PM.constraint_ne_thermal_limit_from(pm, i; nw = n)
_PM.constraint_ne_thermal_limit_to(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :busdc)
_PMACDC.constraint_power_balance_dc_dcne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :busdc_ne)
_PMACDC.constraint_power_balance_dcne_dcne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branchdc)
_PMACDC.constraint_ohms_dc_branch(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branchdc_ne)
_PMACDC.constraint_ohms_dc_branch_ne(pm, i; nw = n)
_PMACDC.constraint_branch_limit_on_off(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :convdc)
_PMACDC.constraint_converter_losses(pm, i; nw = n)
_PMACDC.constraint_converter_current(pm, i; nw = n)
_PMACDC.constraint_conv_transformer(pm, i; nw = n)
_PMACDC.constraint_conv_reactor(pm, i; nw = n)
_PMACDC.constraint_conv_filter(pm, i; nw = n)
if _PM.ref(pm,n,:convdc,i,"islcc") == 1
_PMACDC.constraint_conv_firing_angle(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :convdc_ne)
_PMACDC.constraint_converter_losses_ne(pm, i; nw = n)
_PMACDC.constraint_converter_current_ne(pm, i; nw = n)
_PMACDC.constraint_converter_limit_on_off(pm, i; nw = n)
_PMACDC.constraint_conv_transformer_ne(pm, i; nw = n)
_PMACDC.constraint_conv_reactor_ne(pm, i; nw = n)
_PMACDC.constraint_conv_filter_ne(pm, i; nw = n)
if _PM.ref(pm,n,:convdc_ne,i,"islcc") == 1
_PMACDC.constraint_conv_firing_angle_ne(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :flex_load)
constraint_total_flexible_demand(pm, i; nw = n)
constraint_flex_bounds_ne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :fixed_load)
constraint_total_fixed_demand(pm, i; nw = n)
end
for i in _PM.ids(pm, :storage, nw=n)
_PM.constraint_storage_thermal_limit(pm, i, nw = n)
_PM.constraint_storage_losses(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw=n)
constraint_storage_thermal_limit_ne(pm, i, nw = n)
constraint_storage_losses_ne(pm, i, nw = n)
constraint_storage_bounds_ne(pm, i, nw = n)
end
if is_first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, nw = n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, nw = n)
end
else
if is_last_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state_final(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_final_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_shift_balance_periodic(pm, i, get(pm.setting, "demand_shifting_balance_period", 24), nw = n)
end
end
# From second hour to last hour
prev_n = prev_id(pm, n, :hour)
first_n = first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, prev_n, n)
end
end
# Constraints on investments
if investment && is_first_id(pm, n, :hour, :scenario)
# Inter-year constraints
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :branchdc_ne; nw = n)
constraint_ne_branchdc_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :convdc_ne; nw = n)
constraint_ne_converter_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :flex_load; nw = n)
constraint_flexible_demand_activation(pm, i, prev_nws, n)
end
end
end
end
"Builds distribution model."
function build_simple_stoch_flex_tnep(pm::_PM.AbstractBFModel; objective::Bool=true, investment::Bool=true, intertemporal_constraints::Bool=true)
for n in nw_ids(pm)
# AC Bus
_PM.variable_bus_voltage(pm; nw = n)
# AC branch
_PM.variable_branch_power(pm; nw = n)
_PM.variable_branch_current(pm; nw = n)
variable_oltc_branch_transform(pm; nw = n)
# Generator
_PM.variable_gen_power(pm; nw = n)
expression_gen_curtailment(pm; nw = n)
# Storage
_PM.variable_storage_power(pm; nw = n)
variable_absorbed_energy(pm; nw = n)
# Candidate AC branch
investment && variable_ne_branch_investment(pm; nw = n)
variable_ne_branch_indicator(pm; nw = n, relax = true) # FlexPlan version: replaces _PM.variable_ne_branch_indicator().
_PM.variable_ne_branch_power(pm; nw = n, bounded = false) # Bounds computed here would be too limiting in the case of ne_branches added in parallel
variable_ne_branch_current(pm; nw = n)
variable_oltc_ne_branch_transform(pm; nw = n)
# Candidate storage
variable_storage_power_ne(pm; nw = n, investment)
variable_absorbed_energy_ne(pm; nw = n)
# Flexible demand
variable_flexible_demand(pm; nw = n, investment)
end
if objective
objective_stoch_flex(pm; investment, operation=true)
end
for n in nw_ids(pm)
_PM.constraint_model_current(pm; nw = n)
constraint_ne_model_current(pm; nw = n)
if haskey(dim_prop(pm), :sub_nw)
constraint_td_coupling_power_reactive_bounds(pm, get(pm.setting, "qs_ratio_bound", 0.48); nw = n)
end
for i in _PM.ids(pm, n, :ref_buses)
_PM.constraint_theta_ref(pm, i, nw = n)
end
for i in _PM.ids(pm, n, :bus)
constraint_power_balance_acne_flex(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branch)
constraint_dist_branch_tnep(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :ne_branch)
constraint_dist_ne_branch_tnep(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :flex_load)
constraint_total_flexible_demand(pm, i; nw = n)
constraint_flex_bounds_ne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :fixed_load)
constraint_total_fixed_demand(pm, i; nw = n)
end
for i in _PM.ids(pm, :storage, nw=n)
_PM.constraint_storage_thermal_limit(pm, i, nw = n)
_PM.constraint_storage_losses(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw=n)
constraint_storage_thermal_limit_ne(pm, i, nw = n)
constraint_storage_losses_ne(pm, i, nw = n)
constraint_storage_bounds_ne(pm, i, nw = n)
end
if intertemporal_constraints
if is_first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, nw = n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, nw = n)
end
else
if is_last_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state_final(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_final_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_shift_balance_periodic(pm, i, get(pm.setting, "demand_shifting_balance_period", 24), nw = n)
end
end
# From second hour to last hour
prev_n = prev_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, prev_n, n)
end
end
end
# Constraints on investments
if investment && is_first_id(pm, n, :hour, :scenario)
for i in _PM.ids(pm, n, :branch)
if !isempty(ne_branch_ids(pm, i; nw = n))
constraint_branch_complementarity(pm, i; nw = n)
end
end
# Inter-year constraints
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :flex_load; nw = n)
constraint_flexible_demand_activation(pm, i, prev_nws, n)
end
end
end
end
"Builds combined transmission and distribution model."
function build_simple_stoch_flex_tnep(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractBFModel)
# Transmission variables and constraints
build_simple_stoch_flex_tnep(t_pm; objective = false)
# Distribution variables and constraints
build_simple_stoch_flex_tnep(d_pm; objective = false)
# Variables related to the combined model
# (No new variables are needed here.)
# Constraints related to the combined model
constraint_td_coupling(t_pm, d_pm)
# Objective function of the combined model
objective_stoch_flex(t_pm, d_pm)
end
"Main problem model in Benders decomposition, for transmission networks."
function build_simple_stoch_flex_tnep_benders_main(pm::_PM.AbstractPowerModel)
for n in nw_ids(pm; hour=1, scenario=1)
variable_ne_branch_indicator(pm; nw = n, relax=true)
variable_ne_branch_investment(pm; nw = n)
variable_ne_branchdc_indicator(pm; nw = n, relax=true)
variable_ne_branchdc_investment(pm; nw = n)
variable_ne_converter_indicator(pm; nw = n, relax=true)
variable_ne_converter_investment(pm; nw = n)
variable_storage_indicator(pm; nw = n, relax=true)
variable_storage_investment(pm; nw = n)
variable_flexible_demand_indicator(pm; nw = n, relax=true)
variable_flexible_demand_investment(pm; nw = n)
end
objective_stoch_flex(pm; investment=true, operation=false)
for n in nw_ids(pm; hour=1, scenario=1)
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :branchdc_ne; nw = n)
constraint_ne_branchdc_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :convdc_ne; nw = n)
constraint_ne_converter_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :flex_load; nw = n)
constraint_flexible_demand_activation(pm, i, prev_nws, n)
end
end
end
"Main problem model in Benders decomposition, for distribution networks."
function build_simple_stoch_flex_tnep_benders_main(pm::_PM.AbstractBFModel)
for n in nw_ids(pm; hour=1, scenario=1)
variable_ne_branch_indicator(pm; nw = n, relax=true)
variable_ne_branch_investment(pm; nw = n)
variable_storage_indicator(pm; nw = n, relax=true)
variable_storage_investment(pm; nw = n)
variable_flexible_demand_indicator(pm; nw = n, relax=true)
variable_flexible_demand_investment(pm; nw = n)
end
objective_stoch_flex(pm; investment=true, operation=false)
for n in nw_ids(pm; hour=1, scenario=1)
for i in _PM.ids(pm, n, :branch)
if !isempty(ne_branch_ids(pm, i; nw = n))
constraint_branch_complementarity(pm, i; nw = n) # Needed to avoid infeasibility in secondary problems
end
end
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :flex_load; nw = n)
constraint_flexible_demand_activation(pm, i, prev_nws, n)
end
end
end
"Secondary problem model in Benders decomposition - suitable for both transmission and distribution."
function build_simple_stoch_flex_tnep_benders_secondary(pm::_PM.AbstractPowerModel)
build_simple_stoch_flex_tnep(pm; investment=false)
end
"Multi-scenario TNEP with flexible loads and storage, for transmission networks"
function stoch_flex_tnep(data::Dict{String,Any}, model_type::Type, optimizer; kwargs...)
require_dim(data, :hour, :scenario, :year)
return _PM.solve_model(
data, model_type, optimizer, build_stoch_flex_tnep;
ref_extensions = [_PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
solution_processors = [_PM.sol_data_model!],
multinetwork = true,
kwargs...
)
end
"Multi-scenario TNEP with flexible loads and storage, for distribution networks"
function stoch_flex_tnep(data::Dict{String,Any}, model_type::Type{<:_PM.AbstractBFModel}, optimizer; kwargs...)
require_dim(data, :hour, :scenario, :year)
return _PM.solve_model(
data, model_type, optimizer, build_stoch_flex_tnep;
ref_extensions = [_PM.ref_add_on_off_va_bounds!, ref_add_ne_branch_allbranches!, ref_add_frb_branch!, ref_add_oltc_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
solution_processors = [_PM.sol_data_model!],
multinetwork = true,
kwargs...
)
end
"Multi-scenario TNEP with flexible loads and storage, combines transmission and distribution networks"
function stoch_flex_tnep(t_data::Dict{String,Any}, d_data::Vector{Dict{String,Any}}, t_model_type::Type{<:_PM.AbstractPowerModel}, d_model_type::Type{<:_PM.AbstractPowerModel}, optimizer; kwargs...)
require_dim(t_data, :hour, :scenario, :year)
for data in d_data
require_dim(data, :hour, :scenario, :year)
end
return solve_model(
t_data, d_data, t_model_type, d_model_type, optimizer, build_stoch_flex_tnep;
t_ref_extensions = [_PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
d_ref_extensions = [_PM.ref_add_on_off_va_bounds!, ref_add_ne_branch_allbranches!, ref_add_frb_branch!, ref_add_oltc_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!, ref_add_flex_load!],
t_solution_processors = [_PM.sol_data_model!],
d_solution_processors = [_PM.sol_data_model!, sol_td_coupling!],
kwargs...
)
end
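# Example usage of the entry points above (a hedged sketch: the `data` dict and the
# HiGHS optimizer are illustrative assumptions, not defined in this file). The input
# must be a multinetwork data dictionary carrying :hour, :scenario and :year dimensions:
#
#     import HiGHS
#     result = stoch_flex_tnep(data, _PM.DCPPowerModel, HiGHS.Optimizer)
#
# Check `result["termination_status"]` before reading `result["solution"]`.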
# Here the problem is defined and then passed to the solver.
# It consists essentially of declaring the variables and constraints of the optimization problem.
"Builds transmission model."
function build_stoch_flex_tnep(pm::_PM.AbstractPowerModel; objective::Bool=true, investment::Bool=true)
    # VARIABLES: variables defined within PowerModels(ACDC) can be used directly; other variables need to be defined in the corresponding sections of the code
for n in nw_ids(pm)
# AC Bus
_PM.variable_bus_voltage(pm; nw = n)
# AC branch
_PM.variable_branch_power(pm; nw = n)
# DC bus
_PMACDC.variable_dcgrid_voltage_magnitude(pm; nw = n)
# DC branch
_PMACDC.variable_active_dcbranch_flow(pm; nw = n)
_PMACDC.variable_dcbranch_current(pm; nw = n)
# AC-DC converter
_PMACDC.variable_dc_converter(pm; nw = n)
# Generator
_PM.variable_gen_power(pm; nw = n)
expression_gen_curtailment(pm; nw = n)
# Storage
_PM.variable_storage_power(pm; nw = n)
variable_absorbed_energy(pm; nw = n)
# Candidate AC branch
investment && variable_ne_branch_investment(pm; nw = n)
variable_ne_branch_indicator(pm; nw = n, relax=true) # FlexPlan version: replaces _PM.variable_ne_branch_indicator().
_PM.variable_ne_branch_power(pm; nw = n)
_PM.variable_ne_branch_voltage(pm; nw = n)
# Candidate DC bus
_PMACDC.variable_dcgrid_voltage_magnitude_ne(pm; nw = n)
# Candidate DC branch
investment && variable_ne_branchdc_investment(pm; nw = n)
variable_ne_branchdc_indicator(pm; nw = n, relax=true) # FlexPlan version: replaces _PMACDC.variable_branch_ne().
_PMACDC.variable_active_dcbranch_flow_ne(pm; nw = n)
_PMACDC.variable_dcbranch_current_ne(pm; nw = n)
# Candidate AC-DC converter
variable_dc_converter_ne(pm; nw = n, investment) # FlexPlan version: replaces _PMACDC.variable_dc_converter_ne().
_PMACDC.variable_voltage_slack(pm; nw = n)
# Candidate storage
variable_storage_power_ne(pm; nw = n, investment)
variable_absorbed_energy_ne(pm; nw = n)
# Flexible demand
variable_flexible_demand(pm; nw = n, investment)
variable_energy_not_consumed(pm; nw = n)
variable_total_demand_shifting_upwards(pm; nw = n)
variable_total_demand_shifting_downwards(pm; nw = n)
end
# OBJECTIVE: see objective.jl
if objective
objective_stoch_flex(pm; investment, operation=true)
end
    # CONSTRAINTS: constraints defined within PowerModels(ACDC) can be used directly; other constraints need to be defined in the corresponding sections of the code
for n in nw_ids(pm)
_PM.constraint_model_voltage(pm; nw = n)
_PM.constraint_ne_model_voltage(pm; nw = n)
_PMACDC.constraint_voltage_dc(pm; nw = n)
_PMACDC.constraint_voltage_dc_ne(pm; nw = n)
for i in _PM.ids(pm, n, :ref_buses)
_PM.constraint_theta_ref(pm, i, nw = n)
end
for i in _PM.ids(pm, n, :bus)
constraint_power_balance_acne_dcne_flex(pm, i; nw = n)
end
if haskey(pm.setting, "allow_line_replacement") && pm.setting["allow_line_replacement"] == true
for i in _PM.ids(pm, n, :branch)
constraint_ohms_yt_from_repl(pm, i; nw = n)
constraint_ohms_yt_to_repl(pm, i; nw = n)
constraint_voltage_angle_difference_repl(pm, i; nw = n)
constraint_thermal_limit_from_repl(pm, i; nw = n)
constraint_thermal_limit_to_repl(pm, i; nw = n)
end
else
for i in _PM.ids(pm, n, :branch)
_PM.constraint_ohms_yt_from(pm, i; nw = n)
_PM.constraint_ohms_yt_to(pm, i; nw = n)
_PM.constraint_voltage_angle_difference(pm, i; nw = n)
_PM.constraint_thermal_limit_from(pm, i; nw = n)
_PM.constraint_thermal_limit_to(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :ne_branch)
_PM.constraint_ne_ohms_yt_from(pm, i; nw = n)
_PM.constraint_ne_ohms_yt_to(pm, i; nw = n)
_PM.constraint_ne_voltage_angle_difference(pm, i; nw = n)
_PM.constraint_ne_thermal_limit_from(pm, i; nw = n)
_PM.constraint_ne_thermal_limit_to(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :busdc)
_PMACDC.constraint_power_balance_dc_dcne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :busdc_ne)
_PMACDC.constraint_power_balance_dcne_dcne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branchdc)
_PMACDC.constraint_ohms_dc_branch(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branchdc_ne)
_PMACDC.constraint_ohms_dc_branch_ne(pm, i; nw = n)
_PMACDC.constraint_branch_limit_on_off(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :convdc)
_PMACDC.constraint_converter_losses(pm, i; nw = n)
_PMACDC.constraint_converter_current(pm, i; nw = n)
_PMACDC.constraint_conv_transformer(pm, i; nw = n)
_PMACDC.constraint_conv_reactor(pm, i; nw = n)
_PMACDC.constraint_conv_filter(pm, i; nw = n)
if _PM.ref(pm,n,:convdc,i,"islcc") == 1
_PMACDC.constraint_conv_firing_angle(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :convdc_ne)
_PMACDC.constraint_converter_losses_ne(pm, i; nw = n)
_PMACDC.constraint_converter_current_ne(pm, i; nw = n)
_PMACDC.constraint_converter_limit_on_off(pm, i; nw = n)
_PMACDC.constraint_conv_transformer_ne(pm, i; nw = n)
_PMACDC.constraint_conv_reactor_ne(pm, i; nw = n)
_PMACDC.constraint_conv_filter_ne(pm, i; nw = n)
if _PM.ref(pm,n,:convdc_ne,i,"islcc") == 1
_PMACDC.constraint_conv_firing_angle_ne(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :flex_load)
constraint_total_flexible_demand(pm, i; nw = n)
constraint_flex_bounds_ne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :fixed_load)
constraint_total_fixed_demand(pm, i; nw = n)
end
for i in _PM.ids(pm, :storage, nw=n)
constraint_storage_excl_slack(pm, i, nw = n)
_PM.constraint_storage_thermal_limit(pm, i, nw = n)
_PM.constraint_storage_losses(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw=n)
constraint_storage_excl_slack_ne(pm, i, nw = n)
constraint_storage_thermal_limit_ne(pm, i, nw = n)
constraint_storage_losses_ne(pm, i, nw = n)
constraint_storage_bounds_ne(pm, i, nw = n)
end
if is_first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, nw = n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_red_state(pm, i, nw = n)
constraint_shift_up_state(pm, i, nw = n)
constraint_shift_down_state(pm, i, nw = n)
end
else
if is_last_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state_final(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_final_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_shift_state_final(pm, i, nw = n)
end
end
# From second hour to last hour
prev_n = prev_id(pm, n, :hour)
first_n = first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_red_state(pm, i, prev_n, n)
constraint_shift_up_state(pm, i, prev_n, n)
constraint_shift_down_state(pm, i, prev_n, n)
constraint_shift_duration(pm, i, first_n, n)
end
end
# Constraints on investments
if investment && is_first_id(pm, n, :hour, :scenario)
# Inter-year constraints
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :branchdc_ne; nw = n)
constraint_ne_branchdc_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :convdc_ne; nw = n)
constraint_ne_converter_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :flex_load; nw = n)
constraint_flexible_demand_activation(pm, i, prev_nws, n)
end
end
end
end
"Builds distribution model."
function build_stoch_flex_tnep(pm::_PM.AbstractBFModel; objective::Bool=true, investment::Bool=true, intertemporal_constraints::Bool=true)
for n in nw_ids(pm)
# AC Bus
_PM.variable_bus_voltage(pm; nw = n)
# AC branch
_PM.variable_branch_power(pm; nw = n)
_PM.variable_branch_current(pm; nw = n)
variable_oltc_branch_transform(pm; nw = n)
# Generator
_PM.variable_gen_power(pm; nw = n)
expression_gen_curtailment(pm; nw = n)
# Storage
_PM.variable_storage_power(pm; nw = n)
variable_absorbed_energy(pm; nw = n)
# Candidate AC branch
investment && variable_ne_branch_investment(pm; nw = n)
variable_ne_branch_indicator(pm; nw = n, relax = true) # FlexPlan version: replaces _PM.variable_ne_branch_indicator().
_PM.variable_ne_branch_power(pm; nw = n, bounded = false) # Bounds computed here would be too limiting in the case of ne_branches added in parallel
variable_ne_branch_current(pm; nw = n)
variable_oltc_ne_branch_transform(pm; nw = n)
# Candidate storage
variable_storage_power_ne(pm; nw = n, investment)
variable_absorbed_energy_ne(pm; nw = n)
# Flexible demand
variable_flexible_demand(pm; nw = n, investment)
variable_energy_not_consumed(pm; nw = n)
variable_total_demand_shifting_upwards(pm; nw = n)
variable_total_demand_shifting_downwards(pm; nw = n)
end
if objective
objective_stoch_flex(pm; investment, operation=true)
end
for n in nw_ids(pm)
_PM.constraint_model_current(pm; nw = n)
constraint_ne_model_current(pm; nw = n)
if haskey(dim_prop(pm), :sub_nw)
constraint_td_coupling_power_reactive_bounds(pm, get(pm.setting, "qs_ratio_bound", 0.48); nw = n)
end
for i in _PM.ids(pm, n, :ref_buses)
_PM.constraint_theta_ref(pm, i, nw = n)
end
for i in _PM.ids(pm, n, :bus)
constraint_power_balance_acne_flex(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branch)
constraint_dist_branch_tnep(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :ne_branch)
constraint_dist_ne_branch_tnep(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :flex_load)
constraint_total_flexible_demand(pm, i; nw = n)
constraint_flex_bounds_ne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :fixed_load)
constraint_total_fixed_demand(pm, i; nw = n)
end
for i in _PM.ids(pm, :storage, nw=n)
constraint_storage_excl_slack(pm, i, nw = n)
_PM.constraint_storage_thermal_limit(pm, i, nw = n)
_PM.constraint_storage_losses(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw=n)
constraint_storage_excl_slack_ne(pm, i, nw = n)
constraint_storage_thermal_limit_ne(pm, i, nw = n)
constraint_storage_losses_ne(pm, i, nw = n)
constraint_storage_bounds_ne(pm, i, nw = n)
end
if intertemporal_constraints
if is_first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, nw = n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_red_state(pm, i, nw = n)
constraint_shift_up_state(pm, i, nw = n)
constraint_shift_down_state(pm, i, nw = n)
end
else
if is_last_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state_final(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_final_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_shift_state_final(pm, i, nw = n)
end
end
# From second hour to last hour
prev_n = prev_id(pm, n, :hour)
first_n = first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :flex_load, nw = n)
constraint_red_state(pm, i, prev_n, n)
constraint_shift_up_state(pm, i, prev_n, n)
constraint_shift_down_state(pm, i, prev_n, n)
constraint_shift_duration(pm, i, first_n, n)
end
end
end
# Constraints on investments
if investment && is_first_id(pm, n, :hour, :scenario)
for i in _PM.ids(pm, n, :branch)
if !isempty(ne_branch_ids(pm, i; nw = n))
constraint_branch_complementarity(pm, i; nw = n)
end
end
# Inter-year constraints
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :flex_load; nw = n)
constraint_flexible_demand_activation(pm, i, prev_nws, n)
end
end
end
end
"Builds combined transmission and distribution model."
function build_stoch_flex_tnep(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractBFModel)
# Transmission variables and constraints
build_stoch_flex_tnep(t_pm; objective = false)
# Distribution variables and constraints
build_stoch_flex_tnep(d_pm; objective = false)
# Variables related to the combined model
# (No new variables are needed here.)
# Constraints related to the combined model
constraint_td_coupling(t_pm, d_pm)
# Objective function of the combined model
objective_stoch_flex(t_pm, d_pm)
end
"Main problem model in Benders decomposition, for transmission networks."
function build_stoch_flex_tnep_benders_main(pm::_PM.AbstractPowerModel)
for n in nw_ids(pm; hour=1, scenario=1)
variable_ne_branch_indicator(pm; nw = n, relax=true)
variable_ne_branch_investment(pm; nw = n)
variable_ne_branchdc_indicator(pm; nw = n, relax=true)
variable_ne_branchdc_investment(pm; nw = n)
variable_ne_converter_indicator(pm; nw = n, relax=true)
variable_ne_converter_investment(pm; nw = n)
variable_storage_indicator(pm; nw = n, relax=true)
variable_storage_investment(pm; nw = n)
variable_flexible_demand_indicator(pm; nw = n, relax=true)
variable_flexible_demand_investment(pm; nw = n)
end
objective_stoch_flex(pm; investment=true, operation=false)
for n in nw_ids(pm; hour=1, scenario=1)
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :branchdc_ne; nw = n)
constraint_ne_branchdc_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :convdc_ne; nw = n)
constraint_ne_converter_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :flex_load; nw = n)
constraint_flexible_demand_activation(pm, i, prev_nws, n)
end
end
end
"Main problem model in Benders decomposition, for distribution networks."
function build_stoch_flex_tnep_benders_main(pm::_PM.AbstractBFModel)
for n in nw_ids(pm; hour=1, scenario=1)
variable_ne_branch_indicator(pm; nw = n, relax=true)
variable_ne_branch_investment(pm; nw = n)
variable_storage_indicator(pm; nw = n, relax=true)
variable_storage_investment(pm; nw = n)
variable_flexible_demand_indicator(pm; nw = n, relax=true)
variable_flexible_demand_investment(pm; nw = n)
end
objective_stoch_flex(pm; investment=true, operation=false)
for n in nw_ids(pm; hour=1, scenario=1)
for i in _PM.ids(pm, n, :branch)
if !isempty(ne_branch_ids(pm, i; nw = n))
constraint_branch_complementarity(pm, i; nw = n) # Needed to avoid infeasibility in secondary problems
end
end
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :flex_load; nw = n)
constraint_flexible_demand_activation(pm, i, prev_nws, n)
end
end
end
"Secondary problem model in Benders decomposition - suitable for both transmission and distribution."
function build_stoch_flex_tnep_benders_secondary(pm::_PM.AbstractPowerModel)
build_stoch_flex_tnep(pm; investment=false)
end
"TNEP with storage, for transmission networks"
function strg_tnep(data::Dict{String,Any}, model_type::Type, optimizer; kwargs...)
require_dim(data, :hour, :year)
return _PM.solve_model(
data, model_type, optimizer, build_strg_tnep;
ref_extensions = [_PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!],
solution_processors = [_PM.sol_data_model!],
multinetwork = true,
kwargs...
)
end
"TNEP with storage, for distribution networks"
function strg_tnep(data::Dict{String,Any}, model_type::Type{<:_PM.AbstractBFModel}, optimizer; kwargs...)
require_dim(data, :hour, :year)
return _PM.solve_model(
data, model_type, optimizer, build_strg_tnep;
ref_extensions = [_PM.ref_add_on_off_va_bounds!, ref_add_ne_branch_allbranches!, ref_add_frb_branch!, ref_add_oltc_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!],
solution_processors = [_PM.sol_data_model!],
multinetwork = true,
kwargs...
)
end
"TNEP with storage, combines transmission and distribution networks"
function strg_tnep(t_data::Dict{String,Any}, d_data::Vector{Dict{String,Any}}, t_model_type::Type{<:_PM.AbstractPowerModel}, d_model_type::Type{<:_PM.AbstractPowerModel}, optimizer; kwargs...)
require_dim(t_data, :hour, :year)
for data in d_data
require_dim(data, :hour, :year)
end
return solve_model(
t_data, d_data, t_model_type, d_model_type, optimizer, build_strg_tnep;
t_ref_extensions = [_PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!],
d_ref_extensions = [_PM.ref_add_on_off_va_bounds!, ref_add_ne_branch_allbranches!, ref_add_frb_branch!, ref_add_oltc_branch!, ref_add_gen!, ref_add_storage!, ref_add_ne_storage!],
t_solution_processors = [_PM.sol_data_model!],
d_solution_processors = [_PM.sol_data_model!, sol_td_coupling!],
kwargs...
)
end
# Here the problem is defined and then passed to the solver.
# It consists essentially of declaring the variables and constraints of the optimization problem.
"Builds transmission model."
function build_strg_tnep(pm::_PM.AbstractPowerModel; objective::Bool=true)
    # VARIABLES: variables defined within PowerModels(ACDC) can be used directly; other variables need to be defined in the corresponding sections of the code
for n in nw_ids(pm)
# AC Bus
_PM.variable_bus_voltage(pm; nw = n)
# AC branch
_PM.variable_branch_power(pm; nw = n)
# DC bus
_PMACDC.variable_dcgrid_voltage_magnitude(pm; nw = n)
# DC branch
_PMACDC.variable_active_dcbranch_flow(pm; nw = n)
_PMACDC.variable_dcbranch_current(pm; nw = n)
# AC-DC converter
_PMACDC.variable_dc_converter(pm; nw = n)
# Generator
_PM.variable_gen_power(pm; nw = n)
expression_gen_curtailment(pm; nw = n)
# Storage
_PM.variable_storage_power(pm; nw = n)
variable_absorbed_energy(pm; nw = n)
# Candidate AC branch
variable_ne_branch_investment(pm; nw = n)
variable_ne_branch_indicator(pm; nw = n, relax=true) # FlexPlan version: replaces _PM.variable_ne_branch_indicator().
_PM.variable_ne_branch_power(pm; nw = n)
_PM.variable_ne_branch_voltage(pm; nw = n)
# Candidate DC bus
_PMACDC.variable_dcgrid_voltage_magnitude_ne(pm; nw = n)
# Candidate DC branch
variable_ne_branchdc_investment(pm; nw = n)
variable_ne_branchdc_indicator(pm; nw = n, relax=true) # FlexPlan version: replaces _PMACDC.variable_branch_ne().
_PMACDC.variable_active_dcbranch_flow_ne(pm; nw = n)
_PMACDC.variable_dcbranch_current_ne(pm; nw = n)
# Candidate AC-DC converter
variable_dc_converter_ne(pm; nw = n) # FlexPlan version: replaces _PMACDC.variable_dc_converter_ne().
_PMACDC.variable_voltage_slack(pm; nw = n)
# Candidate storage
variable_storage_power_ne(pm; nw = n)
variable_absorbed_energy_ne(pm; nw = n)
end
# OBJECTIVE: see objective.jl
if objective
objective_min_cost_storage(pm)
end
    # CONSTRAINTS: constraints defined within PowerModels(ACDC) can be used directly; other constraints need to be defined in the corresponding sections of the code
for n in nw_ids(pm)
_PM.constraint_model_voltage(pm; nw = n)
_PM.constraint_ne_model_voltage(pm; nw = n)
_PMACDC.constraint_voltage_dc(pm; nw = n)
_PMACDC.constraint_voltage_dc_ne(pm; nw = n)
for i in _PM.ids(pm, n, :ref_buses)
_PM.constraint_theta_ref(pm, i, nw = n)
end
for i in _PM.ids(pm, n, :bus)
constraint_power_balance_acne_dcne_strg(pm, i; nw = n)
end
if haskey(pm.setting, "allow_line_replacement") && pm.setting["allow_line_replacement"] == true
for i in _PM.ids(pm, n, :branch)
constraint_ohms_yt_from_repl(pm, i; nw = n)
constraint_ohms_yt_to_repl(pm, i; nw = n)
constraint_voltage_angle_difference_repl(pm, i; nw = n)
constraint_thermal_limit_from_repl(pm, i; nw = n)
constraint_thermal_limit_to_repl(pm, i; nw = n)
end
else
for i in _PM.ids(pm, n, :branch)
_PM.constraint_ohms_yt_from(pm, i; nw = n)
_PM.constraint_ohms_yt_to(pm, i; nw = n)
_PM.constraint_voltage_angle_difference(pm, i; nw = n)
_PM.constraint_thermal_limit_from(pm, i; nw = n)
_PM.constraint_thermal_limit_to(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :ne_branch)
_PM.constraint_ne_ohms_yt_from(pm, i; nw = n)
_PM.constraint_ne_ohms_yt_to(pm, i; nw = n)
_PM.constraint_ne_voltage_angle_difference(pm, i; nw = n)
_PM.constraint_ne_thermal_limit_from(pm, i; nw = n)
_PM.constraint_ne_thermal_limit_to(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :busdc)
_PMACDC.constraint_power_balance_dc_dcne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :busdc_ne)
_PMACDC.constraint_power_balance_dcne_dcne(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branchdc)
_PMACDC.constraint_ohms_dc_branch(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branchdc_ne)
_PMACDC.constraint_ohms_dc_branch_ne(pm, i; nw = n)
_PMACDC.constraint_branch_limit_on_off(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :convdc)
_PMACDC.constraint_converter_losses(pm, i; nw = n)
_PMACDC.constraint_converter_current(pm, i; nw = n)
_PMACDC.constraint_conv_transformer(pm, i; nw = n)
_PMACDC.constraint_conv_reactor(pm, i; nw = n)
_PMACDC.constraint_conv_filter(pm, i; nw = n)
if _PM.ref(pm,n,:convdc,i,"islcc") == 1
_PMACDC.constraint_conv_firing_angle(pm, i; nw = n)
end
end
for i in _PM.ids(pm, n, :convdc_ne)
_PMACDC.constraint_converter_losses_ne(pm, i; nw = n)
_PMACDC.constraint_converter_current_ne(pm, i; nw = n)
_PMACDC.constraint_converter_limit_on_off(pm, i; nw = n)
_PMACDC.constraint_conv_transformer_ne(pm, i; nw = n)
_PMACDC.constraint_conv_reactor_ne(pm, i; nw = n)
_PMACDC.constraint_conv_filter_ne(pm, i; nw = n)
if _PM.ref(pm,n,:convdc_ne,i,"islcc") == 1
_PMACDC.constraint_conv_firing_angle_ne(pm, i; nw = n)
end
end
for i in _PM.ids(pm, :storage, nw=n)
constraint_storage_excl_slack(pm, i, nw = n)
_PM.constraint_storage_thermal_limit(pm, i, nw = n)
_PM.constraint_storage_losses(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw=n)
constraint_storage_excl_slack_ne(pm, i, nw = n)
constraint_storage_thermal_limit_ne(pm, i, nw = n)
constraint_storage_losses_ne(pm, i, nw = n)
constraint_storage_bounds_ne(pm, i, nw = n)
end
if is_first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, nw = n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, nw = n)
end
else
if is_last_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state_final(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_final_ne(pm, i, nw = n)
end
end
# From second hour to last hour
prev_n = prev_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, prev_n, n)
end
end
# Constraints on investments
if is_first_id(pm, n, :hour)
# Inter-year constraints
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :branchdc_ne; nw = n)
constraint_ne_branchdc_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :convdc_ne; nw = n)
constraint_ne_converter_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
end
end
end
"Builds distribution model."
function build_strg_tnep(pm::_PM.AbstractBFModel; objective::Bool=true, intertemporal_constraints::Bool=true)
for n in nw_ids(pm)
# AC Bus
_PM.variable_bus_voltage(pm; nw = n)
# AC branch
_PM.variable_branch_power(pm; nw = n)
_PM.variable_branch_current(pm; nw = n)
variable_oltc_branch_transform(pm; nw = n)
# Generator
_PM.variable_gen_power(pm; nw = n)
expression_gen_curtailment(pm; nw = n)
# Storage
_PM.variable_storage_power(pm; nw = n)
variable_absorbed_energy(pm; nw = n)
# Candidate AC branch
variable_ne_branch_investment(pm; nw = n)
variable_ne_branch_indicator(pm; nw = n, relax=true) # FlexPlan version: replaces _PM.variable_ne_branch_indicator().
_PM.variable_ne_branch_power(pm; nw = n, bounded = false) # Bounds computed here would be too limiting in the case of ne_branches added in parallel
variable_ne_branch_current(pm; nw = n)
variable_oltc_ne_branch_transform(pm; nw = n)
# Candidate storage
variable_storage_power_ne(pm; nw = n)
variable_absorbed_energy_ne(pm; nw = n)
end
if objective
objective_min_cost_storage(pm)
end
for n in nw_ids(pm)
_PM.constraint_model_current(pm; nw = n)
constraint_ne_model_current(pm; nw = n)
if haskey(dim_prop(pm), :sub_nw)
constraint_td_coupling_power_reactive_bounds(pm, get(pm.setting, "qs_ratio_bound", 0.48); nw = n)
end
for i in _PM.ids(pm, n, :ref_buses)
_PM.constraint_theta_ref(pm, i, nw = n)
end
for i in _PM.ids(pm, n, :bus)
constraint_power_balance_acne_strg(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :branch)
constraint_dist_branch_tnep(pm, i; nw = n)
end
for i in _PM.ids(pm, n, :ne_branch)
constraint_dist_ne_branch_tnep(pm, i; nw = n)
end
for i in _PM.ids(pm, :storage, nw=n)
constraint_storage_excl_slack(pm, i, nw = n)
_PM.constraint_storage_thermal_limit(pm, i, nw = n)
_PM.constraint_storage_losses(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw=n)
constraint_storage_excl_slack_ne(pm, i, nw = n)
constraint_storage_thermal_limit_ne(pm, i, nw = n)
constraint_storage_losses_ne(pm, i, nw = n)
constraint_storage_bounds_ne(pm, i, nw = n)
end
if intertemporal_constraints
if is_first_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, nw = n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, nw = n)
end
else
if is_last_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state_final(pm, i, nw = n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_final_ne(pm, i, nw = n)
end
end
# From second hour to last hour
prev_n = prev_id(pm, n, :hour)
for i in _PM.ids(pm, :storage, nw = n)
constraint_storage_state(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :storage_bounded_absorption, nw = n)
constraint_maximum_absorption(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage, nw = n)
constraint_storage_state_ne(pm, i, prev_n, n)
end
for i in _PM.ids(pm, :ne_storage_bounded_absorption, nw = n)
constraint_maximum_absorption_ne(pm, i, prev_n, n)
end
end
end
# Constraints on investments
if is_first_id(pm, n, :hour)
for i in _PM.ids(pm, n, :branch)
if !isempty(ne_branch_ids(pm, i; nw = n))
constraint_branch_complementarity(pm, i; nw = n)
end
end
# Inter-year constraints
prev_nws = prev_ids(pm, n, :year)
for i in _PM.ids(pm, :ne_branch; nw = n)
constraint_ne_branch_activation(pm, i, prev_nws, n)
end
for i in _PM.ids(pm, :ne_storage; nw = n)
constraint_ne_storage_activation(pm, i, prev_nws, n)
end
end
end
end
"Builds combined transmission and distribution model."
function build_strg_tnep(t_pm::_PM.AbstractPowerModel, d_pm::_PM.AbstractBFModel)
# Transmission variables and constraints
build_strg_tnep(t_pm; objective = false)
# Distribution variables and constraints
build_strg_tnep(d_pm; objective = false)
# Variables related to the combined model
# (No new variables are needed here.)
# Constraints related to the combined model
constraint_td_coupling(t_pm, d_pm)
# Objective function of the combined model
objective_min_cost_storage(t_pm, d_pm)
end
# Source: FlexPlan.jl v0.4.0 (BSD-3-Clause), https://github.com/Electa-Git/FlexPlan.jl.git
"""
run_td_decoupling(t_data, d_data, t_model_type, d_model_type, t_optimizer, d_optimizer, build_method; <keyword_arguments>)
Solve the planning of a transmission and distribution (T&D) system by decoupling the grid levels.
The T&D decoupling procedure is aimed at reducing computation time with respect to the
combined T&D model by solving the transmission and distribution parts of the network
separately.
It consists of the following steps:
1. compute a surrogate model of distribution networks;
2. optimize planning of transmission network using surrogate distribution networks;
3. fix power exchanges between T&D and optimize planning of distribution networks.
The procedure introduces approximations; therefore, the solution cost is higher than that
of the combined T&D model.
# Arguments
- `t_data::Dict{String,Any}`: data dictionary for transmission network.
- `d_data::Vector{Dict{String,Any}}`: vector of data dictionaries, one for each distribution
network. Each data dictionary must have a `t_bus` key indicating the transmission network
bus id to which the distribution network is to be connected.
- `t_model_type::Type{<:PowerModels.AbstractPowerModel}`.
- `d_model_type::Type{<:PowerModels.AbstractPowerModel}`.
- `t_optimizer::Union{JuMP.MOI.AbstractOptimizer,JuMP.MOI.OptimizerWithAttributes}`:
optimizer for transmission network. It has to solve a MILP problem and can exploit
multi-threading.
- `d_optimizer::Union{JuMP.MOI.AbstractOptimizer,JuMP.MOI.OptimizerWithAttributes}`:
optimizer for distribution networks. It has to solve 2 MILP and 4 LP problems per
distribution network; since multi-threading is used to run optimizations of different
distribution networks in parallel, it is better for this optimizer to be single-threaded.
- `build_method::Function`.
- `t_ref_extensions::Vector{<:Function} = Function[]`.
- `d_ref_extensions::Vector{<:Function} = Function[]`.
- `t_solution_processors::Vector{<:Function} = Function[]`.
- `d_solution_processors::Vector{<:Function} = Function[]`.
- `t_setting::Dict{String,<:Any} = Dict{String,Any}()`.
- `d_setting::Dict{String,<:Any} = Dict{String,Any}()`.
- `direct_model = false`: whether to construct JuMP models using `JuMP.direct_model()`
instead of `JuMP.Model()`. Note that `JuMP.direct_model` is only supported by some
solvers.
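# Examples
A sketch of a typical call. The data dictionaries, model types, optimizers and build
method shown here are illustrative assumptions, not values prescribed by this function:
```julia
import PowerModels as _PM
import HiGHS, JuMP

result = run_td_decoupling(
    t_data,                              # transmission multinetwork data dict
    d_data,                              # distribution data dicts, each with a "t_bus" key
    _PM.DCPPowerModel,                   # transmission formulation (assumed)
    BFARadPowerModel,                    # distribution formulation (assumed)
    JuMP.optimizer_with_attributes(HiGHS.Optimizer),                 # MILP-capable
    JuMP.optimizer_with_attributes(HiGHS.Optimizer, "threads" => 1), # single-threaded
    build_simple_stoch_flex_tnep         # build method (hypothetical name)
)
```
The returned dictionary contains the `"t_solution"`, `"d_solution"`, `"t_objective"`,
`"d_objective"`, `"objective"` and `"solve_time"` entries.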
"""
function run_td_decoupling(
t_data::Dict{String,Any},
d_data::Vector{Dict{String,Any}},
t_model_type::Type{<:_PM.AbstractPowerModel},
d_model_type::Type{<:_PM.AbstractPowerModel},
t_optimizer::Union{JuMP.MOI.AbstractOptimizer, JuMP.MOI.OptimizerWithAttributes},
d_optimizer::Union{JuMP.MOI.AbstractOptimizer, JuMP.MOI.OptimizerWithAttributes},
build_method::Function;
t_ref_extensions::Vector{<:Function} = Function[],
d_ref_extensions::Vector{<:Function} = Function[],
t_solution_processors::Vector{<:Function} = Function[],
d_solution_processors::Vector{<:Function} = Function[],
t_setting::Dict{String,<:Any} = Dict{String,Any}(),
d_setting::Dict{String,<:Any} = Dict{String,Any}(),
direct_model = false,
)
start_time = time()
nw_id_set = Set(id for (id,nw) in t_data["nw"])
d_data = deepcopy(d_data)
number_of_distribution_networks = length(d_data)
# Data preparation and checks
for s in 1:number_of_distribution_networks
data = d_data[s]
d_gen_id = _FP._get_reference_gen(data)
_FP.add_dimension!(data, :sub_nw, Dict(1 => Dict{String,Any}("t_bus"=>data["t_bus"], "d_gen"=>d_gen_id)))
delete!(data, "t_bus")
# Check that transmission and distribution network ids are the same
if Set(id for (id,nw) in data["nw"]) ≠ nw_id_set
Memento.error(_LOGGER, "Networks in transmission and distribution data dictionaries must have the same IDs.")
end
# Warn if the cost of energy exchanged between transmission and distribution network is nonpositive or undefined.
for (n,nw) in data["nw"]
d_gen = nw["gen"]["$d_gen_id"]
if d_gen["ncost"] < 2 || d_gen["cost"][end-1] ≤ 0.0
Memento.warn(_LOGGER, "Nonpositive cost detected for energy exchanged between transmission and distribution network $s. This may result in excessive usage of storage devices.")
break
end
end
# Notify if any storage devices have zero self-discharge rate.
raise_warning = false
for n in _FP.nw_ids(data; hour=1, scenario=1)
for (st,storage) in get(data["nw"]["$n"], "storage", Dict())
if storage["self_discharge_rate"] == 0.0
raise_warning = true
break
end
end
for (st,storage) in get(data["nw"]["$n"], "ne_storage", Dict())
if storage["self_discharge_rate"] == 0.0
raise_warning = true
break
end
end
if raise_warning
Memento.notice(_LOGGER, "Zero self-discharge rate detected for a storage device in distribution network $s. The model may have multiple optimal solutions.")
break
end
end
end
surrogate_distribution = Vector{Dict{String,Any}}(undef, number_of_distribution_networks)
surrogate_components = Vector{Dict{String,Any}}(undef, number_of_distribution_networks)
exchanged_power = Vector{Dict{String,Float64}}(undef, number_of_distribution_networks)
d_result = Vector{Dict{String,Any}}(undef, number_of_distribution_networks)
# Compute surrogate models of distribution networks and attach them to transmission network
start_time_surr = time()
Threads.@threads for s in 1:number_of_distribution_networks
Memento.trace(_LOGGER, "computing surrogate model $s of $number_of_distribution_networks...")
sol_up, sol_base, sol_down = probe_distribution_flexibility!(d_data[s]; model_type=d_model_type, optimizer=d_optimizer, build_method, ref_extensions=d_ref_extensions, solution_processors=d_solution_processors, setting=d_setting, direct_model)
surrogate_distribution[s] = calc_surrogate_model(d_data[s], sol_up, sol_base, sol_down)
end
for s in 1:number_of_distribution_networks
surrogate_components[s] = attach_surrogate_distribution!(t_data, surrogate_distribution[s])
end
Memento.debug(_LOGGER, "surrogate models of $number_of_distribution_networks distribution networks computed in $(round(time()-start_time_surr; sigdigits=3)) seconds")
# Compute planning of transmission network
start_time_t = time()
t_result = run_td_decoupling_model(t_data; model_type=t_model_type, optimizer=t_optimizer, build_method, ref_extensions=t_ref_extensions, solution_processors=t_solution_processors, setting=t_setting, return_solution=false, direct_model)
t_sol = t_result["solution"]
t_objective = calc_t_objective(t_result, t_data, surrogate_components)
for s in 1:number_of_distribution_networks
exchanged_power[s] = calc_exchanged_power(surrogate_components[s], t_sol)
remove_attached_distribution!(t_sol, t_data, surrogate_components[s])
end
Memento.debug(_LOGGER, "planning of transmission network computed in $(round(time()-start_time_t; sigdigits=3)) seconds")
# Compute planning of distribution networks
start_time_d = time()
Threads.@threads for s in 1:number_of_distribution_networks
Memento.trace(_LOGGER, "planning distribution network $s of $number_of_distribution_networks...")
apply_td_coupling_power_active_with_zero_cost!(d_data[s], t_data, exchanged_power[s])
d_result[s] = run_td_decoupling_model(d_data[s]; model_type=d_model_type, optimizer=d_optimizer, build_method, ref_extensions=d_ref_extensions, solution_processors=d_solution_processors, setting=d_setting, return_solution=false, direct_model)
end
d_objective = [d_res["objective"] for d_res in d_result]
Memento.debug(_LOGGER, "planning of $number_of_distribution_networks distribution networks computed in $(round(time()-start_time_d; sigdigits=3)) seconds")
result = Dict{String,Any}(
"t_solution" => t_sol,
"d_solution" => [d_res["solution"] for d_res in d_result],
"t_objective" => t_objective,
"d_objective" => d_objective,
"objective" => t_objective + sum(d_objective; init=0.0),
"solve_time" => time()-start_time
)
return result
end
"Run a model, ensure it is solved to optimality (error otherwise), and return the solution (or the full result, if `return_solution==false`)."
function run_td_decoupling_model(data::Dict{String,Any}; model_type::Type, optimizer, build_method::Function, ref_extensions, solution_processors, setting, relax_integrality=false, return_solution::Bool=true, direct_model=false, kwargs...)
start_time = time()
Memento.trace(_LOGGER, "┌ running $(String(nameof(build_method)))...")
if direct_model
result = _PM.solve_model(
data, model_type, nothing, build_method;
ref_extensions,
solution_processors,
multinetwork = true,
relax_integrality,
setting,
jump_model = JuMP.direct_model(optimizer),
kwargs...
)
else
result = _PM.solve_model(
data, model_type, optimizer, build_method;
ref_extensions,
solution_processors,
multinetwork = true,
relax_integrality,
setting,
kwargs...
)
end
Memento.trace(_LOGGER, "└ solved in $(round(time()-start_time;sigdigits=3)) seconds (of which $(round(result["solve_time"];sigdigits=3)) seconds for solver)")
if result["termination_status"] ∉ (OPTIMAL, LOCALLY_SOLVED)
Memento.error(_LOGGER, "Unable to solve $(String(nameof(build_method))) ($(result["optimizer"]) termination status: $(result["termination_status"]))")
end
return return_solution ? result["solution"] : result
end
function apply_td_coupling_power_active_with_zero_cost!(d_data::Dict{String,Any}, t_data::Dict{String,Any}, exchanged_power::Dict{String,Float64})
for (n, d_nw) in d_data["nw"]
t_nw = t_data["nw"][n]
# Compute the active power exchanged between transmission and distribution in MVA base of distribution
p = exchanged_power[n] * (t_nw["baseMVA"]/d_nw["baseMVA"])
# Fix distribution generator power to the computed value
d_gen_id = _FP.dim_prop(d_data, parse(Int,n), :sub_nw, "d_gen")
d_gen = d_nw["gen"]["$d_gen_id"] = deepcopy(d_nw["gen"]["$d_gen_id"]) # Gen data is shared among nws originally.
d_gen["pmax"] = p
d_gen["pmin"] = p
# Set distribution generator cost to zero
d_gen["model"] = 2 # Cost model (2 => polynomial cost)
d_gen["ncost"] = 0 # Number of cost coefficients
d_gen["cost"] = Any[]
end
end
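# The per-unit conversion above can be illustrated with hypothetical numbers
# (values assumed purely for illustration):
#
#     t_baseMVA = 100.0                    # transmission per-unit base, MVA
#     d_baseMVA = 10.0                     # distribution per-unit base, MVA
#     p_t = 0.05                           # power, p.u. on the transmission base
#     p_d = p_t * (t_baseMVA / d_baseMVA)  # 0.5 p.u. on the distribution base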
function probe_distribution_flexibility!(mn_data::Dict{String,Any}; model_type, optimizer, build_method, ref_extensions, solution_processors, setting=Dict{String,Any}(), direct_model=false)
_FP.require_dim(mn_data, :sub_nw)
if _FP.dim_length(mn_data, :sub_nw) > 1
Memento.error(_LOGGER, "A single distribution network is required ($(_FP.dim_length(mn_data, :sub_nw)) found)")
end
sol_base = run_td_decoupling_model(mn_data; model_type, optimizer, build_method, ref_extensions, solution_processors, setting, direct_model)
add_ne_branch_indicator!(mn_data, sol_base)
add_ne_storage_indicator!(mn_data, sol_base)
add_flex_load_indicator!(mn_data, sol_base)
mn_data_up = deepcopy(mn_data)
apply_gen_power_active_ub!(mn_data_up, sol_base)
add_storage_power_active_lb!(mn_data_up, sol_base)
add_ne_storage_power_active_lb!(mn_data_up, sol_base)
add_load_power_active_lb!(mn_data_up, sol_base)
add_load_flex_shift_up_lb!(mn_data_up, sol_base)
sol_up = run_td_decoupling_model(mn_data_up; model_type, optimizer, build_method=build_max_import_with_current_investments_monotonic(build_method), ref_extensions, solution_processors, setting, direct_model, relax_integrality=true)
apply_td_coupling_power_active!(mn_data_up, sol_up)
sol_up = run_td_decoupling_model(mn_data_up; model_type, optimizer, build_method=build_min_cost_at_max_import_monotonic(build_method), ref_extensions, solution_processors, setting, direct_model, relax_integrality=true)
mn_data_down = deepcopy(mn_data)
apply_gen_power_active_lb!(mn_data_down, sol_base)
add_storage_power_active_ub!(mn_data_down, sol_base)
add_ne_storage_power_active_ub!(mn_data_down, sol_base)
add_load_power_active_ub!(mn_data_down, sol_base)
add_load_flex_shift_down_lb!(mn_data_down, sol_base)
add_load_flex_red_lb!(mn_data_down, sol_base)
sol_down = run_td_decoupling_model(mn_data_down; model_type, optimizer, build_method=build_max_export_with_current_investments_monotonic(build_method), ref_extensions, solution_processors, setting, direct_model, relax_integrality=true)
apply_td_coupling_power_active!(mn_data_down, sol_down)
sol_down = run_td_decoupling_model(mn_data_down; model_type, optimizer, build_method=build_min_cost_at_max_export_monotonic(build_method), ref_extensions, solution_processors, setting, direct_model, relax_integrality=true)
return sol_up, sol_base, sol_down
end
## Result - data structure interaction
# These functions make it possible to pass investment decisions between two problems: the investment
# decision results of the first problem are copied into the data structure of the second
# problem; an appropriate constraint may be necessary in the model of the second problem to
# read the data prepared by these functions.
function _copy_comp_key!(target_data::Dict{String,Any}, comp::String, target_key::String, source_data::Dict{String,Any}, source_key::String=target_key)
for (n, target_nw) in target_data["nw"]
source_nw = source_data["nw"][n]
for (i, target_comp) in target_nw[comp]
target_comp[target_key] = source_nw[comp][i][source_key]
end
end
end
function add_ne_branch_indicator!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
# Cannot use `_copy_comp_key!` because `ne_branch`es have a `br_status` parameter:
# those whose `br_status` is 0 are not reported in solution dict.
for (n, data_nw) in mn_data["nw"]
sol_nw = solution["nw"][n]
for (b, data_branch) in data_nw["ne_branch"]
if data_branch["br_status"] == 1
data_branch["sol_built"] = sol_nw["ne_branch"][b]["built"]
end
end
end
end
function add_ne_storage_indicator!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "ne_storage", "sol_built", solution, "isbuilt")
end
function add_flex_load_indicator!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "load", "sol_built", solution, "flex")
end
function add_load_power_active_ub!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "load", "pflex_ub", solution, "pflex")
end
function add_load_power_active_lb!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "load", "pflex_lb", solution, "pflex")
end
function add_load_flex_shift_up_lb!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "load", "pshift_up_lb", solution, "pshift_up")
end
function add_load_flex_shift_down_lb!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "load", "pshift_down_lb", solution, "pshift_down")
end
function add_load_flex_red_lb!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "load", "pred_lb", solution, "pred")
end
function apply_td_coupling_power_active!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
for (n, data_nw) in mn_data["nw"]
p = solution["nw"][n]["td_coupling"]["p"]
d_gen_id = _FP.dim_prop(mn_data, parse(Int,n), :sub_nw, "d_gen")
d_gen = data_nw["gen"]["$d_gen_id"] = deepcopy(data_nw["gen"]["$d_gen_id"]) # Gen data is shared among nws originally.
d_gen["pmax"] = p
d_gen["pmin"] = p
end
end
function apply_gen_power_active_ub!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
# Cannot use `_copy_comp_key!` because `d_gen` must not be changed.
for (n, data_nw) in mn_data["nw"]
d_gen_id = string(_FP.dim_prop(mn_data, parse(Int,n), :sub_nw, "d_gen"))
sol_nw = solution["nw"][n]
for (g, data_gen) in data_nw["gen"]
if g ≠ d_gen_id
ub = sol_nw["gen"][g]["pg"]
lb = data_gen["pmin"]
if ub < lb
Memento.trace(_LOGGER, @sprintf("Increasing by %.1e the upper bound on power of generator %s in nw %s to make it equal to existing lower bound (%f).", lb-ub, g, n, lb))
ub = lb
end
data_gen["pmax"] = ub
end
end
end
end
function apply_gen_power_active_lb!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
# Cannot use `_copy_comp_key!` because `d_gen` must not be changed.
for (n, data_nw) in mn_data["nw"]
d_gen_id = string(_FP.dim_prop(mn_data, parse(Int,n), :sub_nw, "d_gen"))
sol_nw = solution["nw"][n]
for (g, data_gen) in data_nw["gen"]
if g ≠ d_gen_id
lb = sol_nw["gen"][g]["pg"]
ub = data_gen["pmax"]
if lb > ub
Memento.trace(_LOGGER, @sprintf("Decreasing by %.1e the lower bound on power of generator %s in nw %s to make it equal to existing upper bound (%f).", lb-ub, g, n, ub))
lb = ub
end
data_gen["pmin"] = lb
end
end
end
end
function add_storage_power_active_ub!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "storage", "ps_ub", solution, "ps")
end
function add_storage_power_active_lb!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "storage", "ps_lb", solution, "ps")
end
function add_ne_storage_power_active_ub!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "ne_storage", "ps_ne_ub", solution, "ps_ne")
end
function add_ne_storage_power_active_lb!(mn_data::Dict{String,Any}, solution::Dict{String,Any})
_copy_comp_key!(mn_data, "ne_storage", "ps_ne_lb", solution, "ps_ne")
end
## Problems
function build_max_import(build_method::Function)
function build_max_import(pm::_PM.AbstractBFModel)
build_method(pm; objective = false, intertemporal_constraints = false)
objective_max_import(pm)
end
end
function build_max_export(build_method::Function)
function build_max_export(pm::_PM.AbstractBFModel)
build_method(pm; objective = false, intertemporal_constraints = false)
objective_max_export(pm)
end
end
function build_max_import_with_current_investments(build_method::Function)
function build_max_import_with_current_investments(pm::_PM.AbstractBFModel)
build_method(pm; objective = false, intertemporal_constraints = false)
for n in _PM.nw_ids(pm)
constraint_ne_branch_indicator_fix(pm, n)
constraint_ne_storage_indicator_fix(pm, n)
constraint_flex_load_indicator_fix(pm, n)
end
objective_max_import(pm)
end
end
function build_max_export_with_current_investments(build_method::Function)
function build_max_export_with_current_investments(pm::_PM.AbstractBFModel)
build_method(pm; objective = false, intertemporal_constraints = false)
for n in _PM.nw_ids(pm)
constraint_ne_branch_indicator_fix(pm, n)
constraint_ne_storage_indicator_fix(pm, n)
constraint_flex_load_indicator_fix(pm, n)
end
objective_max_export(pm)
end
end
function build_max_import_with_current_investments_monotonic(build_method::Function)
function build_max_import_with_current_investments_monotonic(pm::_PM.AbstractBFModel)
build_method(pm; objective = false, intertemporal_constraints = false)
for n in _PM.nw_ids(pm)
constraint_ne_branch_indicator_fix(pm, n)
constraint_ne_storage_indicator_fix(pm, n)
constraint_flex_load_indicator_fix(pm, n)
# gen_power_active_ub already applied in data
constraint_storage_power_active_lb(pm, n)
constraint_ne_storage_power_active_lb(pm, n)
constraint_load_power_active_lb(pm, n)
constraint_load_flex_shift_up_lb(pm, n)
end
objective_max_import(pm)
end
end
function build_max_export_with_current_investments_monotonic(build_method::Function)
function build_max_export_with_current_investments_monotonic(pm::_PM.AbstractBFModel)
build_method(pm; objective = false, intertemporal_constraints = false)
for n in _PM.nw_ids(pm)
constraint_ne_branch_indicator_fix(pm, n)
constraint_ne_storage_indicator_fix(pm, n)
constraint_flex_load_indicator_fix(pm, n)
# gen_power_active_lb already applied in data
constraint_storage_power_active_ub(pm, n)
constraint_ne_storage_power_active_ub(pm, n)
constraint_load_power_active_ub(pm, n)
constraint_load_flex_shift_down_lb(pm, n)
constraint_load_flex_red_lb(pm, n)
end
objective_max_export(pm)
end
end
function build_min_cost_at_max_import_monotonic(build_method::Function)
function build_min_cost_at_max_import_monotonic(pm::_PM.AbstractBFModel)
build_method(pm; objective = true, intertemporal_constraints = false)
for n in _PM.nw_ids(pm)
constraint_ne_branch_indicator_fix(pm, n)
constraint_ne_storage_indicator_fix(pm, n)
constraint_flex_load_indicator_fix(pm, n)
# td_coupling_power_active already fixed in data
# gen_power_active_ub already applied in data
constraint_storage_power_active_lb(pm, n)
constraint_ne_storage_power_active_lb(pm, n)
constraint_load_power_active_lb(pm, n)
constraint_load_flex_shift_up_lb(pm, n)
end
end
end
function build_min_cost_at_max_export_monotonic(build_method::Function)
function build_min_cost_at_max_export_monotonic(pm::_PM.AbstractBFModel)
build_method(pm; objective = true, intertemporal_constraints = false)
for n in _PM.nw_ids(pm)
constraint_ne_branch_indicator_fix(pm, n)
constraint_ne_storage_indicator_fix(pm, n)
constraint_flex_load_indicator_fix(pm, n)
# td_coupling_power_active already fixed in data
# gen_power_active_lb already applied in data
constraint_storage_power_active_ub(pm, n)
constraint_ne_storage_power_active_ub(pm, n)
constraint_load_power_active_ub(pm, n)
constraint_load_flex_shift_down_lb(pm, n)
constraint_load_flex_red_lb(pm, n)
end
end
end
## Constraints
"Fix investment decisions on candidate branches according to values in data structure"
function constraint_ne_branch_indicator_fix(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :ne_branch)
indicator = _PM.var(pm, n, :branch_ne, i)
value = _PM.ref(pm, n, :ne_branch, i, "sol_built")
JuMP.@constraint(pm.model, indicator == value)
end
end
"Fix investment decisions on candidate storage according to values in data structure"
function constraint_ne_storage_indicator_fix(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :ne_storage)
indicator = _PM.var(pm, n, :z_strg_ne, i)
value = _PM.ref(pm, n, :ne_storage, i, "sol_built")
JuMP.@constraint(pm.model, indicator == value)
end
end
"Fix investment decisions on flexibility of loads according to values in data structure"
function constraint_flex_load_indicator_fix(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :flex_load)
indicator = _PM.var(pm, n, :z_flex, i)
value = _PM.ref(pm, n, :flex_load, i, "sol_built")
JuMP.@constraint(pm.model, indicator == value)
end
end
"Put an upper bound on the active power absorbed by loads"
function constraint_load_power_active_ub(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :load)
pflex = _PM.var(pm, n, :pflex, i)
ub = _PM.ref(pm, n, :load, i, "pflex_ub")
lb = JuMP.lower_bound(pflex)
if ub < lb
Memento.trace(_LOGGER, @sprintf("Increasing by %.1e the upper bound on absorbed power of load %i in nw %i to make it equal to existing lower bound (%f).", lb-ub, i, n, lb))
ub = lb
end
JuMP.set_upper_bound(pflex, ub)
end
end
"Put a lower bound on the active power absorbed by loads"
function constraint_load_power_active_lb(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :load)
pflex = _PM.var(pm, n, :pflex, i)
lb = _PM.ref(pm, n, :load, i, "pflex_lb")
ub = JuMP.upper_bound(pflex)
if lb > ub
Memento.trace(_LOGGER, @sprintf("Decreasing by %.1e the lower bound on absorbed power of load %i in nw %i to make it equal to existing upper bound (%f).", lb-ub, i, n, ub))
lb = ub
end
JuMP.set_lower_bound(pflex, lb)
end
end
"Put a lower bound on upward shifted power of flexible loads"
function constraint_load_flex_shift_up_lb(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :flex_load)
pshift_up = _PM.var(pm, n, :pshift_up, i)
lb = _PM.ref(pm, n, :load, i, "pshift_up_lb")
ub = JuMP.upper_bound(pshift_up)
if lb > ub
Memento.trace(_LOGGER, @sprintf("Decreasing by %.1e the lower bound on upward shifted power of load %i in nw %i to make it equal to existing upper bound (%f).", lb-ub, i, n, ub))
lb = ub
end
JuMP.set_lower_bound(pshift_up, lb)
end
end
"Put a lower bound on downward shifted power of flexible loads"
function constraint_load_flex_shift_down_lb(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :flex_load)
pshift_down = _PM.var(pm, n, :pshift_down, i)
lb = _PM.ref(pm, n, :load, i, "pshift_down_lb")
ub = JuMP.upper_bound(pshift_down)
if lb > ub
Memento.trace(_LOGGER, @sprintf("Decreasing by %.1e the lower bound on downward shifted power of load %i in nw %i to make it equal to existing upper bound (%f).", lb-ub, i, n, ub))
lb = ub
end
JuMP.set_lower_bound(pshift_down, lb)
end
end
"Put a lower bound on voluntarily reduced power of flexible loads"
function constraint_load_flex_red_lb(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :flex_load)
pred = _PM.var(pm, n, :pred, i)
lb = _PM.ref(pm, n, :load, i, "pred_lb")
ub = JuMP.upper_bound(pred)
if lb > ub
Memento.trace(_LOGGER, @sprintf("Decreasing by %.1e the lower bound on voluntarily reduced power of load %i in nw %i to make it equal to existing upper bound (%f).", lb-ub, i, n, ub))
lb = ub
end
JuMP.set_lower_bound(pred, lb)
end
end
"Put an upper bound on the active power exchanged by storage (load convention)"
function constraint_storage_power_active_ub(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :storage)
ps = _PM.var(pm, n, :ps, i)
ub = _PM.ref(pm, n, :storage, i, "ps_ub")
lb = JuMP.lower_bound(ps)
if ub < lb
Memento.trace(_LOGGER, @sprintf("Increasing by %.1e the upper bound on power of storage %i in nw %i to make it equal to existing lower bound (%f).", lb-ub, i, n, lb))
ub = lb
end
JuMP.set_upper_bound(ps, ub)
end
end
"Put a lower bound on the active power exchanged by storage (load convention)"
function constraint_storage_power_active_lb(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :storage)
ps = _PM.var(pm, n, :ps, i)
lb = _PM.ref(pm, n, :storage, i, "ps_lb")
ub = JuMP.upper_bound(ps)
if lb > ub
Memento.trace(_LOGGER, @sprintf("Decreasing by %.1e the lower bound on power of storage %i in nw %i to make it equal to existing upper bound (%f).", lb-ub, i, n, ub))
lb = ub
end
JuMP.set_lower_bound(ps, lb)
end
end
"Put an upper bound on the active power exchanged by candidate storage (load convention)"
function constraint_ne_storage_power_active_ub(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :ne_storage)
ps = _PM.var(pm, n, :ps_ne, i)
ub = _PM.ref(pm, n, :ne_storage, i, "ps_ne_ub")
lb = JuMP.lower_bound(ps)
if ub < lb
Memento.trace(_LOGGER, @sprintf("Increasing by %.1e the upper bound on power of candidate storage %i in nw %i to make it equal to existing lower bound (%f).", lb-ub, i, n, lb))
ub = lb
end
JuMP.set_upper_bound(ps, ub)
end
end
"Put a lower bound on the active power exchanged by candidate storage (load convention)"
function constraint_ne_storage_power_active_lb(pm::_PM.AbstractPowerModel, n::Int)
for i in _PM.ids(pm, n, :ne_storage)
ps = _PM.var(pm, n, :ps_ne, i)
lb = _PM.ref(pm, n, :ne_storage, i, "ps_ne_lb")
ub = JuMP.upper_bound(ps)
if lb > ub
Memento.trace(_LOGGER, @sprintf("Decreasing by %.1e the lower bound on power of candidate storage %i in nw %i to make it equal to existing upper bound (%f).", lb-ub, i, n, ub))
lb = ub
end
JuMP.set_lower_bound(ps, lb)
end
end
## Objectives
function objective_max_import(pm::_PM.AbstractPowerModel)
# There is no need to distinguish between scenarios because they are independent.
return JuMP.@objective(pm.model, Max,
sum( calc_td_coupling_power_active(pm, n) for (n, nw_ref) in _PM.nws(pm) )
)
end
function objective_max_export(pm::_PM.AbstractPowerModel)
# There is no need to distinguish between scenarios because they are independent.
return JuMP.@objective(pm.model, Min,
sum( calc_td_coupling_power_active(pm, n) for (n, nw_ref) in _PM.nws(pm) )
)
end
function calc_td_coupling_power_active(pm::_PM.AbstractPowerModel, n::Int)
pcc_gen = _FP.dim_prop(pm, n, :sub_nw, "d_gen")
p = _PM.var(pm, n, :pg, pcc_gen)
return p
end
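# Hedged usage sketch (comment only; the build function shown is hypothetical,
# not defined in this file): the two objectives above are meant to probe the
# import/export flexibility range at the T&D interface, e.g.
#
#     # result = _PM.solve_model(d_data, _PM.ACPPowerModel, optimizer,
#     #     pm -> (build_problem(pm); objective_max_import(pm)))
#
# `calc_td_coupling_power_active` returns the `pg` variable of the generator
# that models the point of common coupling (dimension property "d_gen").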
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
function calc_surrogate_model(orig_data::Dict{String,Any}, sol_up::Dict{String,Any}, sol_base::Dict{String,Any}, sol_down::Dict{String,Any}; standalone::Bool=false)
nws = string.(_FP.nw_ids(orig_data))
first_nw = first(nws)
sol_up = sol_up["nw"]
sol_base = sol_base["nw"]
sol_down = sol_down["nw"]
data = Dict{String,Any}(
"dim" => deepcopy(orig_data["dim"]),
"multinetwork" => true,
"nw" => Dict{String,Any}(),
"per_unit" => true,
)
if standalone
_FP.dim_prop(data, :sub_nw, 1)["d_gen"] = 2 # Slack generator representing transmission nw.
end
template_nw = Dict{String,Any}(
"baseMVA" => orig_data["nw"][first_nw]["baseMVA"],
"time_elapsed" => orig_data["nw"][first_nw]["time_elapsed"],
"gen" => Dict{String,Any}(),
"load" => Dict{String,Any}(),
"storage" => Dict{String,Any}(),
)
if standalone
template_nw["branch"] = Dict{String,Any}(
"1" => Dict{String,Any}(
"angmax" => 0.0,
"angmin" => 0.0,
"b_fr" => 0.0,
"b_to" => 0.0,
"br_r" => 0.0,
"br_x" => 0.0,
"br_status" => 1,
"f_bus" => 1,
"g_fr" => 0.0,
"g_to" => 0.0,
"index" => 1,
"rate_a" => 0.0,
"t_bus" => 2,
"tap" => 1.0,
"transformer" => false,
),
)
template_nw["bus"] = Dict{String,Any}(
"1" => Dict{String,Any}(
"bus_type" => 3,
"index" => 1,
"va" => 0.0,
"vmax" => 1.0,
"vmin" => 1.0,
),
"2" => Dict{String,Any}(
"bus_type" => 1,
"index" => 2,
"va" => 0.0,
"vmax" => 1.0,
"vmin" => 1.0,
),
)
template_nw["dcline"] = Dict{String,Any}()
template_nw["ne_branch"] = Dict{String,Any}()
template_nw["ne_storage"] = Dict{String,Any}()
template_nw["shunt"] = Dict{String,Any}()
template_nw["switch"] = Dict{String,Any}()
end
template_gen = surrogate_gen_const(; standalone)
template_storage = surrogate_storage_const(orig_data["nw"][first_nw], sol_base[first_nw]; standalone)
template_load = surrogate_load_const(sol_base[first_nw])
for n in nws
nw = data["nw"][n] = deepcopy(template_nw)
nw["storage"]["1"] = surrogate_storage_ts(template_storage, orig_data["nw"][n], sol_up[n], sol_base[n], sol_down[n]; standalone)
nw["load"]["1"] = surrogate_load_ts(template_load, orig_data["nw"][n], nw["storage"]["1"], sol_up[n], sol_base[n], sol_down[n])
nw["gen"]["1"] = surrogate_gen_ts(template_gen, orig_data["nw"][n], nw["load"]["1"], sol_base[n])
if standalone
orig_d_gen = _FP.dim_prop(orig_data, :sub_nw, 1, "d_gen")
nw["gen"]["2"] = deepcopy(orig_data["nw"][n]["gen"]["$orig_d_gen"])
nw["gen"]["2"]["gen_bus"] = 1
nw["gen"]["2"]["source_id"] = Vector{String}()
end
end
add_singular_data!(data, orig_data, sol_base)
return data
end
function surrogate_load_const(bs)
load = Dict{String,Any}(
"load_bus" => 1,
"status" => 1,
)
if any(ld -> ld.second["flex"]>0.5, bs["load"])
load["flex"] = true
load["lifetime"] = 1
load["cost_inv"] = 0.0
else
load["flex"] = false
end
return load
end
function surrogate_gen_const(; standalone)
gen = Dict{String,Any}(
"cost" => [0.0, 0.0],
"dispatchable" => false,
"gen_bus" => 1,
"gen_status" => 1,
"model" => 2,
"ncost" => 2,
"pmin" => 0.0,
)
if standalone
gen["qmax"] = 0.0
gen["qmin"] = 0.0
end
return gen
end
function surrogate_storage_const(od, bs; standalone)
storage = Dict{String,Any}(
"status" => 1,
"storage_bus" => 1,
"energy_rating" => sum(s["energy_rating"] for s in values(od["storage"]); init=0.0) + sum(s["energy_rating"] for (i,s) in od["ne_storage"] if bs["ne_storage"][i]["isbuilt"] > 0.5; init=0.0),
"self_discharge_rate" => min(minimum(s["self_discharge_rate"] for s in values(od["storage"]); init=1.0), minimum(s["self_discharge_rate"] for (i,s) in od["ne_storage"] if bs["ne_storage"][i]["isbuilt"] > 0.5; init=1.0)),
"charge_efficiency" => max(maximum(s["charge_efficiency"] for s in values(od["storage"]); init=0.0), maximum(s["charge_efficiency"] for (i,s) in od["ne_storage"] if bs["ne_storage"][i]["isbuilt"] > 0.5; init=0.0)),
# When the distribution network has no storage devices, the surrogate model's
# storage energy rating is 0, so the storage cannot be used in practice and
# the other parameters are irrelevant.
# However, a 0.0 discharge efficiency would produce Inf coefficients in the
# energy constraints, which in turn would raise errors when instantiating the
# model. Therefore the discharge efficiency is initialized with a small
# positive value, such as 0.001.
"discharge_efficiency" => max(maximum(s["discharge_efficiency"] for s in values(od["storage"]); init=0.0), maximum(s["discharge_efficiency"] for (i,s) in od["ne_storage"] if bs["ne_storage"][i]["isbuilt"] > 0.5; init=0.001)),
)
if standalone
storage["p_loss"] = 0.0
storage["q_loss"] = 0.0
storage["qmax"] = 0.0
storage["qmin"] = 0.0
storage["r"] = 0.0
storage["x"] = 0.0
end
return storage
end
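# Illustration of the aggregation above (comment only, assumed values): with no
# existing storage and no built candidate storage, every `sum`/`maximum`/`minimum`
# call falls back to its `init` argument, so the surrogate storage gets
#     energy_rating        = 0.0   # unusable, as intended
#     charge_efficiency    = 0.0
#     discharge_efficiency = 0.001 # small positive value, avoids Inf coefficients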
function surrogate_load_ts(load, od, storage, up, bs, dn)
pshift_up_max = min(sum(l["pshift_up"] for l in values(up["load"]); init=0.0)-sum(l["pshift_up"] for l in values(bs["load"]); init=0.0), up["td_coupling"]["p"]-storage["charge_rating"])
pshift_down_max = sum(l["pshift_down"] for l in values(dn["load"]); init=0.0)-sum(l["pshift_down"] for l in values(bs["load"]); init=0.0)
pred_max = sum(l["pred"] for l in values(dn["load"]); init=0.0)-sum(l["pred"] for l in values(bs["load"]); init=0.0)
pd = min(up["td_coupling"]["p"]-storage["charge_rating"]-pshift_up_max, bs["td_coupling"]["p"]-dn["td_coupling"]["p"]-storage["discharge_rating"])
load = copy(load)
load["pd"] = max(pd, 0.0)
load["pshift_up_rel_max"] = pd>0 ? pshift_up_max/pd : 0.0
load["pshift_down_rel_max"] = pd>0 ? pshift_down_max/pd : 0.0
load["pred_rel_max"] = pd>0 ? pred_max/pd : 0.0
load["cost_curt"] = minimum(ld["cost_curt"] for ld in values(od["load"]))
if load["flex"]
load["cost_red"] = minimum(od["load"][l]["cost_red"] for (l,ld) in bs["load"] if ld["flex"]>0.5)
load["cost_shift"] = minimum(od["load"][l]["cost_shift"] for (l,ld) in bs["load"] if ld["flex"]>0.5)
end
return load
end
function surrogate_gen_ts(gen, od, load, bs)
gen = copy(gen)
gen["pmax"] = load["pd"] - bs["td_coupling"]["p"]
# Assumption: all generators are non-dispatchable (except the generator that simulates the transmission network, which has already been removed from the solution dict).
gen["cost_curt"] = isempty(bs["gen"]) ? 0.0 : minimum(od["gen"][g]["cost_curt"] for (g,gen) in bs["gen"])
return gen
end
function surrogate_storage_ts(storage, od, up, bs, dn; standalone)
ps_up = sum(s["ps"] for s in values(get(up,"storage",Dict())); init=0.0) + sum(s["ps_ne"] for s in values(get(up,"ne_storage",Dict())) if s["isbuilt"] > 0.5; init=0.0)
ps_bs = sum(s["ps"] for s in values(get(bs,"storage",Dict())); init=0.0) + sum(s["ps_ne"] for s in values(get(bs,"ne_storage",Dict())) if s["isbuilt"] > 0.5; init=0.0)
ps_dn = sum(s["ps"] for s in values(get(dn,"storage",Dict())); init=0.0) + sum(s["ps_ne"] for s in values(get(dn,"ne_storage",Dict())) if s["isbuilt"] > 0.5; init=0.0)
ext_flow = (
sum(
od["storage"][i]["charge_efficiency"]*s["sc"]
- s["sd"]/od["storage"][i]["discharge_efficiency"]
+ od["storage"][i]["stationary_energy_inflow"]
- od["storage"][i]["stationary_energy_outflow"]
for (i,s) in get(bs,"storage",Dict());
init=0.0
)
+ sum(
od["ne_storage"][i]["charge_efficiency"]*s["sc_ne"]
- s["sd_ne"]/od["ne_storage"][i]["discharge_efficiency"]
+ od["ne_storage"][i]["stationary_energy_inflow"]
- od["ne_storage"][i]["stationary_energy_outflow"]
for (i,s) in get(bs,"ne_storage",Dict()) if s["isbuilt"] > 0.5;
init=0.0
)
)
storage = copy(storage)
storage["charge_rating"] = min(ps_up - ps_bs, up["td_coupling"]["p"])
storage["discharge_rating"] = min(ps_bs - ps_dn, -dn["td_coupling"]["p"])
storage["stationary_energy_inflow"] = max(ext_flow, 0.0)
storage["stationary_energy_outflow"] = -min(ext_flow, 0.0)
storage["thermal_rating"] = 2 * max(storage["charge_rating"], storage["discharge_rating"]) # Prevents the thermal rating from limiting active power, even with the octagonal approximation of apparent power.
storage["p_loss"] = 0.0
storage["q_loss"] = 0.0
storage["r"] = 0.0
storage["x"] = 0.0
return storage
end
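# Sign convention of `ext_flow` above (comment only): the net stationary energy
# flow of the aggregated devices is split into two nonnegative parameters, e.g.
#     ext_flow =  0.3  =>  stationary_energy_inflow = 0.3, outflow = 0.0
#     ext_flow = -0.2  =>  stationary_energy_inflow = 0.0, outflow = 0.2
# so that exactly one of the two parameters is nonzero for each network.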
function add_singular_data!(data, orig_data, sol_base)
# Storage initial energy
for n in _FP.nw_ids(orig_data; hour=1)
d = data["nw"]["$n"]
od = orig_data["nw"]["$n"]
bs = sol_base["$n"]
d["storage"]["1"]["energy"] = sum(st["energy"] for st in values(get(od,"storage",Dict())); init=0.0) + sum(od["ne_storage"][s]["energy"] for (s,st) in get(bs,"ne_storage",Dict()) if st["isbuilt"]>0.5; init=0.0)
end
# Storage final energy
for n in _FP.nw_ids(orig_data; hour=_FP.dim_length(orig_data,:hour))
d = data["nw"]["$n"]
od = orig_data["nw"]["$n"]
bs = sol_base["$n"]
d["storage"]["1"]["energy"] = sum(st["energy"] for st in values(get(od,"storage",Dict())); init=0.0) + sum(od["ne_storage"][s]["energy"] for (s,st) in get(bs,"ne_storage",Dict()) if st["isbuilt"]>0.5; init=0.0)
end
end
module TDDecoupling
export run_td_decoupling
using ..FlexPlan
const _FP = FlexPlan
import ..FlexPlan: _PM, _LOGGER
import JuMP
import Memento
using Printf
include("base.jl")
include("probe_flexibility.jl")
include("surrogate_model.jl")
include("transmission.jl")
include("distribution.jl")
end
# Attach the surrogate model components to the transmission network bus to which the distribution network is connected
function attach_surrogate_distribution!(t_data::Dict{String,Any}, surr_dist::Dict{String,Any})
if !haskey(_FP.dim_prop(surr_dist, :sub_nw, 1), "t_bus")
Memento.error(_LOGGER, "Surrogate model of distribution network does not specify the AC bus to attach to.")
end
t_bus = _FP.dim_prop(surr_dist, :sub_nw, 1, "t_bus")
if _FP.dim_length(surr_dist) ≠ _FP.dim_length(t_data)
Memento.error(_LOGGER, "Surrogate model to attach to bus $t_bus has $(_FP.dim_length(surr_dist)) networks instead of $(_FP.dim_length(t_data)).")
end
surrogate_components = Dict{String,Any}()
for (n,nw) in t_data["nw"]
surr_nw = surr_dist["nw"][n]
comp_id = surrogate_components[n] = Dict{String,String}()
_FP.convert_mva_base!(surr_nw, nw["baseMVA"])
g = comp_id["gen"] = string(length(nw["gen"]) + 1)
gen = surr_nw["gen"]["1"]
gen["gen_bus"] = t_bus
nw["gen"][g] = gen
s = comp_id["storage"] = string(length(nw["storage"]) + 1)
st = surr_nw["storage"]["1"]
st["storage_bus"] = t_bus
nw["storage"][s] = st
l = comp_id["load"] = string(length(nw["load"]) + 1)
load = surr_nw["load"]["1"]
load["load_bus"] = t_bus
nw["load"][l] = load
end
return surrogate_components
end
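# Shape of the returned value (comment-only sketch with made-up ids): for each
# network, the string ids assigned to the surrogate components in `t_data`, e.g.
#     Dict("1" => Dict("gen"=>"4", "storage"=>"2", "load"=>"10"), "2" => ...)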
# Compute the cost of the transmission network, excluding cost related to surrogate model components
function calc_t_objective(t_result::Dict{String,Any}, t_data::Dict{String,Any}, surrogate_components::Vector{Dict{String,Any}})
nw_raw_cost = Dict{String,Float64}()
for (n,data_nw) in t_data["nw"]
nw_raw_cost[n] = 0.0
sol_nw = t_result["solution"]["nw"][n]
data_gen = data_nw["gen"]
sol_gen = sol_nw["gen"]
data_load = data_nw["load"]
sol_load = sol_nw["load"]
for surr_dist in surrogate_components
g = surr_dist[n]["gen"]
l = surr_dist[n]["load"]
nw_raw_cost[n] += (
data_gen[g]["cost_curt"] * sol_gen[g]["pgcurt"]
+ data_load[l]["cost_curt"] * sol_load[l]["pcurt"]
+ get(data_load[l],"cost_red",0.0) * sol_load[l]["pred"]
+ get(data_load[l],"cost_shift",0.0) * 0.5*(sol_load[l]["pshift_up"]+sol_load[l]["pshift_down"])
)
end
end
distribution_cost = sum(scenario["probability"] * sum(nw_raw_cost[n] for n in string.(_FP.nw_ids(t_data; scenario=s))) for (s, scenario) in _FP.dim_prop(t_data, :scenario))
transmission_cost = t_result["objective"] - distribution_cost
return transmission_cost
end
# Compute the active power exchanged between transmission and distribution, using MVA base of transmission
function calc_exchanged_power(surrogate_components::Dict{String,Any}, t_sol::Dict{String,Any})
exchanged_power = Dict{String,Float64}()
for (n, sc) in surrogate_components
t_nw = t_sol["nw"][n]
exchanged_power[n] = -t_nw["gen"][sc["gen"]]["pg"] + t_nw["storage"][sc["storage"]]["ps"] + t_nw["load"][sc["load"]]["pflex"]
end
return exchanged_power
end
function remove_attached_distribution!(t_sol::Dict{String,Any}, t_data::Dict{String,Any}, surrogate_components::Dict{String,Any})
for (n,sol_nw) in t_sol["nw"]
data_nw = t_data["nw"][n]
comp_id = surrogate_components[n]
delete!(sol_nw["gen"], comp_id["gen"])
delete!(data_nw["gen"], comp_id["gen"])
delete!(sol_nw["storage"], comp_id["storage"])
delete!(data_nw["storage"], comp_id["storage"])
delete!(sol_nw["load"], comp_id["load"])
delete!(data_nw["load"], comp_id["load"])
end
end
@testset "BFARadPowerModel" begin
@testset "CIGRE TNEP single-period" begin
data = _FP.parse_file(normpath(@__DIR__,"../test/data/cigre_mv_eu/cigre_mv_eu_unit_test.m"))
_FP.add_dimension!(data, :hour, 1)
_FP.add_dimension!(data, :year, 1)
data = _FP.make_multinetwork(data)
result = _FP.flex_tnep(data, _FP.BFARadPowerModel, milp_optimizer)
sol = result["solution"]["nw"]["1"]
@test result["termination_status"] == OPTIMAL
@test result["objective"] ≈ 4360.45 rtol=1e-3
@test sol["branch"]["16"]["pf"] ≈ -sol["branch"]["16"]["pt"] rtol=1e-3 # Zero active power losses in OLTC branch
@test sol["branch"]["16"]["qf"] ≈ -sol["branch"]["16"]["qt"] rtol=1e-3 # Zero reactive power losses in OLTC branch
@test sol["branch"]["17"]["pf"] ≈ -sol["branch"]["17"]["pt"] rtol=1e-3 # Zero active power losses in frb branch
@test sol["branch"]["17"]["qf"] ≈ -sol["branch"]["17"]["qt"] rtol=1e-3 # Zero reactive power losses in frb branch
@test sol["branch"]["1"]["pf"] ≈ -sol["branch"]["1"]["pt"] rtol=1e-3 # Zero active power losses in regular branch
@test sol["branch"]["1"]["qf"] ≈ -sol["branch"]["1"]["qt"] rtol=1e-3 # Zero reactive power losses in regular branch
@test sol["ne_branch"]["1"]["built"] ≈ 0.0 atol=1e-1 # Unused OLTC ne_branch
@test sol["ne_branch"]["2"]["built"] ≈ 0.0 atol=1e-1 # Unused frb ne_branch
@test sol["ne_branch"]["3"]["built"] ≈ 0.0 atol=1e-1 # Unused regular ne_branch
@test sol["ne_branch"]["1"]["pf"] ≈ 0.0 atol=1e-2 # Zero active power in unused OLTC ne_branch
@test sol["ne_branch"]["1"]["qf"] ≈ 0.0 atol=1e-2 # Zero reactive power in unused OLTC ne_branch
@test sol["ne_branch"]["2"]["pf"] ≈ 0.0 atol=1e-2 # Zero active power in unused frb ne_branch
@test sol["ne_branch"]["2"]["qf"] ≈ 0.0 atol=1e-2 # Zero reactive power in unused frb ne_branch
@test sol["ne_branch"]["3"]["pf"] ≈ 0.0 atol=1e-2 # Zero active power in unused regular ne_branch
@test sol["ne_branch"]["3"]["qf"] ≈ 0.0 atol=1e-2 # Zero reactive power in unused regular ne_branch
@test sum(g["pg"] for g in values(sol["gen"])) ≈ sum(l["pflex"] for l in values(sol["load"])) rtol=1e-3 # Zero overall active power losses
@test sum(g["qg"] for g in values(sol["gen"])) ≈ sum(l["qflex"] for l in values(sol["load"])) rtol=1e-3 # Zero overall reactive power losses
data["nw"]["1"]["load"]["1"]["pd"] += 10.0 # Bus 1. Changes reactive power demand too, via `pf_angle`.
data["nw"]["1"]["load"]["12"]["pd"] += 4.0 # Bus 13. Changes reactive power demand too, via `pf_angle`.
data["nw"]["1"]["branch"]["12"]["rate_a"] = data["nw"]["1"]["branch"]["12"]["rate_b"] = data["nw"]["1"]["branch"]["12"]["rate_c"] = 0.0
result = _FP.flex_tnep(data, _FP.BFARadPowerModel, milp_optimizer)
sol = result["solution"]["nw"]["1"]
@test result["termination_status"] == OPTIMAL
@test result["objective"] ≈ 5764.48 rtol=1e-3
@test sol["ne_branch"]["1"]["built"] ≈ 1.0 atol=1e-1 # Replacement OLTC ne_branch
@test sol["ne_branch"]["2"]["built"] ≈ 1.0 atol=1e-1 # frb ne_branch added in parallel
@test sol["ne_branch"]["3"]["built"] ≈ 1.0 atol=1e-1 # Replacement regular ne_branch
@test sol["ne_branch"]["4"]["built"] ≈ 0.0 atol=1e-1 # Unused ne_branch
@test sol["ne_branch"]["1"]["pf"] ≈ -sol["ne_branch"]["1"]["pt"] rtol=1e-3 # Zero active power losses in OLTC ne_branch
@test sol["ne_branch"]["1"]["qf"] ≈ -sol["ne_branch"]["1"]["qt"] rtol=1e-3 # Zero reactive power losses in OLTC ne_branch
@test sol["ne_branch"]["2"]["pf"] ≈ -sol["ne_branch"]["2"]["pt"] rtol=1e-3 # Zero active power losses in frb ne_branch
@test sol["ne_branch"]["2"]["qf"] ≈ -sol["ne_branch"]["2"]["qt"] rtol=1e-3 # Zero reactive power losses in frb ne_branch
@test sol["ne_branch"]["3"]["pf"] ≈ -sol["ne_branch"]["3"]["pt"] rtol=1e-3 # Zero active power losses in regular ne_branch
@test sol["ne_branch"]["3"]["qf"] ≈ -sol["ne_branch"]["3"]["qt"] rtol=1e-3 # Zero reactive power losses in regular ne_branch
@test sol["branch"]["16"]["pf"] ≈ 0.0 atol=1e-2 # Zero active power in replaced OLTC branch
@test sol["branch"]["16"]["qf"] ≈ 0.0 atol=1e-2 # Zero reactive power in replaced OLTC branch
@test sol["branch"]["17"]["pf"] ≈ 0.0 atol=1e-2 # Zero active power in replaced frb branch
@test sol["branch"]["17"]["qf"] ≈ 0.0 atol=1e-2 # Zero reactive power in replaced frb branch
@test sol["branch"]["13"]["pf"] ≈ 0.0 atol=1e-2 # Zero active power in replaced regular branch
@test sol["branch"]["13"]["qf"] ≈ 0.0 atol=1e-2 # Zero reactive power in replaced regular branch
@test sol["branch"]["12"]["pf"] ≈ 0.0 atol=1e-2 # Zero active power in branch having zero rating
@test sol["branch"]["12"]["qf"] ≈ 0.0 atol=1e-2 # Zero reactive power in branch having zero rating
end
end;
@testset "Multinetwork dimensions" begin
# Some tests here intentionally throw errors, so we temporarily raise the FlexPlan
# logger level to prevent them from being displayed.
previous_FlexPlan_logger_level = Memento.getlevel(Memento.getlogger(_FP))
Memento.setlevel!(Memento.getlogger(_FP), "alert")
#=
julia> DataFrames.DataFrame(
nw = 1:24,
hour = repeat(1:4; outer=6),
scenario = repeat(1:3; inner=4, outer=2),
sub_nw = repeat(1:2; inner=12)
)
24×4 DataFrame
Row │ nw hour scenario sub_nw
│ Int64 Int64 Int64 Int64
─────┼────────────────────────────────
1 │ 1 1 1 1
2 │ 2 2 1 1
3 │ 3 3 1 1
4 │ 4 4 1 1
5 │ 5 1 2 1
6 │ 6 2 2 1
7 │ 7 3 2 1
8 │ 8 4 2 1
9 │ 9 1 3 1
10 │ 10 2 3 1
11 │ 11 3 3 1
12 │ 12 4 3 1
13 │ 13 1 1 2
14 │ 14 2 1 2
15 │ 15 3 1 2
16 │ 16 4 1 2
17 │ 17 1 2 2
18 │ 18 2 2 2
19 │ 19 3 2 2
20 │ 20 4 2 2
21 │ 21 1 3 2
22 │ 22 2 3 2
23 │ 23 3 3 2
24 │ 24 4 3 2
=#
sn_data = Dict{String,Any}(c=>Dict{String,Any}() for c in ("bus","branch","dcline","gen","load","shunt","switch","storage")) # Fake a single-network data structure
_FP.add_dimension!(sn_data, :hour, 4)
_FP.add_dimension!(sn_data, :scenario, Dict(s => Dict{String,Any}("probability"=>s/6) for s in 1:3))
_FP.add_dimension!(sn_data, :sub_nw, 2; metadata = Dict{String,Any}("description"=>"sub_nws model different physical networks"))
dt = Dict{String,Any}("dim"=>_FP.dim(sn_data), "multinetwork"=>true, "nw"=>Dict{String,Any}("1"=>sn_data)) # Fake a multinetwork data structure
pm = _PM.instantiate_model(dt, _PM.ACPPowerModel, pm->nothing)
dim = _FP.dim(pm)
dim_shift = deepcopy(dim)
_FP.shift_ids!(dim_shift, 24)
dt_shift = Dict{String,Any}("dim"=>dim_shift, "multinetwork"=>true, "nw"=>Dict{String,Any}("1"=>sn_data))
pm_shift = _PM.instantiate_model(dt_shift, _PM.ACPPowerModel, pm->nothing)
@testset "add_dimension!" begin
@test_throws ErrorException _FP.add_dimension!(sn_data, :hour, 4) # Trying to add a dimension having the same name of an existing one
@test_throws ErrorException _FP.add_dimension!(sn_data, :newdim, Dict(s => Dict{String,Any}("prop"=>"val") for s in [1,2,4])) # Trying to add a dimension having a property Dict whose keys are not consecutive Ints starting at 1
end
@testset "shift_ids!" begin
@test _FP.nw_ids(dim_shift) == collect(25:48)
@test _FP.shift_ids!(deepcopy(sn_data), 24) == collect(25:48)
@test_throws ErrorException _FP.shift_ids!(dt, 1) # Trying to shift ids of a multinetwork
end
@testset "merge_dim!" begin
dt1 = deepcopy(dt)
dt2 = deepcopy(dt)
delete!(dt2, "dim")
_FP.add_dimension!(dt2, :hour, 4)
_FP.add_dimension!(dt2, :sub_nw, 2; metadata = Dict{String,Any}("description"=>"sub_nws model different physical networks"))
_FP.add_dimension!(dt2, :scenario, Dict(s => Dict{String,Any}("probability"=>s/6) for s in 1:3))
@test_throws ErrorException _FP.merge_dim!(dt1["dim"], dt2["dim"], :sub_nw) # Dimensions are not sorted in the same way
dt1 = deepcopy(dt)
dt2 = deepcopy(dt)
_FP.dim_prop(dt2, :scenario, 1)["probability"] = 1/2
@test_throws ErrorException _FP.merge_dim!(dt1["dim"], dt2["dim"], :sub_nw) # Different property along a dimension that is not being merged
dt1 = deepcopy(dt)
dt2 = deepcopy(dt)
_FP.dim_meta(dt2, :sub_nw)["description"] = ""
@test_throws ErrorException _FP.merge_dim!(dt1["dim"], dt2["dim"], :sub_nw) # Different metadata
dt1 = deepcopy(dt)
sn_data_shift = deepcopy(sn_data)
_FP.shift_ids!(sn_data_shift, 23)
dt2 = Dict{String,Any}("dim"=>sn_data_shift["dim"], "multinetwork"=>true, "nw"=>Dict{String,Any}("1"=>sn_data))
@test_throws ErrorException _FP.merge_dim!(dt1["dim"], dt2["dim"], :sub_nw) # Ids are not contiguous
dt1 = deepcopy(dt)
sn_data_shift = deepcopy(sn_data)
_FP.shift_ids!(sn_data_shift, 25)
dt2 = Dict{String,Any}("dim"=>sn_data_shift["dim"], "multinetwork"=>true, "nw"=>Dict{String,Any}("1"=>sn_data))
@test_throws ErrorException _FP.merge_dim!(dt1["dim"], dt2["dim"], :sub_nw) # Ids are not contiguous
dt1 = deepcopy(dt)
dt2 = deepcopy(dt)
delete!(dt2, "dim")
_FP.add_dimension!(dt2, :hour, 4)
_FP.add_dimension!(dt2, :scenario, Dict(s => Dict{String,Any}("probability"=>s/6) for s in 1:3))
_FP.add_dimension!(dt2, :sub_nw, 4; metadata = Dict{String,Any}("description"=>"sub_nws model different physical networks"))
@test _FP.merge_dim!(dt1["dim"], dt_shift["dim"], :sub_nw) == dt2["dim"]
end
@testset "slice_dim" begin
slice, ids = _FP.slice_dim(dim, hour=2)
@test _FP.dim_length(slice) == 6
@test _FP.dim_length(slice, :hour) == 1
@test _FP.dim_length(slice, :scenario) == 3
@test _FP.dim_length(slice, :sub_nw) == 2
@test _FP.dim_meta(slice, :hour, "orig_id") == 2
@test ids == [2,6,10,14,18,22]
slice, ids = _FP.slice_dim(dim, hour=2, scenario=3)
@test _FP.dim_length(slice) == 2
@test _FP.dim_length(slice, :hour) == 1
@test _FP.dim_length(slice, :scenario) == 1
@test _FP.dim_length(slice, :sub_nw) == 2
@test _FP.dim_meta(slice, :hour, "orig_id") == 2
@test _FP.dim_meta(slice, :scenario, "orig_id") == 3
@test ids == [10,22]
end
@testset "nw_ids" begin
@test _FP.nw_ids(dim) == collect(1:24)
@test _FP.nw_ids(dim, hour=4) == [4,8,12,16,20,24]
@test _FP.nw_ids(dim, scenario=2) == [5,6,7,8,17,18,19,20]
@test _FP.nw_ids(dim, sub_nw=1) == [1,2,3,4,5,6,7,8,9,10,11,12]
@test _FP.nw_ids(dim, hour=4, scenario=2) == [8,20]
@test _FP.nw_ids(dim, hour=2:4) == [2,3,4,6,7,8,10,11,12,14,15,16,18,19,20,22,23,24]
@test _FP.nw_ids(dim, hour=2:4, scenario=2) == [6,7,8,18,19,20]
@test _FP.nw_ids(dim, hour=[2,4]) == [2,4,6,8,10,12,14,16,18,20,22,24]
@test _FP.nw_ids(dim, hour=[2,4], scenario=2) == [6,8,18,20]
@test _FP.nw_ids(dim_shift) == collect(25:48)
@test _FP.nw_ids(dt) == _FP.nw_ids(dim)
@test _FP.nw_ids(pm) == _FP.nw_ids(dim)
end
@testset "similar_ids" begin
@test _FP.similar_ids(dim, 7) == [7]
@test _FP.similar_ids(dim, 7, hour=4) == [8]
@test _FP.similar_ids(dim, 7, scenario=1) == [3]
@test _FP.similar_ids(dim, 7, hour=4, scenario=1) == [4]
@test _FP.similar_ids(dim, 7, hour=2:4) == [6,7,8]
@test _FP.similar_ids(dim, 7, hour=[2,4]) == [6,8]
@test _FP.similar_ids(dim, 7, scenario=1:3) == [3,7,11]
@test _FP.similar_ids(dim, 7, scenario=[1,3]) == [3,11]
@test _FP.similar_ids(dim, 7, hour=[2,4], scenario=1:3) == [2,4,6,8,10,12]
@test _FP.similar_ids(dim_shift, 31) == [31]
@test _FP.similar_ids(dt, 7) == _FP.similar_ids(dim, 7)
@test _FP.similar_ids(pm, 7) == _FP.similar_ids(dim, 7)
end
@testset "similar_id" begin
@test _FP.similar_id(dim, 7) == 7
@test _FP.similar_id(dim, 7, hour=4) == 8
@test _FP.similar_id(dim, 7, scenario=1) == 3
@test _FP.similar_id(dim, 7, hour=4, scenario=1) == 4
@test _FP.similar_id(dim_shift, 31) == 31
@test _FP.similar_id(dt, 7) == _FP.similar_id(dim, 7)
@test _FP.similar_id(pm, 7) == _FP.similar_id(dim, 7)
end
@testset "first_id" begin
@test _FP.first_id(dim, 17, :hour) == 17
@test _FP.first_id(dim, 18, :hour) == 17
@test _FP.first_id(dim, 19, :hour) == 17
@test _FP.first_id(dim, 20, :hour) == 17
@test _FP.first_id(dim, 16, :scenario) == 16
@test _FP.first_id(dim, 19, :scenario) == 15
@test _FP.first_id(dim, 22, :scenario) == 14
@test _FP.first_id(dim, 13, :hour, :scenario) == 13
@test _FP.first_id(dim, 16, :hour, :scenario) == 13
@test _FP.first_id(dim, 21, :hour, :scenario) == 13
@test _FP.first_id(dim, 24, :hour, :scenario) == 13
@test _FP.first_id(dim_shift, 41, :hour) == 41
@test _FP.first_id(dt, 17, :hour) == _FP.first_id(dim, 17, :hour)
@test _FP.first_id(pm, 17, :hour) == _FP.first_id(dim, 17, :hour)
end
@testset "last_id" begin
@test _FP.last_id(dim, 8, :hour) == 8
@test _FP.last_id(dim, 7, :hour) == 8
@test _FP.last_id(dim, 6, :hour) == 8
@test _FP.last_id(dim, 5, :hour) == 8
@test _FP.last_id(dim, 9, :scenario) == 9
@test _FP.last_id(dim, 6, :scenario) == 10
@test _FP.last_id(dim, 3, :scenario) == 11
@test _FP.last_id(dim, 12, :hour, :scenario) == 12
@test _FP.last_id(dim, 9, :hour, :scenario) == 12
@test _FP.last_id(dim, 4, :hour, :scenario) == 12
@test _FP.last_id(dim, 1, :hour, :scenario) == 12
@test _FP.last_id(dim_shift, 32, :hour) == 32
@test _FP.last_id(dt, 8, :hour) == _FP.last_id(dim, 8, :hour)
@test _FP.last_id(pm, 8, :hour) == _FP.last_id(dim, 8, :hour)
end
@testset "prev_id" begin
@test_throws BoundsError _FP.prev_id(dim, 17, :hour)
@test _FP.prev_id(dim, 18, :hour) == 17
@test _FP.prev_id(dim, 19, :hour) == 18
@test _FP.prev_id(dim, 20, :hour) == 19
@test_throws BoundsError _FP.prev_id(dim, 16, :scenario)
@test _FP.prev_id(dim, 19, :scenario) == 15
@test _FP.prev_id(dim, 22, :scenario) == 18
@test _FP.prev_id(dim_shift, 42, :hour) == 41
@test _FP.prev_id(dt, 18, :hour) == _FP.prev_id(dim, 18, :hour)
@test _FP.prev_id(pm, 18, :hour) == _FP.prev_id(dim, 18, :hour)
end
@testset "prev_ids" begin
@test _FP.prev_ids(dim, 17, :hour) == []
@test _FP.prev_ids(dim, 18, :hour) == [17]
@test _FP.prev_ids(dim, 20, :hour) == [17,18,19]
@test _FP.prev_ids(dim, 16, :scenario) == []
@test _FP.prev_ids(dim, 19, :scenario) == [15]
@test _FP.prev_ids(dim, 22, :scenario) == [14,18]
@test _FP.prev_ids(dim_shift, 42, :hour) == [41]
@test _FP.prev_ids(dt, 17, :hour) == _FP.prev_ids(dim, 17, :hour)
@test _FP.prev_ids(pm, 17, :hour) == _FP.prev_ids(dim, 17, :hour)
end
@testset "next_id" begin
@test _FP.next_id(dim, 5, :hour) == 6
@test _FP.next_id(dim, 6, :hour) == 7
@test _FP.next_id(dim, 7, :hour) == 8
@test_throws BoundsError _FP.next_id(dim, 8, :hour)
@test_throws BoundsError _FP.next_id(dim, 9, :scenario)
@test _FP.next_id(dim, 6, :scenario) == 10
@test _FP.next_id(dim, 3, :scenario) == 7
@test _FP.next_id(dim_shift, 29, :hour) == 30
@test _FP.next_id(dt, 5, :hour) == _FP.next_id(dim, 5, :hour)
@test _FP.next_id(pm, 5, :hour) == _FP.next_id(dim, 5, :hour)
end
@testset "next_ids" begin
@test _FP.next_ids(dim, 5, :hour) == [6,7,8]
@test _FP.next_ids(dim, 7, :hour) == [8]
@test _FP.next_ids(dim, 8, :hour) == []
@test _FP.next_ids(dim, 9, :scenario) == []
@test _FP.next_ids(dim, 6, :scenario) == [10]
@test _FP.next_ids(dim, 3, :scenario) == [7,11]
@test _FP.next_ids(dim_shift, 29, :hour) == [30,31,32]
@test _FP.next_ids(dt, 5, :hour) == _FP.next_ids(dim, 5, :hour)
@test _FP.next_ids(pm, 5, :hour) == _FP.next_ids(dim, 5, :hour)
end
@testset "coord" begin
@test _FP.coord(dim, 7, :hour) == 3
@test _FP.coord(dim, 7, :scenario) == 2
@test _FP.coord(dim_shift, 31, :hour) == 3
@test _FP.coord(dim_shift, 31, :scenario) == 2
@test _FP.coord(dt, 7, :hour) == _FP.coord(dim, 7, :hour)
@test _FP.coord(pm, 7, :hour) == _FP.coord(dim, 7, :hour)
end
@testset "is_first_id" begin
@test _FP.is_first_id(dim, 14, :hour) == false
@test _FP.is_first_id(dim, 14, :scenario) == true
@test _FP.is_first_id(dim, 17, :hour) == true
@test _FP.is_first_id(dim, 17, :scenario) == false
@test _FP.is_first_id(dim_shift, 38, :hour) == false
@test _FP.is_first_id(dim_shift, 38, :scenario) == true
@test _FP.is_first_id(dt, 14, :hour) == _FP.is_first_id(dim, 14, :hour)
@test _FP.is_first_id(pm, 14, :hour) == _FP.is_first_id(dim, 14, :hour)
end
@testset "is_last_id" begin
@test _FP.is_last_id(dim, 20, :hour) == true
@test _FP.is_last_id(dim, 20, :scenario) == false
@test _FP.is_last_id(dim, 21, :hour) == false
@test _FP.is_last_id(dim, 21, :scenario) == true
@test _FP.is_last_id(dim_shift, 44, :hour) == true
@test _FP.is_last_id(dim_shift, 44, :scenario) == false
@test _FP.is_last_id(dt, 20, :hour) == _FP.is_last_id(dim, 20, :hour)
@test _FP.is_last_id(pm, 20, :hour) == _FP.is_last_id(dim, 20, :hour)
end
@testset "has_dim" begin
@test _FP.has_dim(dim, :hour) == true
@test _FP.has_dim(dim, :newdim) == false
@test _FP.has_dim(dt, :hour) == _FP.has_dim(dim, :hour)
@test _FP.has_dim(pm, :hour) == _FP.has_dim(dim, :hour)
end
@testset "require_dim" begin
@test_throws ErrorException _FP.require_dim(Dict{String,Any}()) # Missing `dim` dict
@test_throws ErrorException _FP.require_dim(dt, :newdim) # Missing `newdim` dimension
end
@testset "dim_names" begin
@test _FP.dim_names(dim) == (:hour, :scenario, :sub_nw)
@test _FP.dim_names(dt) == _FP.dim_names(dim)
@test _FP.dim_names(pm) == _FP.dim_names(dim)
end
@testset "dim_prop" begin
@test Set(keys(_FP.dim_prop(dim))) == Set((:hour, :scenario, :sub_nw))
@test _FP.dim_prop(dim, :hour) == Dict(h => Dict{String,Any}() for h in 1:4)
@test _FP.dim_prop(dim, :scenario) == Dict(s => Dict{String,Any}("probability"=>s/6) for s in 1:3)
@test _FP.dim_prop(dim, :scenario, 1) == Dict{String,Any}("probability"=>1/6)
@test _FP.dim_prop(dim, :scenario, 1, "probability") == 1/6
@test _FP.dim_prop(dim, 13, :scenario) == Dict{String,Any}("probability"=>1/6)
@test _FP.dim_prop(dim, 13, :scenario, "probability") == 1/6
@test _FP.dim_prop(dt) == _FP.dim_prop(dim)
@test _FP.dim_prop(pm) == _FP.dim_prop(dim)
end
@testset "dim_meta" begin
@test Set(keys(_FP.dim_meta(dim))) == Set((:hour, :scenario, :sub_nw))
@test _FP.dim_meta(dim, :hour) == Dict{String,Any}()
@test _FP.dim_meta(dim, :sub_nw) == Dict{String,Any}("description" => "sub_nws model different physical networks")
@test _FP.dim_meta(dim, :sub_nw, "description") == "sub_nws model different physical networks"
@test _FP.dim_meta(dt) == _FP.dim_meta(dim)
@test _FP.dim_meta(pm) == _FP.dim_meta(dim)
end
@testset "dim_length" begin
@test _FP.dim_length(dim) == 24
@test _FP.dim_length(dim, :hour) == 4
@test _FP.dim_length(dim, :scenario) == 3
@test _FP.dim_length(dim_shift) == 24
@test _FP.dim_length(dt) == _FP.dim_length(dim)
@test _FP.dim_length(pm) == _FP.dim_length(dim)
end
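The id arithmetic exercised by `is_first_id`/`is_last_id` above amounts to mixed-radix decoding of the network id over the dimension sizes (4 hours × 3 scenarios × 2 sub_nws, per the `dim_length` tests). A standalone sketch, in Python purely for illustration:

```python
def decode(n, sizes):
    """Decode a 1-based network id into 1-based per-dimension coordinates.

    The first dimension varies fastest, matching the hour/scenario/sub_nw
    ordering used by the tests above.
    """
    coords = []
    n -= 1  # switch to 0-based for the arithmetic
    for size in sizes:
        coords.append(n % size + 1)
        n //= size
    return coords

sizes = [4, 3, 2]          # hours, scenarios, sub_nws
print(decode(21, sizes))   # [1, 3, 2]: first hour, last scenario (cf. is_first_id/is_last_id)
print(decode(20, sizes))   # [4, 2, 2]: last hour, scenario 2
```

Decoding id 21 gives scenario 3 (the last one) and hour 1 (not the last), matching the `is_last_id(dim, 21, ...)` assertions.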
Memento.setlevel!(Memento.getlogger(_FP), previous_FlexPlan_logger_level)
end;
# Test exported symbols
@testset "Export" begin
exported = Set(names(_FP))
@testset "JuMP" begin
@test :NO_SOLUTION ∈ exported # Sample check that `ResultStatusCode`s are exported
@test :OPTIMIZE_NOT_CALLED ∈ exported # Sample check that `TerminationStatusCode`s are exported
@test :optimizer_with_attributes ∈ exported
end
end
# Test flexible load model using a multiperiod optimization
# - Model: `_FP.BFARadPowerModel` is used to be able to test loads in both active and
# reactive power, while keeping the model linear. It requires a distribution network.
# - Problems: both `flex_tnep` and `simple_stoch_flex_tnep` are used because they have
# different flexible load models.
## Settings
file = normpath(@__DIR__,"..","test","data","case2","case2_d_flex.m") # Input case. Here 2-bus distribution network having 1 generator and 1 load, both on bus 1 (bus 2 is empty).
number_of_hours = 24 # Number of time periods
## Plot function
# Uncomment this part and the commented lines further down to display a nice plot when manually editing a testset
#=
using StatsPlots
function plot_flex_load(mn_data, result)
res_load(i,key) = [result["solution"]["nw"]["$n"]["load"]["$i"][key] for n in 1:number_of_hours]
data_load(i,key) = [mn_data["nw"]["$n"]["load"]["$i"][key] for n in 1:number_of_hours]
load_matrix = hcat(res_load.(1,["pshift_up" "pshift_down" "pred" "pcurt" "pflex"])...) # Rows: hours; columns: power categories
load_matrix[:,5] -= load_matrix[:,1] # The grey bar in the plot, needed to stack the other bars at the correct height.
plt = groupedbar(load_matrix;
yguide = "Power [p.u.]",
xguide = "Time [h]",
framestyle = :zerolines,
bar_position = :stack,
bar_width = 1,
linecolor = HSLA(0,0,1,0),
legend_position = :topleft,
label = ["pshift_up" "pshift_down" "pred" "pcurt" :none],
seriescolor = [HSLA(210,1,0.5,0.5) HSLA(0,0.75,0.75,0.5) HSLA(0,0.5,0.5,0.5) HSLA(0,0.75,0.25,0.5) HSLA(0,0,0,0.1)],
)
plot!(plt, data_load(1,"pd"); label="pd", seriestype=:stepmid, linecolor=:black, linewidth=2, linestyle=:dot)
plot!(plt, res_load(1,"pflex"); label="pflex", seriestype=:stepmid, linecolor=:black)
display(plt)
end
=#
## Test results
@testset "Load model" begin
# Case where there is a flexible load and it is activated. As demand far exceeds the
# available generation in the second half of the time horizon, demand shifting and
# voluntary reduction are exploited to their maximum extent. Involuntary curtailment
# covers the remaining excess of demand.
@testset "Flex load - active" begin
data = _FP.parse_file(file)
_FP.add_dimension!(data, :hour, number_of_hours)
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>1))
_FP.scale_data!(data; cost_scale_factor=1e-6)
loadprofile = collect(reshape(range(0,2;length=number_of_hours),:,1)) # Create a load profile: ramp from 0 to 2 times the rated value of load
time_series = _FP.make_time_series(data; loadprofile) # Compute time series by multiplying the rated value by the profile
mn_data = _FP.make_multinetwork(data, time_series)
result = _FP.flex_tnep(mn_data, _FP.BFARadPowerModel, milp_optimizer)
#plot_flex_load(mn_data, result)
@test result["solution"]["nw"][ "1"]["load"]["1"]["flex"] ≈ 1.0 atol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["eshift_up"] ≈ 6.957 rtol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["eshift_down"] ≈ 6.957 rtol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["ered"] ≈ 12.0 rtol=1e-3
for n in 1 : number_of_hours÷2
@test result["solution"]["nw"]["$n"]["load"]["1"]["pshift_down"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["pred"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["pcurt"] ≈ 0.0 atol=1e-3
end
for n in number_of_hours÷2+1 : number_of_hours
@test result["solution"]["nw"]["$n"]["load"]["1"]["pshift_up"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["pflex"] ≈ 10.0 rtol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["qflex"] ≈ 2.0 rtol=1e-3
end
@test result["objective"] ≈ 163.2 rtol=1e-3
end
# Case where there is a flexible load but it is not activated. Demand exceeds available
# generation in the second half of the time horizon; involuntary curtailment is the only
# option to decrease the demand.
@testset "Flex load - not active" begin
data = _FP.parse_file(file)
data["load"]["1"]["cost_inv"] = 1e10 # Increase the cost of flexibility-enabling equipment so that flexibility is not enabled in optimal solution
_FP.add_dimension!(data, :hour, number_of_hours)
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>1))
_FP.scale_data!(data; cost_scale_factor=1e-6)
loadprofile = collect(reshape(range(0,2;length=number_of_hours),:,1)) # Create a load profile: ramp from 0 to 2 times the rated value of load
time_series = _FP.make_time_series(data; loadprofile) # Compute time series by multiplying the rated value by the profile
mn_data = _FP.make_multinetwork(data, time_series)
result = _FP.flex_tnep(mn_data, _FP.BFARadPowerModel, milp_optimizer)
#plot_flex_load(mn_data, result)
@test result["solution"]["nw"][ "1"]["load"]["1"]["flex"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["eshift_up"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["eshift_down"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["ered"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["pcurt"] ≈ 10.0 rtol=1e-3
for n in 1 : number_of_hours
@test result["solution"]["nw"]["$n"]["load"]["1"]["pshift_up"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["pshift_down"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["pred"] ≈ 0.0 atol=1e-3
end
for n in 1 : number_of_hours÷2
@test result["solution"]["nw"]["$n"]["load"]["1"]["pcurt"] ≈ 0.0 atol=1e-3
end
for n in number_of_hours÷2+1 : number_of_hours
@test result["solution"]["nw"]["$n"]["load"]["1"]["pflex"] ≈ 10.0 rtol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["qflex"] ≈ 2.0 rtol=1e-3
end
@test result["objective"] ≈ 231.8 rtol=1e-3
end
# Case where there is a fixed load. Demand exceeds available generation in the second
# half of the time horizon; involuntary curtailment is the only option to decrease the
# demand.
@testset "Fixed load" begin
data = _FP.parse_file(file)
data["load"]["1"]["flex"] = 0 # State that the load cannot be made flexible
_FP.add_dimension!(data, :hour, number_of_hours)
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>1))
_FP.scale_data!(data; cost_scale_factor=1e-6)
loadprofile = collect(reshape(range(0,2;length=number_of_hours),:,1)) # Create a load profile: ramp from 0 to 2 times the rated value of load
time_series = _FP.make_time_series(data; loadprofile) # Compute time series by multiplying the rated value by the profile
mn_data = _FP.make_multinetwork(data, time_series)
result = _FP.flex_tnep(mn_data, _FP.BFARadPowerModel, milp_optimizer)
#plot_flex_load(mn_data, result)
@test result["solution"]["nw"][ "1"]["load"]["1"]["flex"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["eshift_up"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["eshift_down"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["ered"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["24"]["load"]["1"]["pcurt"] ≈ 10.0 rtol=1e-3
for n in 1 : number_of_hours
@test result["solution"]["nw"]["$n"]["load"]["1"]["pshift_up"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["pshift_down"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["pred"] ≈ 0.0 atol=1e-3
end
for n in 1 : number_of_hours÷2
@test result["solution"]["nw"]["$n"]["load"]["1"]["pcurt"] ≈ 0.0 atol=1e-3
end
for n in number_of_hours÷2+1 : number_of_hours
@test result["solution"]["nw"]["$n"]["load"]["1"]["pflex"] ≈ 10.0 rtol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["qflex"] ≈ 2.0 rtol=1e-3
end
@test result["objective"] ≈ 231.8 rtol=1e-3
end
# Same case as "Flex load - active", with different load model:
# - there are no integral (energy) bounds on voluntary reduction and on demand shifting;
# - demand shifting has a periodic constraint that imposes a balance between upward and
# downward shifts.
@testset "Flex load - shifting periodic balance" begin
data = _FP.parse_file(file)
_FP.add_dimension!(data, :hour, number_of_hours)
_FP.add_dimension!(data, :scenario, Dict(1 => Dict{String,Any}("probability"=>1)))
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>1))
_FP.scale_data!(data; cost_scale_factor=1e-6)
loadprofile = collect(reshape(range(0,2;length=number_of_hours),:,1)) # Create a load profile: ramp from 0 to 2 times the rated value of load
time_series = _FP.make_time_series(data; loadprofile) # Compute time series by multiplying the rated value by the profile
mn_data = _FP.make_multinetwork(data, time_series)
setting = Dict("demand_shifting_balance_period" => 9) # Not a divisor of 24, to verify that the balance constraint is also applied to the last period, which is not full length.
result = _FP.simple_stoch_flex_tnep(mn_data, _FP.BFARadPowerModel, milp_optimizer; setting)
#plot_flex_load(mn_data, result)
@test result["solution"]["nw"][ "1"]["load"]["1"]["flex"] ≈ 1.0 atol=1e-3
for n in 1 : number_of_hours÷2
@test result["solution"]["nw"]["$n"]["load"]["1"]["pshift_down"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["pred"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["pcurt"] ≈ 0.0 atol=1e-3
end
for n in number_of_hours÷2+1 : number_of_hours
@test result["solution"]["nw"]["$n"]["load"]["1"]["pshift_up"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["pflex"] ≈ 10.0 rtol=1e-3
@test result["solution"]["nw"]["$n"]["load"]["1"]["qflex"] ≈ 2.0 rtol=1e-3
end
for n in 1 : 9
@test result["solution"]["nw"]["$n"]["load"]["1"]["pshift_up"] ≈ 0.0 atol=1e-3
end
@test result["solution"]["nw"]["10"]["load"]["1"]["pshift_up"] ≈ 2.174 rtol=1e-3
@test result["solution"]["nw"]["11"]["load"]["1"]["pshift_up"] ≈ 1.304 rtol=1e-3
@test result["solution"]["nw"]["12"]["load"]["1"]["pshift_up"] ≈ 0.4348 rtol=1e-3
for n in 19 : number_of_hours
@test result["solution"]["nw"]["$n"]["load"]["1"]["pshift_up"] ≈ 0.0 atol=1e-3
end
@test result["objective"] ≈ 78.53 rtol=1e-3
end
end;
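In the last testset the `demand_shifting_balance_period` of 9 does not divide the 24-hour horizon, so the final balance window is shorter than the others. The windowing being exercised can be sketched as follows (Python for illustration; the actual constraint lives in the FlexPlan problem formulation):

```python
def balance_windows(number_of_hours, period):
    """Partition hours 1..number_of_hours into consecutive balance windows.

    Within each window, upward and downward demand shifts must balance; the
    last window may be shorter when `period` does not divide the horizon.
    """
    return [list(range(start, min(start + period, number_of_hours + 1)))
            for start in range(1, number_of_hours + 1, period)]

windows = balance_windows(24, 9)
print([len(w) for w in windows])  # [9, 9, 6] -- the last window covers only hours 19..24
```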
# Test generator models using a multiperiod optimization
# - Model: `_FP.BFARadPowerModel` is used to be able to test generators in both active and
# reactive power, while keeping the model linear. It requires a distribution network.
# - Problem: `flex_tnep`.
## Settings
file = normpath(@__DIR__,"..","test","data","case2","case2_d_gen.m") # Input case. Here 2-bus distribution network having 1 dispatchable generator, 1 non-dispatchable generator and 1 fixed load, all on bus 1 (bus 2 is empty).
number_of_hours = 5 # Number of time periods
## Plot function
# Uncomment this part and the commented lines further down to display some nice plots when manually editing a testset
#=
using StatsPlots
function plot_bus(mn_data, result)
res_gen(i,key) = [result["solution"]["nw"]["$n"]["gen"]["$i"][key] for n in 1:number_of_hours]
data_gen(i,key) = [mn_data["nw"]["$n"]["gen"]["$i"][key] for n in 1:number_of_hours]
data_load(i,key) = [mn_data["nw"]["$n"]["load"]["$i"][key] for n in 1:number_of_hours]
gen_matrix = hcat(res_gen(1,"pgcurt"), res_gen(2,"pgcurt"), res_gen(1,"pg"), res_gen(2,"pg")) # Rows: hours; columns: power categories
plt = groupedbar(gen_matrix;
title = "Bus 1",
yguide = "Power [p.u.]",
xguide = "Time [h]",
framestyle = :zerolines,
bar_position = :stack,
bar_width = 1,
linecolor = HSLA(0,0,1,0),
legend_position = :topleft,
label = ["gen1 pgcurt" "gen2 pgcurt" "gen1 pg" "gen2 pg"],
seriescolor = [HSLA(0,0.5,0.5,0.5) HSLA(0,0.75,0.25,0.5) HSLA(210,0.75,0.5,0.5) HSLA(210,1,0.25,0.5)],
)
plot!(plt, data_load(1,"pd"); label="demand", seriestype=:stepmid, linecolor=:black, linewidth=2, linestyle=:dot)
display(plt)
end
function plot_gen(mn_data, result, i)
res_gen(i,key) = [result["solution"]["nw"]["$n"]["gen"]["$i"][key] for n in 1:number_of_hours]
data_gen(i,key) = [mn_data["nw"]["$n"]["gen"]["$i"][key] for n in 1:number_of_hours]
gen_matrix = hcat(res_gen.(i,["pgcurt" "pg"])...) # Rows: hours; columns: power categories
plt = groupedbar(gen_matrix;
title = "Generator $i",
yguide = "Power [p.u.]",
xguide = "Time [h]",
framestyle = :zerolines,
bar_position = :stack,
bar_width = 1,
linecolor = HSLA(0,0,1,0),
legend_position = :topleft,
label = ["pgcurt" "pg"],
seriescolor = [HSLA(0,0.75,0.25,0.5) HSLA(210,0.75,0.5,0.5)],
)
plot!(plt, data_gen(i,"pmax"); label="pmax", seriestype=:stepmid, linecolor=:black, linewidth=2, linestyle=:dot)
plot!(plt, data_gen(i,"pmin"); label="pmin", seriestype=:stepmid, linecolor=:black, linewidth=1, linestyle=:dash)
display(plt)
end
=#
## Test results
@testset "Generator model" begin
# The power required by a fixed load linearly increases from 0 to 20 MW. Generator 2 is
# non-dispatchable and its reference power decreases from 20 MW to 0 MW, so it is
# curtailed in the first half of the time horizon and used at full power in the second
# half. Generator 1, which is dispatchable and can range from 0 to 15 MW, covers the
# rest of the demand in subsequent periods until it reaches its maximum power; after
# that, the load is curtailed.
data = _FP.parse_file(file)
_FP.add_dimension!(data, :hour, number_of_hours)
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>1))
_FP.scale_data!(data; cost_scale_factor=1e-6)
loadprofile = collect(reshape(range(0,2;length=number_of_hours),:,1)) # Create a load profile: ramp from 0 to 2 times the rated value of load
genprofile = hcat(1.5.*ones(number_of_hours), reverse(loadprofile; dims=1)) # Generator 1: constant at 1.5 times the rated value; generator 2: ramp from 2 times the rated value to 0
time_series = _FP.make_time_series(data; loadprofile, genprofile) # Compute time series by multiplying the rated value by the profile
mn_data = _FP.make_multinetwork(data, time_series)
result = _FP.flex_tnep(mn_data, _FP.BFARadPowerModel, milp_optimizer)
#plot_bus(mn_data, result)
@testset "Dispatchable generator" begin
#plot_gen(mn_data, result, 1)
@test result["solution"]["nw"]["1"]["gen"]["1"]["pg"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["2"]["gen"]["1"]["pg"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["3"]["gen"]["1"]["pg"] ≈ 0.0 atol=1e-3 # Here demand is covered by generator 2, which is non-dispatchable
@test result["solution"]["nw"]["4"]["gen"]["1"]["pg"] ≈ 10.0 rtol=1e-3
@test result["solution"]["nw"]["5"]["gen"]["1"]["pg"] ≈ 15.0 rtol=1e-3 # Must not exceed `pmax` even if the load requires more power
@test result["solution"]["nw"]["1"]["gen"]["1"]["pgcurt"] ≈ 0.0 atol=1e-3 # Dispatchable generators are not curtailable: `pgcurt` is always zero
@test result["solution"]["nw"]["2"]["gen"]["1"]["pgcurt"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["3"]["gen"]["1"]["pgcurt"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["4"]["gen"]["1"]["pgcurt"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["5"]["gen"]["1"]["pgcurt"] ≈ 0.0 atol=1e-3
end
@testset "Non-dispatchable generator" begin
#plot_gen(mn_data, result, 2)
@test result["solution"]["nw"]["1"]["gen"]["2"]["pg"] ≈ 0.0 atol=1e-3 # Curtailment is the only way to decrease generated power; here it is fully exploited
@test result["solution"]["nw"]["2"]["gen"]["2"]["pg"] ≈ 5.0 rtol=1e-3
@test result["solution"]["nw"]["3"]["gen"]["2"]["pg"] ≈ 10.0 rtol=1e-3
@test result["solution"]["nw"]["4"]["gen"]["2"]["pg"] ≈ 5.0 rtol=1e-3 # Must not exceed `pmax`; the rest of the demand is covered by generator 1
@test result["solution"]["nw"]["5"]["gen"]["2"]["pg"] ≈ 0.0 atol=1e-3 # Must not exceed `pmax` even if the load requires more power
@test result["solution"]["nw"]["1"]["gen"]["2"]["pgcurt"] ≈ 20.0 rtol=1e-3 # Curtailment is the only way to decrease generated power; here it is fully exploited
@test result["solution"]["nw"]["2"]["gen"]["2"]["pgcurt"] ≈ 10.0 rtol=1e-3
@test result["solution"]["nw"]["3"]["gen"]["2"]["pgcurt"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["4"]["gen"]["2"]["pgcurt"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["5"]["gen"]["2"]["pgcurt"] ≈ 0.0 atol=1e-3
end
end;
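The dispatch pattern asserted above follows a simple merit order: the non-dispatchable generator is used first (curtailment being its only downward lever), the dispatchable generator covers the residual up to its limit, and any remainder becomes load curtailment. A back-of-envelope check (Python for illustration; the per-hour demand and availability values are derived from the ramp profiles above):

```python
def dispatch(demand, nd_avail, disp_max):
    """Greedy merit-order dispatch for one period (all values in MW)."""
    pg_nd = min(nd_avail, demand)            # non-dispatchable output actually absorbed
    pg_curt = nd_avail - pg_nd               # curtailed non-dispatchable power
    pg_disp = min(disp_max, demand - pg_nd)  # dispatchable generator covers the residual
    load_curt = demand - pg_nd - pg_disp     # remaining demand must be curtailed
    return pg_nd, pg_curt, pg_disp, load_curt

demand = [0, 5, 10, 15, 20]     # load ramp over hours 1..5
nd_avail = [20, 15, 10, 5, 0]   # non-dispatchable availability, reversed ramp
for h in range(5):
    print(h + 1, dispatch(demand[h], nd_avail[h], disp_max=15))
```

Hour 4 yields (5, 0, 10, 0) and hour 5 yields (0, 0, 15, 5), in line with the `pg`/`pgcurt` assertions above.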
# Test IO functions provided by files in `src/io/`
@testset "Input-ouput" begin
# case6:
# - investments: AC branches, converters, DC branches, storage;
# - generators: with `pg>0`, non-dispatchable with `pcurt>0`;
case6_data = load_case6(number_of_hours=4, number_of_scenarios=1, number_of_years=1, scale_gen=13, share_data=false)
case6_result = _FP.simple_stoch_flex_tnep(case6_data, _PM.DCPPowerModel, milp_optimizer; setting=Dict("conv_losses_mp"=>false))
# ieee_33:
# - investments: AC branches, storage, flexible loads;
# - flexible loads: shift up, shift down, voluntary reduction, curtailment.
ieee_33_data = load_ieee_33(number_of_hours=4, number_of_scenarios=1, number_of_years=1, scale_load=1.52, share_data=false)
ieee_33_result = _FP.simple_stoch_flex_tnep(ieee_33_data, _FP.BFARadPowerModel, milp_optimizer)
@testset "scale_data!" begin
@testset "cost_scale_factor" begin
scale_factor = 1e-6
data = load_case6(number_of_hours=4, number_of_scenarios=1, number_of_years=1, scale_gen=13, cost_scale_factor=scale_factor)
result_scaled = _FP.simple_stoch_flex_tnep(data, _PM.DCPPowerModel, milp_optimizer; setting=Dict("conv_losses_mp"=>false))
@test result_scaled["objective"] ≈ scale_factor*case6_result["objective"] rtol=1e-5
data = load_ieee_33(number_of_hours=4, number_of_scenarios=1, number_of_years=1, scale_load=1.52, cost_scale_factor=scale_factor)
result_scaled = _FP.simple_stoch_flex_tnep(data, _FP.BFARadPowerModel, milp_optimizer)
@test result_scaled["objective"] ≈ scale_factor*ieee_33_result["objective"] rtol=1e-5
end
end
@testset "convert_mva_base!" begin
for mva_base_ratio in [0.01, 100]
data = deepcopy(case6_data)
mva_base = data["nw"]["1"]["baseMVA"] * mva_base_ratio
_FP.convert_mva_base!(data, mva_base)
result = _FP.simple_stoch_flex_tnep(data, _PM.DCPPowerModel, milp_optimizer; setting=Dict("conv_losses_mp"=>false))
@test result["objective"] ≈ case6_result["objective"] rtol=1e-5
data = deepcopy(ieee_33_data)
mva_base = data["nw"]["1"]["baseMVA"] * mva_base_ratio
_FP.convert_mva_base!(data, mva_base)
result = _FP.simple_stoch_flex_tnep(data, _FP.BFARadPowerModel, milp_optimizer)
@test result["objective"] ≈ ieee_33_result["objective"] rtol=1e-5
end
end
end;
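`convert_mva_base!` must leave the physical problem unchanged, which is why the objective is invariant above. For power quantities the rebasing rule is simply per-unit value × old base / new base; a minimal sketch (Python, with hypothetical values):

```python
def rebase_power(value_pu, base_old_mva, base_new_mva):
    """Re-express a per-unit power on a new MVA base (same physical MW)."""
    return value_pu * base_old_mva / base_new_mva

p = rebase_power(0.5, 100.0, 10_000.0)
print(p)  # 0.005 -- still 50 MW, just expressed on a 100x larger base
```

Impedance-like quantities rescale with the inverse ratio, which is why a full base conversion has to treat each network field appropriately.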
# Test problem functions
# The purpose of the tests contained in this file is to detect whether anything has
# accidentally changed in the problem functions. Accordingly, only the termination
# status and the objective value are tested.
# To test specific features, it is better to write ad-hoc tests in other files.
@testset "Problem" begin
t_data = load_case6(;
number_of_hours = 4,
number_of_scenarios = 2,
number_of_years = 3,
cost_scale_factor = 1e-6
)
t_data_1scenario = _FP.slice_multinetwork(t_data; scenario=1)
setting = Dict("conv_losses_mp" => false)
d_data = load_cigre_mv_eu(; # TODO: use a test case with multiple years and scenarios when available.
flex_load = true,
ne_storage = true,
scale_load = 6.0,
number_of_hours = 4,
cost_scale_factor = 1e-6
)
@testset "TNEP without flex loads" begin
@testset "Transmission" begin
result = _FP.strg_tnep(t_data_1scenario, _PM.DCPPowerModel, milp_optimizer; setting)
@test result["termination_status"] == OPTIMAL
@test result["objective"] ≈ 7504.2 rtol=1e-3
end
@testset "Distribution" begin
result = _FP.strg_tnep(d_data, _FP.BFARadPowerModel, milp_optimizer)
@test result["termination_status"] == OPTIMAL
@test result["objective"] ≈ 355.0 rtol=1e-3
end
end
@testset "TNEP" begin
@testset "Transmission" begin
result = _FP.flex_tnep(t_data_1scenario, _PM.DCPPowerModel, milp_optimizer; setting)
@test result["termination_status"] == OPTIMAL
@test result["objective"] ≈ 7503.0 rtol=1e-3
end
@testset "Distribution" begin
result = _FP.flex_tnep(d_data, _FP.BFARadPowerModel, milp_optimizer)
@test result["termination_status"] == OPTIMAL
@test result["objective"] ≈ 354.6 rtol=1e-3
end
end
@testset "Stochastic TNEP" begin
@testset "Transmission" begin
result = _FP.stoch_flex_tnep(t_data, _PM.DCPPowerModel, milp_optimizer; setting)
@test result["termination_status"] == OPTIMAL
@test result["objective"] ≈ 7715.4 rtol=1e-3
end
@testset "Distribution" begin
result = _FP.stoch_flex_tnep(d_data, _FP.BFARadPowerModel, milp_optimizer)
@test result["termination_status"] == OPTIMAL
@test result["objective"] ≈ 354.6 rtol=1e-3
end
end
@testset "Simplified stochastic TNEP" begin
@testset "Transmission" begin
result = _FP.simple_stoch_flex_tnep(t_data, _PM.DCPPowerModel, milp_optimizer; setting)
@test result["termination_status"] == OPTIMAL
@test result["objective"] ≈ 7630.4 rtol=1e-3
end
@testset "Distribution" begin
result = _FP.simple_stoch_flex_tnep(d_data, _FP.BFARadPowerModel, milp_optimizer)
@test result["termination_status"] == OPTIMAL
@test result["objective"] ≈ 354.6 rtol=1e-3
end
end
end;
import FlexPlan as _FP
import PowerModelsACDC as _PMACDC
import PowerModels as _PM
import InfrastructureModels as _IM
using JuMP
using Memento
include(normpath(@__DIR__,"..","test","io","create_profile.jl"))
include(normpath(@__DIR__,"..","test","io","multiple_years.jl"))
include(normpath(@__DIR__,"..","test","io","load_case.jl"))
# Suppress warnings during testing.
Memento.setlevel!(Memento.getlogger(_IM), "error")
Memento.setlevel!(Memento.getlogger(_PMACDC), "error")
Memento.setlevel!(Memento.getlogger(_PM), "error")
using Test
import HiGHS
milp_optimizer = _FP.optimizer_with_attributes(HiGHS.Optimizer, "output_flag"=>false)
@testset "FlexPlan" begin
# FlexPlan components
include("dimensions.jl")
include("io.jl")
# Models
include("bfarad.jl")
# Network components
include("gen.jl")
include("flex_demand.jl")
include("storage.jl")
# Problems
include("prob.jl")
# Decompositions
include("td_decoupling.jl")
# Exported symbols
include("export.jl")
end;
# Test storage model using a multiperiod optimization
# - Model: `_FP.BFARadPowerModel` is used to be able to test storage in both active and
# reactive power, while keeping the model linear. It requires a distribution network.
# - Problem: `flex_tnep` is the simplest problem that implements both storage and flexible
# loads.
## Settings
file = normpath(@__DIR__,"..","test","data","case2","case2_d_strg.m") # Input case. Here 2-bus distribution network having 1 generator, 1 load, 1 storage device, and 1 candidate storage device, all on bus 1 (bus 2 is empty).
number_of_hours = 4 # Number of time periods
## Plot function
# Uncomment this part and the commented lines further down to display a nice plot when manually editing a testset
#=
using StatsPlots
function plot_storage(mn_data, result; candidate::Bool, id::Int=1)
id = string(id)
storage_type = candidate ? "ne_storage" : "storage"
res_storage(key) = [result["solution"]["nw"]["$n"][storage_type][id][key] for n in 1:number_of_hours]
data_storage(key) = [mn_data["nw"]["$n"][storage_type][id][key] for n in 1:number_of_hours]
repeatfirst(x) = vcat(x[1:1,:], x)
p = plot(0:number_of_hours, repeatfirst(res_storage(candidate ? "ps_ne" : "ps"));
yguide = "Power [p.u.]",
xformatter = _ -> "",
legend = :none,
framestyle = :zerolines,
seriestype = :steppre,
fillrange = 0,
linewidth = 2,
seriescolor = HSLA(203,1,0.49,0.5),
)
plot!(p, 0:number_of_hours, hcat(repeatfirst(data_storage("charge_rating")), -repeatfirst(data_storage("discharge_rating")));
seriestype=:steppre, linewidth=2, linestyle=:dot, seriescolor=HSLA(203,1,0.49,1)
)
e = plot(0:number_of_hours, vcat(mn_data["nw"]["1"][storage_type][id]["energy"]*get(result["solution"]["nw"]["1"][storage_type][id],"isbuilt",1), res_storage(candidate ? "se_ne" : "se"));
yguide = "Energy [p.u.]",
xguide = "Time [h]",
legend = :none,
framestyle = :zerolines,
fillrange = 0,
linewidth = 2,
seriescolor = HSLA(15,0.73,0.58,0.5),
)
plot!(e, 0:number_of_hours, repeatfirst(data_storage("energy_rating"));
seriestype=:steppre, linewidth=2, linestyle=:dot, seriescolor=HSLA(15,0.73,0.58,1)
)
plt = plot(p, e; layout = (2,1), plot_title = (candidate ? "Candidate storage" : "Storage") * " $id" )
display(plt)
end
=#
## Test results
@testset "Storage model" begin
@testset "Common features" begin
# Case with a storage device and a candidate storage device. As demand far exceeds
# the available generation in the second half of the time horizon, both storage
# devices are charged in the first half of the time horizon and discharged in the
# second half. Involuntary curtailment of the load covers the remaining excess of
# demand.
data = _FP.parse_file(file)
_FP.add_dimension!(data, :hour, number_of_hours)
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>1))
_FP.scale_data!(data; cost_scale_factor=1e-6)
loadprofile = collect(reshape(range(0,2;length=number_of_hours),:,1)) # Create a load profile: ramp from 0 to 2 times the rated value of load
time_series = _FP.make_time_series(data; loadprofile) # Compute time series by multiplying the rated value by the profile
mn_data = _FP.make_multinetwork(data, time_series)
result = _FP.flex_tnep(mn_data, _FP.BFARadPowerModel, milp_optimizer)
@testset "Existing storage" begin
#plot_storage(mn_data, result; candidate=false)
@test result["solution"]["nw"]["1"]["storage"]["1"]["sc"] ≈ 1.0 rtol=1e-3
@test result["solution"]["nw"]["2"]["storage"]["1"]["sc"] ≈ 1.0 rtol=1e-3
@test result["solution"]["nw"]["3"]["storage"]["1"]["sc"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["4"]["storage"]["1"]["sc"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["1"]["storage"]["1"]["sd"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["2"]["storage"]["1"]["sd"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["3"]["storage"]["1"]["sd"] ≈ 1.0 rtol=1e-3
@test result["solution"]["nw"]["4"]["storage"]["1"]["sd"] ≈ 0.6098 rtol=1e-3 # Less than 1.0 because of "charge_efficiency", "discharge_efficiency" and "self_discharge_rate"
@test result["solution"]["nw"]["1"]["storage"]["1"]["ps"] ≈ 1.0 rtol=1e-3 # Storage model uses load convention: positive power when absorbed from the grid
@test result["solution"]["nw"]["2"]["storage"]["1"]["ps"] ≈ 1.0 rtol=1e-3
@test result["solution"]["nw"]["3"]["storage"]["1"]["ps"] ≈ -1.0 rtol=1e-3 # Storage model uses load convention: negative power when injected into the grid
@test result["solution"]["nw"]["4"]["storage"]["1"]["ps"] ≈ -0.6098 rtol=1e-3
@test result["solution"]["nw"]["1"]["storage"]["1"]["se"] ≈ 2.898 rtol=1e-3 # Greater than 2.0 because refers to the end of period
@test result["solution"]["nw"]["2"]["storage"]["1"]["se"] ≈ 3.795 rtol=1e-3
@test result["solution"]["nw"]["3"]["storage"]["1"]["se"] ≈ 2.680 rtol=1e-3
@test result["solution"]["nw"]["4"]["storage"]["1"]["se"] ≈ 2.0 rtol=1e-3 # Must match "energy" parameter in data model
@test haskey(result["solution"]["nw"]["4"]["storage"]["1"], "e_abs") == false # Absorbed energy is not computed if it is not bounded
end
@testset "Candidate storage" begin
#plot_storage(mn_data, result; candidate=true)
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["investment"] ≈ 1.0 atol=1e-3 # Invested in candidate storage device because it costs less than load curtailment
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["isbuilt"] ≈ 1.0 atol=1e-3 # Candidate storage device is built accordingly to investment decision
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["sc_ne"] ≈ 1.0 rtol=1e-3
@test result["solution"]["nw"]["2"]["ne_storage"]["1"]["sc_ne"] ≈ 1.0 rtol=1e-3
@test result["solution"]["nw"]["3"]["ne_storage"]["1"]["sc_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["4"]["ne_storage"]["1"]["sc_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["sd_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["2"]["ne_storage"]["1"]["sd_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["3"]["ne_storage"]["1"]["sd_ne"] ≈ 1.0 rtol=1e-3
@test result["solution"]["nw"]["4"]["ne_storage"]["1"]["sd_ne"] ≈ 0.6098 rtol=1e-3 # Less than 1.0 because of "charge_efficiency", "discharge_efficiency" and "self_discharge_rate"
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["ps_ne"] ≈ 1.0 rtol=1e-3 # Storage model uses load convention: positive power when absorbed from the grid
@test result["solution"]["nw"]["2"]["ne_storage"]["1"]["ps_ne"] ≈ 1.0 rtol=1e-3
@test result["solution"]["nw"]["3"]["ne_storage"]["1"]["ps_ne"] ≈ -1.0 rtol=1e-3 # Storage model uses load convention: negative power when injected into the grid
@test result["solution"]["nw"]["4"]["ne_storage"]["1"]["ps_ne"] ≈ -0.6098 rtol=1e-3
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["se_ne"] ≈ 2.898 rtol=1e-3 # Greater than 2.0 because refers to the end of period
@test result["solution"]["nw"]["2"]["ne_storage"]["1"]["se_ne"] ≈ 3.795 rtol=1e-3
@test result["solution"]["nw"]["3"]["ne_storage"]["1"]["se_ne"] ≈ 2.680 rtol=1e-3
@test result["solution"]["nw"]["4"]["ne_storage"]["1"]["se_ne"] ≈ 2.0 rtol=1e-3 # Must match "energy" parameter in data model
@test haskey(result["solution"]["nw"]["4"]["ne_storage"]["1"], "e_abs") == false # Absorbed energy is not computed if it is not bounded
end
end
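The `se` trajectory checked above follows the standard one-period storage energy balance, se_next = (1 − self_discharge)^Δt · se + Δt · (η_c · sc − sd / η_d). A standalone sketch (Python for illustration; the efficiency values here are hypothetical placeholders — the actual ones come from case2_d_strg.m, which is why the asserted numbers differ slightly):

```python
def energy_update(se, ps, dt=1.0, eta_c=0.9, eta_d=0.9, self_discharge=0.0):
    """One-period storage energy balance, load convention (ps > 0 charges).

    Efficiency values are illustrative placeholders, not the case file's.
    """
    sc = max(ps, 0.0)    # charging power drawn from the grid
    sd = max(-ps, 0.0)   # discharging power injected into the grid
    return (1 - self_discharge) ** dt * se + dt * (eta_c * sc - sd / eta_d)

se = 2.0                      # initial stored energy, p.u.
for ps in [1.0, 1.0, -1.0]:   # charge, charge, discharge (as in the first three hours)
    se = energy_update(se, ps)
print(se)  # 2.0 + 0.9 + 0.9 - 1/0.9, roughly 2.689
```

Charging adds less than the drawn power and discharging removes more than the injected power, which is why `sd` at hour 4 is below 1.0 in the assertions above.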
@testset "Bounded absorption" begin
# Same as base case, but the two storage devices have bounded absorption.
data = _FP.parse_file(file)
data["storage"]["1"]["max_energy_absorption"] = 1.0 # Limit the maximum energy absorption of existing storage device
data["ne_storage"]["1"]["max_energy_absorption"] = 1.0 # Limit the maximum energy absorption of candidate storage device
_FP.add_dimension!(data, :hour, number_of_hours)
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>1))
_FP.scale_data!(data; cost_scale_factor=1e-6)
loadprofile = collect(reshape(range(0,2;length=number_of_hours),:,1)) # Create a load profile: ramp from 0 to 2 times the rated value of load
time_series = _FP.make_time_series(data; loadprofile) # Compute time series by multiplying the rated value by the profile
mn_data = _FP.make_multinetwork(data, time_series)
result = _FP.flex_tnep(mn_data, _FP.BFARadPowerModel, milp_optimizer)
@testset "Existing storage" begin
#plot_storage(mn_data, result; candidate=false)
@test result["solution"]["nw"]["1"]["storage"]["1"]["ps"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["2"]["storage"]["1"]["ps"] ≈ 1.0 rtol=1e-3
@test result["solution"]["nw"]["3"]["storage"]["1"]["ps"] ≈ -0.8020 rtol=1e-3
@test result["solution"]["nw"]["4"]["storage"]["1"]["ps"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["4"]["storage"]["1"]["e_abs"] ≈ 1.0 rtol=1e-3 # Must match "max_energy_absorption" parameter
end
@testset "Candidate storage" begin
#plot_storage(mn_data, result; candidate=true)
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["ps_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["2"]["ne_storage"]["1"]["ps_ne"] ≈ 1.0 rtol=1e-3
@test result["solution"]["nw"]["3"]["ne_storage"]["1"]["ps_ne"] ≈ -0.8020 rtol=1e-3
@test result["solution"]["nw"]["4"]["ne_storage"]["1"]["ps_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["4"]["ne_storage"]["1"]["e_abs_ne"] ≈ 1.0 rtol=1e-3 # Must match "max_energy_absorption" parameter
end
end
@testset "Candidate storage only" begin
# Case with a storage device and a candidate storage device. The high demand in
# period 4 requires using existing storage at full power and some load curtailment.
# The candidate storage device is not built even though it would avoid load
# curtailment because its construction costs more than load curtailment.
data = _FP.parse_file(file)
_FP.add_dimension!(data, :hour, number_of_hours)
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>1))
_FP.scale_data!(data; cost_scale_factor=1e-6)
loadprofile = collect(reshape(range(0,1.1005;length=number_of_hours),:,1)) # Create a load profile: ramp from 0 to 1.1005 times the rated value of load
time_series = _FP.make_time_series(data; loadprofile) # Compute time series by multiplying the rated value by the profile
mn_data = _FP.make_multinetwork(data, time_series)
result = _FP.flex_tnep(mn_data, _FP.BFARadPowerModel, milp_optimizer)
@testset "Not built if not needed" begin
#plot_storage(mn_data, result; candidate=true)
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["investment"] ≈ 0.0 atol=1e-3 # Not invested in candidate storage device because it costs more than the small amount of load curtailment needed to satisfy all power bounds in the last period
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["isbuilt"] ≈ 0.0 atol=1e-3 # Candidate storage device is not built, in accordance with the investment decision
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["sc_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["2"]["ne_storage"]["1"]["sc_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["3"]["ne_storage"]["1"]["sc_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["4"]["ne_storage"]["1"]["sc_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["sd_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["2"]["ne_storage"]["1"]["sd_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["3"]["ne_storage"]["1"]["sd_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["4"]["ne_storage"]["1"]["sd_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["ps_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["2"]["ne_storage"]["1"]["ps_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["3"]["ne_storage"]["1"]["ps_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["4"]["ne_storage"]["1"]["ps_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["1"]["ne_storage"]["1"]["se_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["2"]["ne_storage"]["1"]["se_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["3"]["ne_storage"]["1"]["se_ne"] ≈ 0.0 atol=1e-3
@test result["solution"]["nw"]["4"]["ne_storage"]["1"]["se_ne"] ≈ 0.0 atol=1e-3 # Even if "energy" parameter in data model is positive, this must be zero because the candidate storage device is not built
end
end
end;
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 3606 | # Test decoupling of transmission and distribution networks
@testset "T&D decoupling" begin
number_of_hours = 4
number_of_distribution_networks = 2
cost_scale_factor = 1e-6
t_ref_extensions = [_FP.ref_add_gen!, _FP.ref_add_storage!, _FP.ref_add_ne_storage!, _FP.ref_add_flex_load!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, _PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!]
d_ref_extensions = [_FP.ref_add_gen!, _FP.ref_add_storage!, _FP.ref_add_ne_storage!, _FP.ref_add_flex_load!, _PM.ref_add_on_off_va_bounds!, _FP.ref_add_ne_branch_allbranches!, _FP.ref_add_frb_branch!, _FP.ref_add_oltc_branch!]
t_solution_processors = [_PM.sol_data_model!]
d_solution_processors = [_PM.sol_data_model!, _FP.sol_td_coupling!]
t_setting = Dict("conv_losses_mp" => false)
d_setting = Dict{String,Any}()
t_data = load_case6(; number_of_hours, number_of_scenarios=1, number_of_years=1, cost_scale_factor, share_data=false)
d_data_sub = load_cigre_mv_eu(; flex_load=true, ne_storage=true, scale_gen=5.0, scale_wind=6.0, scale_load=1.0, number_of_hours, cost_scale_factor)
d_data = Vector{Dict{String,Any}}(undef, number_of_distribution_networks)
for s in 1:number_of_distribution_networks
d_data[s] = deepcopy(d_data_sub)
d_data[s]["t_bus"] = mod1(s, length(first(values(t_data["nw"]))["bus"])) # Attach distribution network to a transmission network bus
end
@testset "calc_surrogate_model" begin
data = deepcopy(d_data[1])
d_gen_id = _FP._get_reference_gen(data)
_FP.add_dimension!(data, :sub_nw, Dict(1 => Dict{String,Any}("d_gen"=>d_gen_id)))
sol_up, sol_base, sol_down = _FP.TDDecoupling.probe_distribution_flexibility!(data;
model_type = _FP.BFARadPowerModel,
optimizer = milp_optimizer,
build_method = _FP.build_simple_stoch_flex_tnep,
ref_extensions = d_ref_extensions,
solution_processors = d_solution_processors
)
surrogate_distribution = _FP.TDDecoupling.calc_surrogate_model(d_data[1], sol_up, sol_base, sol_down)
surr_nw_1 = surrogate_distribution["nw"]["1"]
@test length(surr_nw_1["gen"]) == 1
@test length(surr_nw_1["load"]) == 1
@test length(surr_nw_1["storage"]) == 1
for (n,nw) in surrogate_distribution["nw"]
load = nw["load"]["1"]
@test load["pd"] ≥ 0.0
@test load["pshift_up_rel_max"] ≥ 0.0
@test load["pshift_down_rel_max"] ≥ 0.0
@test load["pred_rel_max"] ≥ 0.0
storage = nw["storage"]["1"]
@test storage["charge_rating"] ≥ 0.0
@test storage["discharge_rating"] ≥ 0.0
@test storage["stationary_energy_inflow"] ≥ 0.0
@test storage["stationary_energy_outflow"] ≥ 0.0
@test storage["thermal_rating"] ≥ 0.0
gen = nw["gen"]["1"]
@test gen["pmax"] ≥ 0.0
end
end
@testset "run_td_decoupling" begin
result = _FP.run_td_decoupling(
t_data, d_data, _PM.DCPPowerModel, _FP.BFARadPowerModel, milp_optimizer, milp_optimizer, _FP.build_simple_stoch_flex_tnep;
t_ref_extensions, d_ref_extensions, t_solution_processors, d_solution_processors, t_setting, d_setting
)
@test result["objective"] ≈ 2445.7 rtol=1e-3
@test length(result["d_solution"]) == number_of_distribution_networks
@test length(result["d_objective"]) == number_of_distribution_networks
end
end;
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 1768 | # Utilities to compare decomposition solutions with benchmark solutions
function check_solution_correctness(benders_result, benchmark_result, obj_rtol, logger)
benders_opt = benders_result["objective"]
benchmark_opt = benchmark_result["objective"]
if !isapprox(benders_opt, benchmark_opt; rtol=obj_rtol)
warn(logger, @sprintf("Benders procedure failed to find an optimal solution within tolerance %.2e", obj_rtol))
warn(logger, @sprintf(" (benders % 15.9g, benchmark % 15.9g, rtol %.2e)", benders_opt, benchmark_opt, benders_opt/benchmark_opt-1))
end
comp_name = Dict{String,String}(
"ne_branch" => "AC branch",
"branchdc_ne" => "DC branch",
"convdc_ne" => "converter",
"ne_storage" => "storage",
"load" => "flex load"
)
benders_sol = Dict(year => benders_result["solution"]["nw"]["$n"] for (year, n) in enumerate(_FP.nw_ids(data; hour=1, scenario=1)))
benchmark_sol = Dict(year => benchmark_result["solution"]["nw"]["$n"] for (year, n) in enumerate(_FP.nw_ids(data; hour=1, scenario=1)))
for y in keys(benchmark_sol)
for (comp, name) in comp_name
if haskey(benchmark_sol[y], comp)
for idx in keys(benchmark_sol[y][comp])
benchmark_value = benchmark_sol[y][comp][idx]["investment"]
benders_value = benders_sol[y][comp][idx]["investment"]
if !isapprox(benders_value, benchmark_value, atol=1e-1)
warn(logger, "In year $y, the investment decision for $name $idx does not match (Benders $(round(Int,benders_value)), benchmark $(round(Int,benchmark_value)))")
end
end
end
end
end
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 1113 | # Functions to interact with CPLEX.Optimizer
import CPLEX
import JuMP
# To be used instead of `CPLEX.Optimizer()` to set up a log file for the CPLEX optimizer
function CPLEX_optimizer_with_logger(log_file::String)
function CPLEX_opt_w_log() # Like CPLEX.Optimizer, but dumps to the specified log file
model = CPLEX.Optimizer()
CPLEX.CPXsetlogfilename(model.env, log_file, "w+")
return model
end
return CPLEX_opt_w_log # Return the closure explicitly (it was already the implicit return value)
end
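# Usage sketch (the file name and attribute below are illustrative, not taken from the
# test suite): the returned closure is passed wherever an optimizer constructor is
# expected, e.g.
# milp_optimizer = _FP.optimizer_with_attributes(
# CPLEX_optimizer_with_logger("cplex.log"),
# "CPXPARAM_ScreenOutput" => 0,
# )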
function get_cplex_optimizer(pm::_PM.AbstractPowerModel)
m1 = pm.model # JuMP.model
m2 = JuMP.backend(m1) # MathOptInterface.Utilities.CachingOptimizer{...}
m3 = m2.optimizer # MathOptInterface.Bridges.LazyBridgeOptimizer{CPLEX.Optimizer}
m4 = m3.model # CPLEX.Optimizer
return m4
end
function get_num_subproblems(annotation_file::String)
subproblems = 0
for line in eachline(annotation_file)
m = match(r"<anno .*value='(?<value>\d+)'.*/>", line)
if !isnothing(m)
val = parse(Int, m[:value])
if val > subproblems
subproblems = val
end
end
end
return subproblems
end
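# Illustrative self-check (the <anno .../> line format is inferred from the regex above,
# not from CPLEX documentation):
# path, io = mktemp()
# write(io, "<anno index='1' name='c1' value='2'/>\n<anno index='2' name='c2' value='5'/>\n")
# close(io)
# @assert get_num_subproblems(path) == 5 # the largest `value` attribute found wins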
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 13954 | # Functions to make performance tests of decomposition implementations
using CSV
using DataFrames
using Dates
import JuMP
function initialize_tasks(params::Dict)
DataFrame([name=>type[] for (name,type) in [params[:case]; params[:optimization]]])
end
function add_tasks!(tasks::DataFrame; kwargs...)
names = keys(kwargs)
mismatched_names = symdiff(Set(propertynames(tasks)),Set(names))
if !isempty(mismatched_names)
Memento.error(_LOGGER, "The parameters of the tasks to be added do not match the defined parameters. Check \"" * join(string.(mismatched_names), "\", \"", "\" and \"") * "\".")
end
vals = [v isa Vector ? v : [v] for v in values(kwargs)]
for job_values in Iterators.product(vals...)
push!(tasks, Dict(Pair.(names, job_values)))
end
return unique!(tasks)
end
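# Usage sketch (parameter names and values are illustrative): vector-valued keyword
# arguments are expanded as a Cartesian product, one task per combination, and
# duplicate rows are dropped by `unique!`.
# add_tasks!(tasks; test_case="case6", number_of_hours=[4,8], algorithm=["benchmark","cplex_auto"])
# # appends 4 tasks: (4,"benchmark"), (4,"cplex_auto"), (8,"benchmark"), (8,"cplex_auto")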
function load_case(case, case_settings)
data, model_type, ref_extensions, solution_processors, setting = eval(Symbol("load_$(case[:test_case])_defaultparams"))(; number_of_hours=case[:number_of_hours], number_of_scenarios=case[:number_of_scenarios], number_of_years=case[:number_of_years], case_settings...)
return Dict(:data=>data, :model_type=>model_type, :ref_extensions=>ref_extensions, :solution_processors=>solution_processors, :setting=>setting)
end
function run_and_time(
data::Dict{String,<:Any},
model_type::Type,
optimizer::Union{JuMP.MOI.AbstractOptimizer, JuMP.MOI.OptimizerWithAttributes},
build_method::Function;
kwargs...
)
time_start = time()
result = build_method(data, model_type, optimizer; kwargs...)
@assert result["termination_status"] ∈ (_FP.OPTIMAL, _FP.LOCALLY_SOLVED) "$(result["optimizer"]) termination status: $(result["termination_status"])"
result["time"] = Dict{String,Any}("total" => time()-time_start)
return result
end
function optimize_case(case_data, task, settings)
opt_s = settings[:optimization]
if task[:algorithm] ∈ ("manual_classical", "manual_modern")
optimizer_MILP = _FP.optimizer_with_attributes(CPLEX_optimizer_with_logger(normpath(opt_s[:out_dir],"milp.log")), # Options: <https://www.ibm.com/docs/en/icos/latest?topic=cplex-list-parameters>
# range default link
"CPXPARAM_Preprocessing_RepeatPresolve" => get(task,:preprocessing_repeatpresolve,-1), # {-1,..., 3} -1 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-repeat-presolve-switch>
"CPXPARAM_MIP_Strategy_Search" => get(task,:mip_strategy_search,0), # { 0,..., 2} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-dynamic-search-switch>
"CPXPARAM_Emphasis_MIP" => get(task,:emphasis_mip,0), # { 0,..., 5} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-emphasis-switch>
"CPXPARAM_MIP_Strategy_NodeSelect" => get(task,:mip_strategy_nodeselect,1), # { 0,..., 3} 1 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-selection-strategy>
"CPXPARAM_MIP_Strategy_VariableSelect" => get(task,:mip_strategy_variableselect,0), # {-1,..., 4} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-variable-selection-strategy>
"CPXPARAM_MIP_Strategy_BBInterval" => get(task,:mip_strategy_bbinterval,7), # { 0, 1,...} 7 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-strategy-best-bound-interval>
"CPXPARAM_MIP_Strategy_Branch" => get(task,:mip_strategy_branch,0), # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-branching-direction>
"CPXPARAM_MIP_Strategy_Probe" => get(task,:mip_strategy_probe,0), # {-1,..., 3} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-probing-level>
"CPXPARAM_MIP_Tolerances_MIPGap" => opt_s[:obj_rtol], # [ 0, 1] 1e-4 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-relative-mip-gap-tolerance>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_MIP_Display" => 2, # { 0,..., 5} 2 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-log-display-information>
"CPXPARAM_Output_CloneLog" => -1, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-clone-log-in-parallel-optimization>
)
optimizer_LP = _FP.optimizer_with_attributes(CPLEX.Optimizer, # Log file would be interleaved in case of multiple secondary problems. To enable logging, substitute `CPLEX.Optimizer` with: `CPLEX_optimizer_with_logger(<path_to_log_file>)`
# range default link
"CPXPARAM_Read_Scale" => 0, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-scale-parameter>
"CPXPARAM_LPMethod" => 2, # { 0,..., 6} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-algorithm-continuous-linear-problems>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_MIP_Display" => 2, # { 0,..., 5} 2 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-log-display-information>
)
if task[:algorithm] == "manual_classical"
algo = _FP.Benders.Classical(; obj_rtol=opt_s[:obj_rtol], max_iter=opt_s[:max_iter], tightening_rtol=opt_s[:tightening_rtol], silent=opt_s[:silent])
else
algo = _FP.Benders.Modern(; max_iter=opt_s[:max_iter], tightening_rtol=opt_s[:tightening_rtol], silent=opt_s[:silent])
end
result = _FP.run_benders_decomposition(
algo,
case_data[:data], case_data[:model_type],
optimizer_MILP, optimizer_LP,
_FP.build_simple_stoch_flex_tnep_benders_main,
_FP.build_simple_stoch_flex_tnep_benders_secondary;
ref_extensions=case_data[:ref_extensions], solution_processors=case_data[:solution_processors], setting=case_data[:setting]
)
make_benders_plots(case_data[:data], result, opt_s[:out_dir]; display_plots=false)
elseif task[:algorithm] == "cplex_auto"
optimizer_cplex = _FP.optimizer_with_attributes(CPLEX_optimizer_with_logger(normpath(opt_s[:out_dir],"cplex.log")), # Options: <https://www.ibm.com/docs/en/icos/latest?topic=cplex-list-parameters>
# range default link
"CPXPARAM_Read_Scale" => 0, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-scale-parameter>
"CPXPARAM_MIP_Tolerances_MIPGap" => opt_s[:obj_rtol], # [ 0, 1] 1e-4 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-relative-mip-gap-tolerance>
"CPXPARAM_Benders_Strategy" => 3, # {-1,..., 3} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-benders-strategy>
"CPXPARAM_Benders_WorkerAlgorithm" => 2, # { 0,..., 5} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-benders-worker-algorithm>
"CPXPARAM_Benders_Tolerances_OptimalityCut" => 1e-6, # [1e-9,1e-1] 1e-6 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-benders-optimality-cut-tolerance>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_MIP_Display" => 2, # { 0,..., 5} 2 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-log-display-information>
"CPXPARAM_Output_CloneLog" => -1, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-clone-log-in-parallel-optimization>
)
result = run_and_time(
case_data[:data], case_data[:model_type],
optimizer_cplex,
_FP.simple_stoch_flex_tnep;
ref_extensions=case_data[:ref_extensions], solution_processors=case_data[:solution_processors], setting=case_data[:setting]
)
elseif task[:algorithm] == "benchmark"
optimizer_benchmark = _FP.optimizer_with_attributes(CPLEX_optimizer_with_logger(normpath(opt_s[:out_dir],"benchmark.log")), # Options: <https://www.ibm.com/docs/en/icos/latest?topic=cplex-list-parameters>
# range default link
"CPXPARAM_Read_Scale" => 0, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-scale-parameter>
"CPXPARAM_MIP_Tolerances_MIPGap" => opt_s[:obj_rtol], # [ 0, 1] 1e-4 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-relative-mip-gap-tolerance>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_MIP_Display" => 2, # { 0,..., 5} 2 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-log-display-information>
"CPXPARAM_Output_CloneLog" => -1, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-clone-log-in-parallel-optimization>
)
result = run_and_time(
case_data[:data], case_data[:model_type],
optimizer_benchmark,
_FP.simple_stoch_flex_tnep;
ref_extensions=case_data[:ref_extensions], solution_processors=case_data[:solution_processors], setting=case_data[:setting]
)
else
Memento.error(_LOGGER, "Algorithm \"$(task[:algorithm])\" not implemented.")
end
return result["termination_status"], result["time"]["total"]
end
function initialize_results(results_file::String, tasks::DataFrame)
results = similar(tasks, 0)
results[!, :task_start_time] = DateTime[]
results[!, :termination_status] = String[]
results[!, :time] = Float64[]
CSV.write(results_file, results)
return results
end
function run_performance_tests(tasks::DataFrame, params::Dict, settings::Dict; use_existing_results::Bool=true)
results_file = joinpath(settings[:session][:results_dir],"results.csv")
if use_existing_results && isfile(results_file)
results = CSV.read(results_file, DataFrame; pool=false, stringtype=String)
if setdiff(propertynames(results), [:task_start_time, :termination_status, :time]) != propertynames(tasks)
Memento.error(_LOGGER, "Results file \"$results_file\" has different fields than expected. Please remove it manually or adjust params to match.")
end
if nrow(results) == 0 # Since there is no data, CSV.read could not infer the column types. Overwrite the file.
results = initialize_results(results_file, tasks)
end
else
results = initialize_results(results_file, tasks)
end
n_tasks = nrow(tasks) * settings[:session][:repetitions]
n_curr_task = 0
tasks_by_case = groupby(tasks, [name for (name,type) in params[:case]]; sort=false)
for case in keys(tasks_by_case)
case_string = join(["$val" for val in case], "_")
info(_LOGGER, "Loading case $case_string...")
mkpath(joinpath(settings[:session][:tasks_dir], case_string))
case_log_file = joinpath(settings[:session][:tasks_dir], case_string, "load_$case_string.log")
rm(case_log_file; force=true)
switch_log_file(case_log_file)
case_data = load_case(case, settings[:case])
switch_log_file(main_log_file)
for task in eachrow(tasks_by_case[case])
existing_tasks_like_this = use_existing_results ? nrow(filter(row -> row[1:ncol(tasks)]==task, results)) : 0
for r in 1:settings[:session][:repetitions]
task_start_time = now(UTC)
n_curr_task += 1
optimization_string = join(["$(task[name])" for (name,type) in params[:optimization]], "_")
info(_LOGGER, "┌ $n_curr_task/$n_tasks: $case_string-$optimization_string ($r/$(settings[:session][:repetitions]))")
info(_LOGGER, "│ started at $(task_start_time)Z")
if r > existing_tasks_like_this
task_dir = settings[:optimization][:out_dir] = mkpath(joinpath(settings[:session][:tasks_dir], case_string, optimization_string, Dates.format(task_start_time,datetime_format)))
switch_log_file(joinpath(task_dir, "algorithm.log"))
termination_status, task_duration = optimize_case(case_data, task, settings)
if termination_status != _FP.OPTIMAL
Memento.warn(_LOGGER, "$case_string-$optimization_string: termination status is $(termination_status)")
end
switch_log_file(main_log_file)
push!(results, (task..., task_start_time, "$termination_status", task_duration))
CSV.write(results_file, DataFrame(last(results)); append=true)
info(_LOGGER, "└ completed in $(round(Int,task_duration)) s")
else
info(_LOGGER, "└ skipped (reusing existing result)")
end
end
end
end
return results
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 8621 | # Plots to analyze Benders decomposition procedure
using DataFrames
using StatsPlots
function make_benders_plots(data::Dict{String,Any}, result::Dict{String,Any}, out_dir::String; display_plots::Bool=true)
stat = result["stat"]
n_iter = length(stat)
ub = [stat[i]["value"]["ub"] for i in 1:n_iter]
lb = [stat[i]["value"]["lb"] for i in 1:n_iter]
objective = [stat[i]["value"]["sol_value"] for i in 1:n_iter]
objective_nonimproving = [stat[i]["value"]["current_best"] ? NaN : objective[i] for i in 1:n_iter]
objective_improving = [stat[i]["value"]["current_best"] ? objective[i] : NaN for i in 1:n_iter]
opt = result["objective"]
# Solution value versus iterations
plt = plot(1:n_iter, [ub, lb, objective_improving, objective_nonimproving];
label = ["UB" "LB" "improving solution" "non-improving solution"],
seriestype = [:steppost :steppost :scatter :scatter],
color = [3 2 1 HSL(0,0,0.5)],
ylims = [lb[ceil(Int,n_iter/5)], maximum(objective[ceil(Int,n_iter/3):n_iter])],
title = "Benders decomposition solutions",
ylabel = "Cost",
xlabel = "Iterations",
legend = :topright,
)
savefig(plt, joinpath(out_dir,"sol.svg"))
display_plots && display(plt)
# Binary variable values versus iterations
comp_name = Dict{String,String}(
"ne_branch" => "AC branch",
"branchdc_ne" => "DC branch",
"convdc_ne" => "converter",
"ne_storage" => "storage",
"load" => "flex load"
)
comp_var = Dict{String,Symbol}(
"ne_branch" => :branch_ne_investment,
"branchdc_ne" => :branchdc_ne_investment,
"convdc_ne" => :conv_ne_investment,
"ne_storage" => :z_strg_ne_investment,
"load" => :z_flex_investment
)
main_sol = Dict(i => Dict(year=>stat[i]["main"]["sol"][n] for (year,n) in enumerate(_FP.nw_ids(data; hour=1, scenario=1))) for i in 1:n_iter)
int_vars = DataFrame(name = String[], idx=Int[], year=Int[], legend = String[], values = Vector{Bool}[])
for year in 1:_FP.dim_length(data, :year)
for (comp, name) in comp_name
var = comp_var[comp]
if haskey(main_sol[1][year], var)
for idx in keys(main_sol[1][year][var])
push!(int_vars, (name, idx, year, "$name $idx (y$year)", [main_sol[i][year][var][idx] for i in 1:n_iter]))
end
end
end
end
sort!(int_vars, [:name, :idx, :year])
select!(int_vars, :legend, :values)
values_matrix = Array{Int}(undef, nrow(int_vars), n_iter)
for n in 1:nrow(int_vars)
values_matrix[n,:] = int_vars.values[n]
end
values_matrix_plot = values_matrix + repeat(2isfinite.(objective_improving)', nrow(int_vars))
# | value | color | invested in component? | improving iteration? |
# | ----- | ---------- | ---------------------- | -------------------- |
# | 0 | light grey | no | no |
# | 1 | dark grey | yes | no |
# | 2 | light blue | no | yes |
# | 3 | dark blue | yes | yes |
palette = cgrad([HSL(0,0,0.75), HSL(0,0,0.5), HSL(203,0.5,0.76), HSL(203,0.5,0.51)], 4, categorical = true)
plt = heatmap(1:n_iter, int_vars.legend, values_matrix_plot;
yflip = true,
yticks = nrow(int_vars) <= 50 ? :all : :auto,
title = "Investment decisions",
ylabel = "Components",
xlabel = "Iterations",
color = palette,
colorbar = :none,
#legend = :outerbottom
)
#for (idx, lab) in enumerate(["not built, non-improving iteration", "built, non-improving iteration", "not built, improving iteration", "built, improving iteration"])
# plot!([], [], seriestype=:shape, label=lab, color=palette[idx])
#end
savefig(plt, joinpath(out_dir,"intvars.svg"))
display_plots && display(plt)
# Solve time versus iterations
main_time = [stat[i]["time"]["main"] for i in 1:n_iter]
sec_time = [stat[i]["time"]["secondary"] for i in 1:n_iter]
other_time = [stat[i]["time"]["other"] for i in 1:n_iter]
plt1 = groupedbar(1:n_iter, [other_time sec_time main_time];
bar_position = :stack,
bar_width = n_iter < 50 ? 0.8 : 1.0,
color = [HSL(0,0,2//3) 2 1],
linewidth = n_iter < 50 ? 1 : 0,
title = "Solve time",
yguide = "Time [s]",
xguide = "Iterations",
legend = :none,
)
plt2 = groupedbar([result["time"]["build"] result["time"]["main"] result["time"]["secondary"] result["time"]["other"]];
bar_position = :stack,
orientation = :horizontal,
color = [HSL(0,0,1//3) 1 2 HSL(0,0,2//3)],
legend = :outerright,
label = ["build model" "main problem" "secondary problems" "other"],
grid = :none,
axis = :hide,
ticks = :none,
flip = true,
xguide = "Total time: $(round(Int,result["time"]["total"])) s — Threads: $(Threads.nthreads())",
xguidefontsize = 9,
)
plt = plot(plt1, plt2; layout = StatsPlots.grid(2,1; heights=[0.92, 0.08]))
savefig(plt, joinpath(out_dir,"time.svg"))
display_plots && display(plt)
return nothing
end
# Plot of time vs. `x` variable (keeping other variables fixed). Used in performance tests.
function scatter_time_vs_variable(results::DataFrame, results_dir::String, fixed_vars::Vector{Symbol}, group_var::Symbol, x_var::Symbol)
plots_data = groupby(results, fixed_vars)
for k in keys(plots_data)
data = select(plots_data[k], x_var, group_var, :time)
if length(unique(data[!,group_var]))>1 && length(unique(data[!,x_var]))>1
param_string = replace("$k"[12:end-1], '"' => "")
x_min, x_max = extrema(data[!,x_var])
x_logscale = x_min>0 && x_max/x_min > 10.0 # Whether to use log scale along x axis
y_min, y_max = extrema(data.time)
y_logscale = y_max/y_min > 10.0 # Whether to use log scale along y axis
plt = @df data scatter(data[!,x_var], :time; group=data[!,group_var],
title = replace(param_string, r"(.+?, .+?, .+?, .+?,) "=>s"\1\n"), # Insert a newline every 4 params
titlefontsize = 6,
xlabel = "$x_var",
xscale = x_logscale ? :log10 : :identity,
xminorgrid = x_logscale,
xticks = unique(data[!,x_var]),
xformatter = x -> "$(round(Int,x))",
ylabel = "Time [s]",
yscale = y_logscale ? :log10 : :identity,
yminorgrid = y_logscale,
ylim = [y_logscale ? -Inf : 0, Inf],
legend = :bottomright,
legendtitle = "$group_var"
)
#display(plt)
plot_name = join(["$val" for val in k], "_") * ".svg"
mkpath(joinpath(results_dir, "$group_var", "$x_var"))
savefig(plt, joinpath(results_dir, "$group_var", "$x_var", plot_name))
end
end
end
function make_benders_perf_plots(results::DataFrame, results_dir::String)
results_optimal = filter(row -> row.termination_status == "OPTIMAL", results)
if nrow(results) != nrow(results_optimal)
warn(_LOGGER, "Removed from analysis $(nrow(results)-nrow(results_optimal)) tests whose termination status is not OPTIMAL.")
end
param_variables = setdiff(propertynames(results_optimal), [:task_start_time, :termination_status, :time])
for group in param_variables
if length(unique(results_optimal[!,group])) > 1
for x in param_variables
if group≠x && eltype(results_optimal[!,x])<:Number && length(unique(results_optimal[!,x]))>1
fixed_vars = setdiff(param_variables, [group, x])
scatter_time_vs_variable(results_optimal, results_dir, fixed_vars, group, x)
end
end
end
end
info(_LOGGER, "Plots saved in \"$results_dir\".")
end
function make_benders_perf_plots(results_dir::String)
results = CSV.read(joinpath(results_dir, "results.csv"), DataFrame; pool=false, stringtype=String)
make_benders_perf_plots(results, results_dir)
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 1127 | import PowerModels as _PM
import PowerModelsACDC as _PMACDC
import FlexPlan as _FP
import Ipopt
import Memento
import JuMP
import Gurobi # needs start values for all variables!
import Juniper
file = normpath(@__DIR__,"../../test/data/case67/case67.m")
file_2030 = normpath(@__DIR__,"../../test/data/case67/case67_tnep_2030.m")
file_2040 = normpath(@__DIR__,"../../test/data/case67/case67_tnep_2040.m")
file_2050 = normpath(@__DIR__,"../../test/data/case67/case67_tnep_2050.m")
ipopt = JuMP.optimizer_with_attributes(Ipopt.Optimizer, "tol" => 1e-6, "print_level" => 0)
gurobi = JuMP.optimizer_with_attributes(Gurobi.Optimizer)
juniper = JuMP.optimizer_with_attributes(Juniper.Optimizer, "nl_solver" => ipopt) # Juniper requires an NLP solver to be set
s = Dict("conv_losses_mp" => true)
resultAC = _PMACDC.run_acdcopf(file, _PM.ACPPowerModel, ipopt; setting = s)
resultDC = _PMACDC.run_acdcopf(file, _PM.DCPPowerModel, gurobi; setting = s)
result2030 = _PMACDC.run_tnepopf(file_2030, _PM.DCPPowerModel, gurobi, setting = s)
result2040 = _PMACDC.run_tnepopf(file_2040, _PM.DCPPowerModel, gurobi, setting = s)
result2050 = _PMACDC.run_tnepopf(file_2050, _PM.DCPPowerModel, gurobi, setting = s)
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 14852 | import CSV
import DataFrames
import JSON
# Kept for compatibility with legacy code.
function create_profile_data(number_of_periods, data, loadprofile = ones(length(data["load"]),number_of_periods), genprofile = ones(length(data["gen"]),number_of_periods))
_FP.make_time_series(data, number_of_periods; loadprofile = permutedims(loadprofile), genprofile = permutedims(genprofile))
end
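# Dimension sketch (dummy values): the legacy wrapper takes component × period matrices,
# while `make_time_series` expects the transposed period × component layout, hence the
# `permutedims` calls above.
# n_loads, n_gens = length(data["load"]), length(data["gen"])
# time_series = create_profile_data(24, data, ones(n_loads, 24), ones(n_gens, 24))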
function create_profile_data_italy!(data)
hours = _FP.dim_length(data, :hour)
scenarios = _FP.dim_length(data, :scenario)
genprofile = ones(length(data["gen"]), hours*scenarios)
loadprofile = ones(length(data["load"]), hours*scenarios)
monte_carlo = get(_FP.dim_meta(data, :scenario), "mc", false)
for (s, scnr) in _FP.dim_prop(data, :scenario)
pv_sicily, pv_south_central, wind_sicily = read_res_data(s; mc = monte_carlo)
demand_center_north_pu, demand_north_pu, demand_center_south_pu, demand_south_pu, demand_sardinia_pu = read_demand_data(s; mc = monte_carlo)
start_idx = (s-1) * hours
if monte_carlo == false
for h in 1 : hours
h_idx = scnr["start"] + ((h-1) * 3600000)
genprofile[4, start_idx + h] = pv_south_central["data"]["$h_idx"]["electricity"]
genprofile[5, start_idx + h] = pv_sicily["data"]["$h_idx"]["electricity"]
genprofile[6, start_idx + h] = wind_sicily["data"]["$h_idx"]["electricity"]
end
else
genprofile[4, start_idx + 1 : start_idx + hours] = pv_south_central[1: hours]
genprofile[5, start_idx + 1 : start_idx + hours] = pv_sicily[1: hours]
genprofile[6, start_idx + 1 : start_idx + hours] = wind_sicily[1: hours]
end
loadprofile[:, start_idx + 1 : start_idx + hours] = [demand_center_north_pu'; demand_north_pu'; demand_center_south_pu'; demand_south_pu'; demand_sardinia_pu'][:, 1: hours]
# loadprofile[:, start_idx + 1 : start_idx + number_of_hours] = repeat([demand_center_north_pu'; demand_north_pu'; demand_center_south_pu'; demand_south_pu'; demand_sardinia_pu'][:, 1],1,number_of_hours)
end
# Add bus locations to data dictionary
data["bus"]["1"]["lat"] = 43.4894; data["bus"]["1"]["lon"] = 11.7946; # Italy central north
data["bus"]["2"]["lat"] = 45.3411; data["bus"]["2"]["lon"] = 9.9489; # Italy north
data["bus"]["3"]["lat"] = 41.8218; data["bus"]["3"]["lon"] = 13.8302; # Italy central south
data["bus"]["4"]["lat"] = 40.5228; data["bus"]["4"]["lon"] = 16.2155; # Italy south
data["bus"]["5"]["lat"] = 40.1717; data["bus"]["5"]["lon"] = 9.0738; # Sardinia
data["bus"]["6"]["lat"] = 37.4844; data["bus"]["6"]["lon"] = 14.1568; # Sicily
# Return info
return data, loadprofile, genprofile
end
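The non-Monte-Carlo branches above key into the JSON time series with `scnr["start"] + ((h-1) * 3600000)`. A hedged, self-contained sketch of that keying follows (assumption: the profiles are keyed by Unix-epoch milliseconds at hourly resolution, hence the 3 600 000 ms step):

```julia
# Hourly timestamp keys, as used by create_profile_data_italy!/germany! above.
const MS_PER_HOUR = 3_600_000
scenario_start = 1_483_228_800_000 # 2017-01-01T00:00:00Z in epoch milliseconds
hour_keys = [scenario_start + (h - 1) * MS_PER_HOUR for h in 1:3]
```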
function create_profile_data_germany!(data)
hours = _FP.dim_length(data, :hour)
scenarios = _FP.dim_length(data, :scenario)
genprofile = ones(length(data["gen"]), hours*scenarios)
loadprofile = ones(length(data["load"]), hours*scenarios)
monte_carlo = get(_FP.dim_meta(data, :scenario), "mc", false)
for (s, scnr) in _FP.dim_prop(data, :scenario)
wind_profile = read_res_data(s; mc = monte_carlo, country = "de")
demand_profile = read_demand_data(s; mc = monte_carlo, country = "de")
start_idx = (s-1) * hours
        if !monte_carlo
for h in 1 : hours
h_idx = scnr["start"] + ((h-1) * 3600000)
genprofile[2, start_idx + h] = wind_profile["2"]["data"]["$h_idx"]["electricity"]
genprofile[4, start_idx + h] = wind_profile["5"]["data"]["$h_idx"]["electricity"]
genprofile[20, start_idx + h] = wind_profile["67"]["data"]["$h_idx"]["electricity"]
                if length(data["gen"]) > 20
                    genprofile[21, start_idx + h] = wind_profile["23"]["data"]["$h_idx"]["electricity"]
                end
                if length(data["gen"]) > 21 # independent check: an `elseif` here would be unreachable, since > 21 implies > 20
                    genprofile[22, start_idx + h] = wind_profile["54"]["data"]["$h_idx"]["electricity"]
                end
end
end
loadprofile[:, start_idx + 1 : start_idx + hours] .= repeat(demand_profile[1:hours]', size(loadprofile, 1))
end
# Return info
return data, loadprofile, genprofile
end
"Create load and generation profiles for CIGRE distribution network."
function create_profile_data_cigre(data, number_of_hours; start_period = 1, scale_load = 1.0, scale_gen = 1.0, file_profiles_pu = normpath(@__DIR__,"..","data","cigre_mv_eu","time_series","CIGRE_profiles_per_unit.csv"))
## Fixed parameters
file_load_ind = normpath(@__DIR__,"..","data","cigre_mv_eu","CIGRE_industrial_loads.csv")
file_load_res = normpath(@__DIR__,"..","data","cigre_mv_eu","CIGRE_residential_loads.csv")
scale_unit = 0.001 # scale factor from CSV power data to FlexPlan power base: here converts from kVA to MVA
## Import data
load_ind = CSV.read(file_load_ind, DataFrames.DataFrame)
load_res = CSV.read(file_load_res, DataFrames.DataFrame)
profiles_pu = CSV.read(
file_profiles_pu,
DataFrames.DataFrame;
skipto = start_period + 1, # +1 is for header line
limit = number_of_hours,
ntasks = 1 # To ensure exact row limit is applied
)
if DataFrames.nrow(profiles_pu) < number_of_hours
Memento.error(_LOGGER, "insufficient number of rows in file \"$file_profiles_pu\" ($number_of_hours requested, $(DataFrames.nrow(profiles_pu)) found)")
end
DataFrames.select!(profiles_pu,
:industrial_load => :load_ind,
:residential_load => :load_res,
:photovoltaic => :pv,
:wind_turbine => :wind,
:fuel_cell => :fuel_cell,
:CHP_diesel => :chp_diesel,
:CHP_fuel_cell => :chp_fuel_cell
)
profiles_pu = Dict(pairs(eachcol(profiles_pu)))
## Prepare output structure
extradata = Dict{String,Any}()
## Loads
# Compute active and reactive power base of industrial loads
DataFrames.rename!(load_ind, [:bus, :s, :cosϕ])
load_ind.p_ind = scale_load * scale_unit * load_ind.s .* load_ind.cosϕ
load_ind.q_ind = scale_load * scale_unit * load_ind.s .* sin.(acos.(load_ind.cosϕ))
DataFrames.select!(load_ind, :bus, :p_ind, :q_ind)
# Compute active and reactive power base of residential loads
DataFrames.rename!(load_res, [:bus, :s, :cosϕ])
load_res.p_res = scale_load * scale_unit * load_res.s .* load_res.cosϕ
load_res.q_res = scale_load * scale_unit * load_res.s .* sin.(acos.(load_res.cosϕ))
DataFrames.select!(load_res, :bus, :p_res, :q_res)
# Create a table of industrial and residential power bases, indexed by the load ids used by `data`
load_base = coalesce.(DataFrames.outerjoin(load_ind, load_res; on=:bus), 0.0)
load_base.bus = string.(load_base.bus)
bus_load_lookup = Dict{String,String}()
for (l, load) in data["load"]
bus_load_lookup["$(load["load_bus"])"] = l
end
DataFrames.transform!(load_base, :bus => DataFrames.ByRow(b -> bus_load_lookup[b]) => :load_id)
# Compute active and reactive power profiles of each load
extradata["load"] = Dict{String,Any}()
for l in eachrow(load_base)
extradata["load"][l.load_id] = Dict{String,Any}()
extradata["load"][l.load_id]["pd"] = l.p_ind .* profiles_pu[:load_ind] .+ l.p_res .* profiles_pu[:load_res]
extradata["load"][l.load_id]["qd"] = l.q_ind .* profiles_pu[:load_ind] .+ l.q_res .* profiles_pu[:load_res]
end
## Generators
# Define a Dict for the technology of generators, indexed by the gen ids used by `data`
gen_tech = Dict{String,Symbol}()
gen_tech["2"] = :pv
gen_tech["3"] = :pv
gen_tech["4"] = :pv
gen_tech["5"] = :fuel_cell
gen_tech["6"] = :pv
gen_tech["7"] = :wind
gen_tech["8"] = :pv
gen_tech["9"] = :pv
gen_tech["10"] = :chp_diesel
gen_tech["11"] = :chp_fuel_cell
gen_tech["12"] = :pv
gen_tech["13"] = :fuel_cell
gen_tech["14"] = :pv
# Compute active and reactive power profiles of each generator
extradata["gen"] = Dict{String,Any}()
for (g, gen) in data["gen"]
if haskey(gen_tech, g)
extradata["gen"][g] = Dict{String,Any}()
extradata["gen"][g]["pmax"] = scale_gen * gen["pmax"] .* profiles_pu[gen_tech[g]]
extradata["gen"][g]["pmin"] = scale_gen * gen["pmin"] .* profiles_pu[gen_tech[g]]
extradata["gen"][g]["qmax"] = scale_gen * gen["qmax"] .* ones(number_of_hours)
extradata["gen"][g]["qmin"] = scale_gen * gen["qmin"] .* ones(number_of_hours)
end
end
return extradata
end
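The load power bases in `create_profile_data_cigre` are derived from apparent power and power factor. A minimal, self-contained sketch of that P/Q decomposition (the numbers are illustrative, not taken from the CIGRE data files):

```julia
# From apparent power s and power factor cosϕ:
# active power p = s·cosϕ, reactive power q = s·sin(acos(cosϕ)).
s = 100.0               # apparent power [kVA]
cosϕ = 0.8              # power factor
p = s * cosϕ            # active power [kW]
q = s * sin(acos(cosϕ)) # reactive power [kvar]
```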
function read_demand_data(year; mc = false, country = "it")
if country == "it"
        if !mc
if year > 2
error("Only 2 scenarios are supported")
end
y = year + 2017
# Read demand CSV files
demand_north = CSV.read(normpath(@__DIR__,"../../test/data/case6/time_series/mc_false/demand_north_$y.csv"),DataFrames.DataFrame)[:,3]
demand_center_north = CSV.read(normpath(@__DIR__,"../../test/data/case6/time_series/mc_false/demand_center_north_$y.csv"),DataFrames.DataFrame)[:,3]
demand_center_south = CSV.read(normpath(@__DIR__,"../../test/data/case6/time_series/mc_false/demand_center_south_$y.csv"),DataFrames.DataFrame)[:,3]
demand_south = CSV.read(normpath(@__DIR__,"../../test/data/case6/time_series/mc_false/demand_south_$y.csv"),DataFrames.DataFrame)[:,3]
demand_sardinia = CSV.read(normpath(@__DIR__,"../../test/data/case6/time_series/mc_false/demand_sardinia_$y.csv"),DataFrames.DataFrame)[:,3]
            # Convert demand profiles to per unit of their maximum
demand_north_pu = demand_north ./ maximum(demand_north)
demand_center_north_pu = demand_center_north ./ maximum(demand_center_north)
demand_south_pu = demand_south ./ maximum(demand_south)
demand_center_south_pu = demand_center_south ./ maximum(demand_center_south)
demand_sardinia_pu = demand_sardinia ./ maximum(demand_sardinia)
else
if year > 35
error("Only 35 scenarios are supported")
end
y = year - 1
            # Read the scenario file once, then select the per-area columns
            demand = CSV.read(normpath(@__DIR__,"../../test/data/case6/time_series/mc_true/case_6_demand_$y.csv"),DataFrames.DataFrame)
            demand_north_pu = demand[:,3]
            demand_center_north_pu = demand[:,2]
            demand_center_south_pu = demand[:,4]
            demand_south_pu = demand[:,5]
            demand_sardinia_pu = demand[:,6]
end
return demand_north_pu, demand_center_north_pu, demand_center_south_pu, demand_south_pu, demand_sardinia_pu
elseif country == "de"
if year > 3
error("Only 3 scenarios are supported")
end
y = year + 2016
demand = CSV.read(normpath(@__DIR__,"../../test/data/case67/time_series/demand$y.csv"),DataFrames.DataFrame)[:,3]
demand_pu = demand ./ maximum(demand)
return demand_pu[1:4:end]
end
end
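A sketch of the per-unit conversion applied in `read_demand_data`: each demand series is normalized by its own maximum, so the resulting profile lies in (0, 1] and equals 1 exactly at the peak period. The values below are illustrative only:

```julia
demand = [300.0, 450.0, 600.0, 525.0] # made-up series [MW]
demand_pu = demand ./ maximum(demand) # element-wise division by the peak value
```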
function read_res_data(year; mc = false, country = "it")
if country == "it"
        if !mc
if year > 2
error("Only 2 scenarios are supported")
end
y = year + 2017
            # `JSON.parsefile` reads and parses each time-series file in one step
            pv_sicily = JSON.parsefile(normpath(@__DIR__,"../../test/data/case6/time_series/mc_false/pv_sicily_$y.json"))
            pv_south_central = JSON.parsefile(normpath(@__DIR__,"../../test/data/case6/time_series/mc_false/pv_south_central_$y.json"))
            wind_sicily = JSON.parsefile(normpath(@__DIR__,"../../test/data/case6/time_series/mc_false/wind_sicily_$y.json"))
else
if year > 35
error("Only 35 scenarios are supported")
end
y = year - 1
pv_sicily = CSV.read(normpath(@__DIR__,"../../test/data/case6/time_series/mc_true/case_6_PV_$y.csv"),DataFrames.DataFrame)[:,7]
pv_south_central = CSV.read(normpath(@__DIR__,"../../test/data/case6/time_series/mc_true/case_6_PV_$y.csv"),DataFrames.DataFrame)[:,4]
wind_sicily = CSV.read(normpath(@__DIR__,"../../test/data/case6/time_series/mc_true/case_6_wind_$y.csv"),DataFrames.DataFrame)[:,7]
end
return pv_sicily, pv_south_central, wind_sicily
elseif country == "de"
if year > 3
error("Only 3 scenarios are supported")
end
y = year + 2016
        wind_profile = Dict{String, Any}()
        for bus in ("2", "5", "23", "54", "67") # generator buses with wind time series
            wind_profile[bus] = JSON.parsefile(normpath(@__DIR__,"../../test/data/case67/time_series/wind_bus$(bus)_$y.json"))
        end
return wind_profile
end
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 13483 | # Functions to load complete test cases from files hosted in the repository
include("create_profile.jl")
include("multiple_years.jl")
"""
load_case6(<keyword arguments>)
Load `case6`, a 6-bus transmission network with data contributed by FlexPlan researchers.
# Arguments
- `flex_load::Bool = true`: toggles flexibility of loads.
- `scale_gen::Real = 1.0`: scale factor of all generators.
- `scale_load::Real = 1.0`: scale factor of loads.
- `number_of_hours::Int = 8760`: number of hourly optimization periods.
- `number_of_scenarios::Int = 35`: number of scenarios (different time series for loads and
RES generators).
- `number_of_years::Int = 3`: number of years (different investment sets).
- `year_scale_factor::Int = 10`: how many years a representative year should represent.
- `cost_scale_factor::Real = 1.0`: scale factor for all costs.
- `init_data_extensions::Vector{<:Function}=Function[]`: functions to be applied to the
target dict after its initialization. They must have exactly one argument (the target
dict) and can modify it; the return value is unused.
- `sn_data_extensions::Vector{<:Function}=Function[]`: functions to be applied to the
single-network dictionaries containing data for each single year, just before
`_FP.make_multinetwork` is called. They must have exactly one argument (the single-network
dict) and can modify it; the return value is unused.
- `share_data::Bool=true`: whether constant data is shared across networks (faster) or
duplicated (uses more memory, but ensures networks are independent; useful if further
transformations will be applied).
"""
function load_case6(;
flex_load::Bool = true,
scale_gen::Real = 1.0,
scale_load::Real = 1.0,
number_of_hours::Int = 8760,
number_of_scenarios::Int = 35,
number_of_years::Int = 3,
year_scale_factor::Int = 10, # years
cost_scale_factor::Real = 1.0,
init_data_extensions::Vector{<:Function} = Function[],
sn_data_extensions::Vector{<:Function} = Function[],
share_data::Bool = true,
)
if !flex_load
function fixed_load!(data)
for load in values(data["load"])
load["flex"] = 0
end
end
push!(sn_data_extensions, fixed_load!)
end
if scale_gen ≠ 1.0
push!(sn_data_extensions, data_scale_gen(scale_gen))
end
if scale_load ≠ 1.0
push!(sn_data_extensions, data_scale_load(scale_load))
end
return create_multi_year_network_data("case6", number_of_hours, number_of_scenarios, number_of_years; year_scale_factor, cost_scale_factor, init_data_extensions, sn_data_extensions, share_data, mc=true)
end
"""
data, model_type, ref_extensions, solution_processors, setting = load_case6_defaultparams(<keyword arguments>)
Load `case6` in `data` and use default values for the other returned values.
See also: `load_case6`.
"""
function load_case6_defaultparams(; kwargs...)
load_case6(; kwargs...), load_params_defaults_transmission()...
end
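`load_case6_defaultparams` splats the tuple returned by `load_params_defaults_transmission()` so that callers receive one flat tuple. A self-contained sketch of the same splatting pattern (toy names, not the package functions):

```julia
# A function returning a tuple of defaults, splatted into a larger tuple.
defaults() = (:model_type, Function[], Function[], Dict{String,Any}())
result = (:data, defaults()...)
```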
"""
load_case67(<keyword arguments>)
Load `case67`, a 67-bus transmission network with data contributed by FlexPlan researchers.
# Arguments
- `flex_load::Bool = true`: toggles flexibility of loads.
- `scale_gen::Real = 1.0`: scale factor of all generators.
- `scale_load::Real = 1.0`: scale factor of loads.
- `number_of_hours::Int = 8760`: number of hourly optimization periods.
- `number_of_scenarios::Int = 3`: number of scenarios (different time series for loads and
RES generators).
- `number_of_years::Int = 3`: number of years (different investment sets).
- `year_scale_factor::Int = 10`: how many years a representative year should represent.
- `cost_scale_factor::Real = 1.0`: scale factor for all costs.
- `init_data_extensions::Vector{<:Function}=Function[]`: functions to be applied to the
target dict after its initialization. They must have exactly one argument (the target
dict) and can modify it; the return value is unused.
- `sn_data_extensions::Vector{<:Function}=Function[]`: functions to be applied to the
single-network dictionaries containing data for each single year, just before
`_FP.make_multinetwork` is called. They must have exactly one argument (the single-network
dict) and can modify it; the return value is unused.
- `share_data::Bool=true`: whether constant data is shared across networks (faster) or
duplicated (uses more memory, but ensures networks are independent; useful if further
transformations will be applied).
"""
function load_case67(;
flex_load::Bool = true,
scale_gen::Real = 1.0,
scale_load::Real = 1.0,
number_of_hours::Int = 8760,
number_of_scenarios::Int = 3,
number_of_years::Int = 3,
year_scale_factor::Int = 10, # years
cost_scale_factor::Real = 1.0,
init_data_extensions::Vector{<:Function} = Function[],
sn_data_extensions::Vector{<:Function} = Function[],
share_data::Bool = true,
)
if !flex_load
function fixed_load!(data)
for load in values(data["load"])
load["flex"] = 0
end
end
push!(sn_data_extensions, fixed_load!)
end
if scale_gen ≠ 1.0
push!(sn_data_extensions, data_scale_gen(scale_gen))
end
if scale_load ≠ 1.0
push!(sn_data_extensions, data_scale_load(scale_load))
end
return create_multi_year_network_data("case67", number_of_hours, number_of_scenarios, number_of_years; year_scale_factor, cost_scale_factor, init_data_extensions, sn_data_extensions, share_data)
end
"""
data, model_type, ref_extensions, solution_processors, setting = load_case67_defaultparams(<keyword arguments>)
Load `case67` in `data` and use default values for the other returned values.
See also: `load_case67`.
"""
function load_case67_defaultparams(; kwargs...)
load_case67(; kwargs...), load_params_defaults_transmission()...
end
"""
load_cigre_mv_eu(<keyword arguments>)
Load an extended version of the CIGRE MV benchmark network (EU configuration).
Source: <https://e-cigre.org/publication/ELT_273_8-benchmark-systems-for-network-integration-of-renewable-and-distributed-energy-resources>, chapter 6.2
Extensions:
- storage at bus 14 (in addition to storage at buses 5 and 10, already present);
- candidate storage at buses 5, 10, and 14;
- time series (8760 hours) for loads and RES generators.
# Arguments
- `flex_load::Bool = false`: toggles flexibility of loads.
- `ne_storage::Bool = false`: toggles candidate storage.
- `scale_gen::Real = 1.0`: scale factor of all generators, wind included.
- `scale_wind::Real = 1.0`: further scaling factor of wind generator.
- `scale_load::Real = 1.0`: scale factor of loads.
- `number_of_hours::Int = 8760`: number of hourly optimization periods.
- `start_period::Int = 1`: first period of time series to use.
- `year_scale_factor::Int = 10`: how many years a representative year should represent [years].
- `energy_cost::Real = 50.0`: cost of energy exchanged with transmission network [€/MWh].
- `cost_scale_factor::Real = 1.0`: scale factor for all costs.
- `share_data::Bool=true`: whether constant data is shared across networks (faster) or
duplicated (uses more memory, but ensures networks are independent; useful if further
transformations will be applied).
"""
function load_cigre_mv_eu(;
flex_load::Bool = false,
ne_storage::Bool = false,
scale_gen::Real = 1.0,
scale_wind::Real = 1.0,
scale_load::Real = 1.0,
number_of_hours::Int = 8760,
start_period::Int = 1,
year_scale_factor::Int = 10, # years
energy_cost::Real = 50.0, # €/MWh
cost_scale_factor::Real = 1.0,
share_data::Bool = true,
)
grid_file = normpath(@__DIR__,"..","data","cigre_mv_eu","cigre_mv_eu_more_storage.m")
sn_data = _FP.parse_file(grid_file)
_FP.add_dimension!(sn_data, :hour, number_of_hours)
_FP.add_dimension!(sn_data, :scenario, Dict(1 => Dict{String,Any}("probability"=>1)))
_FP.add_dimension!(sn_data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>year_scale_factor))
# Set cost of energy exchanged with transmission network
sn_data["gen"]["1"]["ncost"] = 2
sn_data["gen"]["1"]["cost"] = [energy_cost, 0.0]
# Scale wind generation
sn_data["gen"]["6"]["pmin"] *= scale_wind
sn_data["gen"]["6"]["pmax"] *= scale_wind
sn_data["gen"]["6"]["qmin"] *= scale_wind
sn_data["gen"]["6"]["qmax"] *= scale_wind
# Toggle flexible demand
for load in values(sn_data["load"])
load["flex"] = flex_load ? 1 : 0
end
# Toggle candidate storage
if !ne_storage
sn_data["ne_storage"] = Dict{String,Any}()
end
_FP.scale_data!(sn_data; cost_scale_factor)
d_time_series = create_profile_data_cigre(sn_data, number_of_hours; start_period, scale_load, scale_gen, file_profiles_pu=normpath(@__DIR__,"..","data","cigre_mv_eu","time_series","CIGRE_profiles_per_unit_Italy.csv"))
d_mn_data = _FP.make_multinetwork(sn_data, d_time_series; share_data)
return d_mn_data
end
"""
data, model_type, ref_extensions, solution_processors, setting = load_cigre_mv_eu_defaultparams(<keyword arguments>)
Load `cigre_mv_eu` in `data` and use default values for the other returned values.
See also: `load_cigre_mv_eu`.
"""
function load_cigre_mv_eu_defaultparams(; kwargs...)
load_cigre_mv_eu(; kwargs...), load_params_defaults_distribution()...
end
"""
load_ieee_33(<keyword arguments>)
Load an extended version of the IEEE 33-bus network.
Source: <https://ieeexplore.ieee.org/abstract/document/9258930>
Extensions:
- time series (672 hours, 4 scenarios) for loads and RES generators.
# Arguments
- `oltc::Bool=true`: whether to add an OLTC with ±10% voltage regulation to the transformer.
- `scale_gen::Real = 1.0`: scale factor of all generators.
- `scale_load::Real = 1.0`: scale factor of loads.
- `number_of_hours::Int = 672`: number of hourly optimization periods.
- `number_of_scenarios::Int = 4`: number of scenarios (different time series for loads and
RES generators).
- `number_of_years::Int = 3`: number of years (different investment sets).
- `energy_cost::Real = 50.0`: cost of energy exchanged with transmission network [€/MWh].
- `cost_scale_factor::Real = 1.0`: scale factor for all costs.
- `share_data::Bool=true`: whether constant data is shared across networks (faster) or
duplicated (uses more memory, but ensures networks are independent; useful if further
transformations will be applied).
"""
function load_ieee_33(;
oltc::Bool = true,
scale_gen::Real = 1.0,
scale_load::Real = 1.0,
number_of_hours::Int = 672,
number_of_scenarios::Int = 4,
number_of_years::Int = 3,
energy_cost::Real = 50.0, # €/MWh
cost_scale_factor::Real = 1.0,
share_data::Bool = true,
)
file = normpath(@__DIR__,"..","data","ieee_33","ieee_33_672h_4s_3y.json")
function set_energy_cost!(data)
data["gen"]["1"]["cost"][end-1] = energy_cost # Coupling generator id is 1 because its String id in the JSON file happens to be the first in alphabetical order.
end
return _FP.convert_JSON(
file;
oltc,
scale_gen,
scale_load,
number_of_hours,
number_of_scenarios,
number_of_years,
cost_scale_factor,
sn_data_extensions = [set_energy_cost!],
share_data,
)
end
"""
data, model_type, ref_extensions, solution_processors, setting = load_ieee_33_defaultparams(<keyword arguments>)
Load `ieee_33` in `data` and use default values for the other returned values.
See also: `load_ieee_33`.
"""
function load_ieee_33_defaultparams(; kwargs...)
load_ieee_33(; kwargs...), load_params_defaults_distribution()...
end
## Auxiliary functions
function load_params_defaults_transmission()
model_type = _PM.DCPPowerModel
ref_extensions = Function[_FP.ref_add_gen!, _FP.ref_add_storage!, _FP.ref_add_ne_storage!, _FP.ref_add_flex_load!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, _PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!]
solution_processors = Function[_PM.sol_data_model!]
setting = Dict("conv_losses_mp" => false)
return model_type, ref_extensions, solution_processors, setting
end
function load_params_defaults_distribution()
model_type = _FP.BFARadPowerModel
ref_extensions = Function[_FP.ref_add_gen!, _FP.ref_add_storage!, _FP.ref_add_ne_storage!, _FP.ref_add_flex_load!, _PM.ref_add_on_off_va_bounds!, _FP.ref_add_ne_branch_allbranches!, _FP.ref_add_frb_branch!, _FP.ref_add_oltc_branch!]
solution_processors = Function[_PM.sol_data_model!]
setting = Dict{String,Any}()
return model_type, ref_extensions, solution_processors, setting
end
function data_scale_gen(gen_scale_factor)
return data -> (
for gen in values(data["gen"])
gen["pmax"] *= gen_scale_factor
gen["pmin"] *= gen_scale_factor
if haskey(gen, "qmax")
gen["qmax"] *= gen_scale_factor
gen["qmin"] *= gen_scale_factor
end
end
)
end
function data_scale_load(load_scale_factor)
return data -> (
for load in values(data["load"])
load["pd"] *= load_scale_factor
if haskey(load, "qd")
load["qd"] *= load_scale_factor
end
end
)
end
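`data_scale_gen` and `data_scale_load` above return closures that capture the scale factor, so they can be queued in `sn_data_extensions` and applied to each single-network dict later. A self-contained sketch of the same pattern (a toy scaler, not the package helpers themselves):

```julia
# Returns a closure capturing `factor`; calling it mutates the passed dict.
make_load_scaler(factor) =
    data -> (for load in values(data["load"]); load["pd"] *= factor; end)
scale2! = make_load_scaler(2.0)
toy_data = Dict("load" => Dict("1" => Dict("pd" => 1.5), "2" => Dict("pd" => 0.4)))
scale2!(toy_data)
```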
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 4560 | """
create_multi_year_network_data(case, number_of_hours, number_of_scenarios, number_of_years; <keyword arguments>)
Using the input case (`case6`, `case67`) create multi-year network data.
Dimension hierarchy is:
```
year{...
scenario{...
hour{...}
}
}
```
# Keyword arguments
- `year_scale_factor::Int=10`: how many years a representative year should represent.
- `cost_scale_factor`: scale factor for all costs (default: `1.0`).
- `init_data_extensions::Vector{<:Function}=Function[]`: functions to be applied to the
target dict after its initialization. They must have exactly one argument (the target
dict) and can modify it; the return value is unused.
- `sn_data_extensions::Vector{<:Function}=Function[]`: functions to be applied to the
single-network dictionaries containing data for each single year, just before
`_FP.make_multinetwork` is called. They must have exactly one argument (the single-network
dict) and can modify it; the return value is unused.
- `share_data::Bool=true`: whether constant data is shared across networks (faster) or
duplicated (uses more memory, but ensures networks are independent; useful if further
transformations will be applied).
- `kwargs...`: other parameters used by specific test cases.
"""
function create_multi_year_network_data(
case::String,
number_of_hours::Int,
number_of_scenarios::Int,
number_of_years::Int;
year_scale_factor::Int = 10,
cost_scale_factor::Real = 1.0,
init_data_extensions::Vector{<:Function} = Function[],
sn_data_extensions::Vector{<:Function} = Function[],
share_data::Bool = true,
kwargs...
)
my_data = Dict{String, Any}("multinetwork"=>true, "name"=>case, "nw"=>Dict{String,Any}(), "per_unit"=>true)
if case == "case6"
base_file = normpath(@__DIR__,"..","..","test","data","case6","case6_")
planning_years = [2030, 2040, 2050]
_FP.add_dimension!(my_data, :hour, number_of_hours)
scenario = Dict(s => Dict{String,Any}("probability"=>1/number_of_scenarios) for s in 1:number_of_scenarios)
_FP.add_dimension!(my_data, :scenario, scenario, metadata = Dict{String,Any}("mc"=>get(kwargs, :mc, true)))
_FP.add_dimension!(my_data, :year, number_of_years; metadata = Dict{String,Any}("scale_factor"=>year_scale_factor))
elseif case == "case67"
base_file = normpath(@__DIR__,"..","..","test","data","case67","case67_tnep_")
planning_years = [2030, 2040, 2050]
_FP.add_dimension!(my_data, :hour, number_of_hours)
data_years = [2017, 2018, 2019]
start = [1483228800000, 1514764800000, 1546300800000]
if number_of_scenarios > 3
error("Only 3 scenarios are supported")
end
scenario = Dict(s => Dict{String,Any}(
"probability"=>1/number_of_scenarios,
"year" => data_years[s],
"start" => start[s]
) for s in 1:number_of_scenarios)
_FP.add_dimension!(my_data, :scenario, scenario)
_FP.add_dimension!(my_data, :year, number_of_years; metadata = Dict{String,Any}("scale_factor"=>year_scale_factor))
else
error("Case \"$(case)\" not (yet) supported.")
end
# Apply init data extensions
for f! in init_data_extensions
f!(my_data)
end
for year_idx = 1 : number_of_years
year = planning_years[year_idx]
file = base_file * "$year" * ".m"
data = _FP.parse_file(file)
data["dim"] = _FP.dim(my_data)
_FP.scale_data!(data; year_idx, cost_scale_factor)
# Apply single network data extensions
for f! in sn_data_extensions
f!(data)
end
add_one_year!(my_data, case, data, year_idx; share_data)
end
return my_data
end
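A sketch of the equiprobable scenario dictionary built in `create_multi_year_network_data` above: each of the N scenarios is keyed by its integer index and carries probability 1/N:

```julia
number_of_scenarios = 4
scenario = Dict(s => Dict{String,Any}("probability" => 1/number_of_scenarios)
                for s in 1:number_of_scenarios)
```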
function add_one_year!(my_data, case, data, year_idx; share_data)
number_of_nws = _FP.dim_length(data, :hour) * _FP.dim_length(data, :scenario)
nw_id_offset = number_of_nws * (year_idx - 1)
if case == "case6"
data, loadprofile, genprofile = create_profile_data_italy!(data)
elseif case == "case67"
data, loadprofile, genprofile = create_profile_data_germany!(data)
else
error("Case \"$(case)\" not (yet) supported.")
end
time_series = create_profile_data(number_of_nws, data, loadprofile, genprofile)
mn_data = _FP.make_multinetwork(data, time_series; number_of_nws, nw_id_offset, share_data)
_FP.import_nws!(my_data, mn_data)
return my_data
end
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
|
[
"BSD-3-Clause"
] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 52918 | # Tools for analyzing the solution of a FlexPlan optimization problem
using CSV
using DataFrames
import GR
using Graphs
using GraphRecipes
using Printf
import Random
using StatsPlots
"""
sol_graph(sol, data; <keyword arguments>)
Plot a graph of the network with bus numbers and active power of branches.
# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `plot::String`: output a plot to `plot` file; file type is based on `plot` extension.
- `kwargs...`: specify dimensions and coordinates for which to generate plots, like
`hour=[12,24]`.
"""
function sol_graph(sol::Dict{String,Any}, data::Dict{String,Any}; plot::String, out_dir::String=pwd(), kwargs...)
for n in _FP.nw_ids(data; kwargs...)
plt = _sol_graph(sol["nw"]["$n"], data["nw"]["$n"])
name, ext = splitext(plot)
for d in reverse(_FP.dim_names(data))
if _FP.dim_length(data, d) > 1
name *= "_$(string(d)[1])$(_FP.coord(data,n,d))"
end
end
savefig(plt, joinpath(out_dir, "$name$ext"))
end
end
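`sol_graph` builds per-network file names by appending, for every dimension with more than one coordinate, the first letter of the dimension name plus the coordinate value. A self-contained sketch of that suffixing (toy helper, not the package code):

```julia
# Append "_<first letter of dimension><coordinate>" for each dimension.
function suffix(base::String, dims)
    name = base
    for (dim_name, coord) in dims
        name *= "_$(string(dim_name)[1])$(coord)"
    end
    return name
end
```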
function _sol_graph(sol::Dict{String,Any}, data::Dict{String,Any})
# Collect digraph edges (bus pairs) and edge labels (branch active power)
edge_power = Dict{Tuple{Int,Int},Float64}() # Key: edge; value: power.
for (b,sol_br) in sol["branch"]
data_br = data["branch"][b]
f_bus = data_br["f_bus"]
t_bus = data_br["t_bus"]
p = sol_br["pf"]
edge_power[(f_bus,t_bus)] = p
end
for (b,sol_br) in get(sol, "ne_branch", Dict())
if sol_br["built"] > 0.5
data_br = data["ne_branch"][b]
f_bus = data_br["f_bus"]
t_bus = data_br["t_bus"]
curr_p = sol_br["pf"]
prev_p = get(edge_power, (f_bus,t_bus), 0.0)
edge_power[(f_bus,t_bus)] = prev_p + curr_p # Sum power in case a candidate branch is added in parallel to an existing one.
end
end
n_bus = length(data["bus"])
node_names = string.(1:n_bus)
if haskey(sol,"convdc") || haskey(sol,"convdc_ne") || haskey(sol,"branchdc")
n_ac_bus = n_bus
node_names = "AC" .* string.(1:n_bus)
for (c,sol_conv) in get(sol,"convdc",Dict())
data_conv = data["convdc"][c]
ac_bus = data_conv["busac_i"]
dc_bus = data_conv["busdc_i"]
p = sol_conv["pconv"] # Converters are lossy; here we take the AC side power (load convention).
edge_power[(ac_bus,dc_bus+n_ac_bus)] = p
end
for (c,sol_conv) in get(sol,"convdc_ne",Dict())
if sol_conv["isbuilt"] > 0.5
data_conv = data["convdc_ne"][c]
ac_bus = data_conv["busac_i"]
dc_bus = data_conv["busdc_i"]
curr_p = sol_conv["pconv"] # Converters are lossy; here we take the AC side power (load convention).
prev_p = get(edge_power, (ac_bus,dc_bus+n_ac_bus), 0.0)
edge_power[(ac_bus,dc_bus+n_ac_bus)] = prev_p + curr_p # Sum power in case a candidate converter is added in parallel to an existing one.
end
end
for (b,sol_br) in sol["branchdc"]
data_br = data["branchdc"][b]
f_bus = data_br["fbusdc"]
t_bus = data_br["tbusdc"]
p = sol_br["pf"]
edge_power[(f_bus+n_ac_bus,t_bus+n_ac_bus)] = p
end
for (b,sol_br) in get(sol, "branchdc_ne", Dict())
if sol_br["isbuilt"] > 0.5
data_br = data["branchdc_ne"][b]
f_bus = data_br["fbusdc"]
t_bus = data_br["tbusdc"]
curr_p = sol_br["pf"]
prev_p = get(edge_power, (f_bus+n_ac_bus,t_bus+n_ac_bus), 0.0)
edge_power[(f_bus+n_ac_bus,t_bus+n_ac_bus)] = prev_p + curr_p # Sum power in case a candidate branch is added in parallel to an existing one.
end
end
n_bus = maximum([max(i,j) for (i,j) in keys(edge_power)])
n_dc_bus = n_bus - n_ac_bus
append!(node_names, "DC" .* string.(1:n_dc_bus))
end
# Orient digraph edges according to power flow
for (s,d) in collect(keys(edge_power)) # collect is needed because we want to mutate edge_power.
if edge_power[(s,d)] < 0
edge_power[(d,s)] = -edge_power[(s,d)]
delete!(edge_power, (s,d))
end
end
# Generate digraph
g = SimpleDiGraph(n_bus)
for (s,d) in keys(edge_power)
add_edge!(g, s, d)
end
# Generate plot
edge_power_rounded = Dict(e => round(p;sigdigits=2) for (e,p) in edge_power) # Shorten edge labels preserving only most useful information.
node_weights = 1 ./ length.(node_names)
function calc_node_weight(name) # Hack to make all nodes the same size.
len = length(name)
len == 1 ? 10.0 : 1/len
end
Random.seed!(1) # To get reproducible results. Keep until GraphRecipes allows to pass seed as an argument to NetworkLayout functions.
GR.setarrowsize(10/n_bus) # Keep until Plots implements arrow size.
plt = graphplot(g;
size = (300*sqrt(n_bus),300*sqrt(n_bus)),
method = :stress,
names = node_names,
nodeshape = :circle,
nodesize = 0.15,
node_weights = calc_node_weight.(node_names),
nodecolor = HSL(0,0,1),
nodestrokecolor = HSL(0,0,0.5),
linewidth = 2,
edgelabel = edge_power_rounded,
curves = false,
curvature_scalar = 0.0,
arrow = :filled,
)
return plt
end
"""
    sol_report_cost_summary(sol, data; <keyword arguments>)

Report the objective cost by network component and cost category.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `td_coupling::Bool=true`: whether to include cost of energy exchanged with other networks.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
`plot` extension.
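# Examples
An illustrative call, assuming `result` is the result Dict returned by a FlexPlan
optimization and `data` is the multinetwork Dict used to build the problem (both
names are hypothetical):

```julia
df = sol_report_cost_summary(result["solution"], data; table="cost.csv", plot="cost.pdf")
total = only(df[df.component .== "total", :total]) # overall objective cost
```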
"""
function sol_report_cost_summary(sol::Dict{String,Any}, data::Dict{String,Any}; td_coupling::Bool=true, out_dir::String=pwd(), table::String="", plot::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
sol_nw = sol["nw"]
data_nw = data["nw"]
function sum_investment_cost(single_nw_cost::Function)
sum(single_nw_cost(data_nw[n], sol_nw[n]) for n in string.(_FP.nw_ids(dim; hour=1, scenario=1)))
end
function sum_operation_cost(single_nw_cost::Function)
sum(scenario["probability"] * sum(single_nw_cost(data_nw[n], sol_nw[n], n) for n in string.(_FP.nw_ids(dim; scenario=s))) for (s, scenario) in _FP.dim_prop(dim, :scenario))
end
df = DataFrame(component=String[], inv=Float64[], op=Float64[], shift=Float64[], red=Float64[], curt=Float64[])
inv = sum_investment_cost((d,s) -> sum(d["ne_branch"][i]["construction_cost"] for (i,branch) in get(s,"ne_branch",Dict()) if branch["investment"]>0.5; init=0.0))
push!(df, ("AC branches", inv, 0.0, 0.0, 0.0, 0.0))
inv = sum_investment_cost((d,s) -> sum(d["convdc_ne"][i]["cost"] for (i,conv) in get(s,"convdc_ne",Dict()) if conv["investment"]>0.5; init=0.0))
push!(df, ("converters", inv, 0.0, 0.0, 0.0, 0.0))
inv = sum_investment_cost((d,s) -> sum(d["branchdc_ne"][i]["cost"] for (i,branch) in get(s,"branchdc_ne",Dict()) if branch["investment"]>0.5; init=0.0))
push!(df, ("DC branches", inv, 0.0, 0.0, 0.0, 0.0))
inv = sum_investment_cost((d,s) -> sum(d["load"][i]["cost_inv"] for (i,load) in s["load"] if get(load,"investment",0.0)>0.5; init=0.0))
shift = sum_operation_cost((d,s,n) -> sum(get(d["load"][i],"cost_shift",0.0) * 0.5*(load["pshift_up"]+load["pshift_down"]) for (i,load) in s["load"]; init=0.0))
red = sum_operation_cost((d,s,n) -> sum(get(d["load"][i],"cost_red",0.0) * load["pred"] for (i,load) in s["load"]; init=0.0))
curt = sum_operation_cost((d,s,n) -> sum(d["load"][i]["cost_curt"] * load["pcurt"] for (i,load) in s["load"]; init=0.0))
push!(df, ("loads", inv, 0.0, shift, red, curt))
inv = sum_investment_cost((d,s) -> sum(d["ne_storage"][i]["eq_cost"]+d["ne_storage"][i]["inst_cost"] for (i,storage) in get(s,"ne_storage",Dict()) if storage["investment"]>0.5; init=0.0))
push!(df, ("storage", inv, 0.0, 0.0, 0.0, 0.0))
op = sum_operation_cost((d,s,n) -> sum((length(d["gen"][i]["cost"])≥2 ? d["gen"][i]["cost"][end-1] : 0.0) * gen["pg"] for (i,gen) in s["gen"]; init=0.0))
curt = sum_operation_cost((d,s,n) -> sum(get(d["gen"][i],"cost_curt",0.0) * gen["pgcurt"] for (i,gen) in s["gen"]; init=0.0))
push!(df, ("generators", 0.0, op, 0.0, 0.0, curt))
if td_coupling && _FP.has_dim(data, :sub_nw)
op = sum_operation_cost((d,s,n) ->
begin
d_gen = string(_FP.dim_prop(dim,:sub_nw,_FP.coord(dim,parse(Int,n),:sub_nw),"d_gen"))
cost_vector = d["gen"][d_gen]["cost"]
(length(cost_vector)≥2 ? cost_vector[end-1] : 0.0) * s["td_coupling"]["p"]
end
)
push!(df, ("T-D coupling", 0.0, op, 0.0, 0.0, 0.0))
end
if !isempty(plot)
total_cost = sum(sum(df[:,col]) for col in 2:ncol(df))
horizon = _FP.dim_meta(dim, :year, "scale_factor")
total_cost_string = @sprintf("total: %g over %i ", total_cost, horizon) * (horizon==1 ? "year" : "years")
plt = @df df groupedbar(:component, [:curt :red :shift :op :inv];
bar_position = :stack,
plot_title = "Cost",
plot_titlevspan = 0.07,
title = total_cost_string,
titlefontsize = 8,
legend_position = :topleft,
xguide = "Network components",
xtickfontsize = nrow(df) ≤ 6 ? 8 : 7,
framestyle = :zerolines,
xgrid = :none,
linecolor = HSLA(0,0,1,0),
label = ["curtailment" "voluntary reduction" "time shifting" "normal operation" "investment"],
seriescolor = [HSLA(0,1,0.5,0.75) HSLA(0,0.67,0.5,0.75) HSLA(0,0.33,0.5,0.75) HSLA(0,0,0.5,0.75) HSLA(210,0.75,0.5,0.75)],
)
savefig(plt, joinpath(out_dir,plot))
end
# Add row of totals
df2 = combine(df, Not(:component) .=> sum; renamecols=false)
df2[!,:component] .= "total"
df = vcat(df,df2)
# Add column of totals
transform!(df, Not(:component) => ByRow(+) => :total)
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
return df
end
"""
    sol_report_investment(sol, data; <keyword arguments>)

Report investment decisions made in `sol`.

Return a DataFrame; optionally write a CSV table.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
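# Examples
An illustrative call, assuming `result` and `data` come from a solved FlexPlan
problem (both names are hypothetical):

```julia
df = sol_report_investment(result["solution"], data; table="investment.csv")
built = df[df.investment, [:component, :id, :year]] # candidates actually activated
```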
"""
function sol_report_investment(sol::Dict{String,Any}, data::Dict{String,Any}; out_dir::String=pwd(), table::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
df = DataFrame(component=String[], id=Int[], source_id=String[], year=Int[], investment=Bool[])
for n in _FP.nw_ids(dim; hour=1, scenario=1)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
y = _FP.coord(dim, n, :year)
for (b,br_sol) in get(sol_nw,"ne_branch",Dict())
br_data = data_nw["ne_branch"][b]
push!(df, ("ne_branch", parse(Int,b), string(last(br_data["source_id"])), y, br_sol["investment"]>0.5))
end
for (c,conv_sol) in get(sol_nw,"convdc_ne",Dict())
conv_data = data_nw["convdc_ne"][c]
push!(df, ("convdc_ne", parse(Int,c), string(last(conv_data["source_id"])), y, conv_sol["investment"]>0.5))
end
for (b,br_sol) in get(sol_nw,"branchdc_ne",Dict())
br_data = data_nw["branchdc_ne"][b]
push!(df, ("branchdc_ne", parse(Int,b), string(last(br_data["source_id"])), y, br_sol["investment"]>0.5))
end
for (s,storage_sol) in get(sol_nw,"ne_storage",Dict())
storage_data = data_nw["ne_storage"][s]
push!(df, ("ne_storage", parse(Int,s), string(last(storage_data["source_id"])), y, storage_sol["investment"]>0.5))
end
for (l,load_sol) in get(sol_nw,"load",Dict())
load_data = data_nw["load"][l]
if load_data["flex"] > 0.5 # Loads that can be made flexible
push!(df, ("load", parse(Int,l), string(last(load_data["source_id"])), y, load_sol["investment"]>0.5))
end
end
end
sort!(df, [:component, :id, :year])
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
return df
end
"""
    sol_report_investment_summary(sol, data; <keyword arguments>)

Report a summary of investments made in `sol`.

Network components are categorized as _existing_, _activated candidates_, and _not
activated candidates_.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
`plot` extension.
"""
function sol_report_investment_summary(sol::Dict{String,Any}, data::Dict{String,Any}; out_dir::String=pwd(), table::String="", plot::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
df = DataFrame(component=String[], year=Int[], existing=Int[], candidate_on=Int[], candidate_off=Int[])
for n in _FP.nw_ids(dim; hour=1, scenario=1)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
y = _FP.coord(dim, n, :year)
existing = length(sol_nw["branch"])
candidate = length(get(sol_nw,"ne_branch",Dict()))
candidate_on = round(Int, sum(br["built"] for br in values(get(sol_nw,"ne_branch",Dict())); init=0.0))
candidate_off = candidate - candidate_on
push!(df, ("AC branches", y, existing, candidate_on, candidate_off))
existing = length(get(sol_nw,"convdc",Dict()))
candidate = length(get(sol_nw,"convdc_ne",Dict()))
candidate_on = round(Int, sum(st["isbuilt"] for st in values(get(sol_nw,"convdc_ne",Dict())); init=0.0))
candidate_off = candidate - candidate_on
push!(df, ("AC/DC converters", y, existing, candidate_on, candidate_off))
existing = length(get(sol_nw,"branchdc",Dict()))
candidate = length(get(sol_nw,"branchdc_ne",Dict()))
candidate_on = round(Int, sum(br["isbuilt"] for br in values(get(sol_nw,"branchdc_ne",Dict())); init=0.0))
candidate_off = candidate - candidate_on
push!(df, ("DC branches", y, existing, candidate_on, candidate_off))
existing = length(sol_nw["gen"])
push!(df, ("generators", y, existing, 0, 0))
existing = length(get(sol_nw,"storage",Dict()))
candidate = length(get(sol_nw,"ne_storage",Dict()))
candidate_on = round(Int, sum(st["isbuilt"] for st in values(get(sol_nw,"ne_storage",Dict())); init=0.0))
candidate_off = candidate - candidate_on
push!(df, ("storage devices", y, existing, candidate_on, candidate_off))
candidate = round(Int, sum(load["flex"] for load in values(data_nw["load"])))
existing = length(sol_nw["load"]) - candidate
candidate_on = round(Int, sum(load["flex"] for load in values(sol_nw["load"])))
candidate_off = candidate - candidate_on
push!(df, ("loads", y, existing, candidate_on, candidate_off))
end
sort!(df, [:component, :year])
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
rdf = reverse(df)
rdf.name = maximum(rdf.year) == 1 ? rdf.component : rdf.component .* " - y" .* string.(rdf.year)
plt = @df rdf groupedbar([:candidate_off :candidate_on :existing];
bar_position = :stack,
orientation = :h,
plot_title = "Investments",
yguide = "Network components",
xguide = "Count",
framestyle = :grid,
yticks = (1:nrow(rdf), :name),
ygrid = :none,
linecolor = HSLA(0,0,1,0),
label = ["not activated candidates" "activated candidates" "existing"],
legend_position = :bottomright,
seriescolor = [HSLA(0,0,0.75,0.75) HSLA(210,0.75,0.67,0.75) HSLA(210,1,0.33,0.75)],
)
vline!(plt, [0]; seriescolor=HSL(0,0,0), label=:none)
savefig(plt, joinpath(out_dir,plot))
end
return df
end
"""
    sol_report_power_summary(sol, data; <keyword arguments>)

Report the absorbed/injected active power by component type, using the load convention.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `td_coupling::Bool=false`: whether to include power exchanged with other networks.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
`plot` extension.
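# Examples
An illustrative call for a distribution network coupled with a transmission network
(`result` and `data` are hypothetical):

```julia
df = sol_report_power_summary(result["solution"], data; td_coupling=true, plot="power.pdf")
# With td_coupling=true the returned DataFrame also has a :td_coupling column;
# one plot file per scenario and year is written, e.g. "power_y1_s1.pdf".
```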
"""
function sol_report_power_summary(sol::Dict{String,Any}, data::Dict{String,Any}; td_coupling::Bool=false, out_dir::String=pwd(), table::String="", plot::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
sol_nw = sol["nw"]
df = DataFrame(hour=Int[], scenario=Int[], year=Int[], load=Float64[], storage_abs=Float64[], storage_inj=Float64[], gen=Float64[])
for n in _FP.nw_ids(dim)
nw = sol_nw["$n"]
h = _FP.coord(dim, n, :hour)
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
load = sum(load["pflex"] for load in values(nw["load"]))
storage_abs = sum(storage["sc"] for storage in values(get(nw,"storage",Dict())); init=0.0) + sum(storage["sc_ne"] for storage in values(get(nw,"ne_storage",Dict())); init=0.0)
storage_inj = -sum(storage["sd"] for storage in values(get(nw,"storage",Dict())); init=0.0) - sum(storage["sd_ne"] for storage in values(get(nw,"ne_storage",Dict())); init=0.0)
gen = -sum(gen["pg"] for gen in values(nw["gen"]); init=0.0)
push!(df, (h, s, y, load, storage_abs, storage_inj, gen))
end
if td_coupling
df.td_coupling = [sol_nw["$n"]["td_coupling"]["p"] for n in _FP.nw_ids(dim)]
end
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
gd = groupby(df, [:scenario, :year])
for k in keys(gd)
sdf = gd[k]
plt = @df sdf groupedbar([:load :storage_abs :storage_inj :gen],
bar_position = :stack,
plot_title = "Aggregated power",
plot_titlevspan = 0.07,
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Absorbed power [p.u.]",
xguide = "Time [periods]",
framestyle = :zerolines,
bar_width = 1,
linecolor = HSLA(0,0,1,0),
legend_position = :bottomright,
label = ["demand" "storage absorption" "storage injection" "generation"],
seriescolor = [HSLA(210,1,0.67,0.75) HSLA(210,1,0.33,0.75) HSLA(0,0.75,0.33,0.75) HSLA(0,0.75,0.67,0.75)],
)
if :td_coupling in propertynames(sdf)
@df sdf plot!(plt, :td_coupling; label="T&D coupling", seriestype=:stepmid, linecolor=:black)
end
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_y$(k.year)_s$(k.scenario)$ext"))
end
end
return df
end
"""
    sol_report_branch(sol, data; <keyword arguments>)

Report the active, reactive, and relative active power of branches.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
`plot` extension.
- `rated_power_scale_factor::Float64=1.0`: scale the rated power further.
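# Examples
An illustrative call that looks for branches loaded above 90% of their (scaled)
rated power (`result` and `data` are hypothetical):

```julia
df = sol_report_branch(result["solution"], data; rated_power_scale_factor=2.0)
overloaded = df[df.p_rel .> 0.9, [:component, :id, :hour]]
```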
"""
function sol_report_branch(sol::Dict{String,Any}, data::Dict{String,Any}; out_dir::String=pwd(), table::String="", plot::String="", rated_power_scale_factor::Float64=1.0)
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
df = DataFrame(hour=Int[], scenario=Int[], year=Int[], component=String[], id=Int[], source_id=String[], p=Float64[], q=Float64[], p_rel=Float64[])
for n in _FP.nw_ids(dim)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
h = _FP.coord(dim, n, :hour)
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
for comp in ("branch", "ne_branch")
for (b, br) in get(sol_nw, comp, Dict{String,Any}())
data_br = data_nw[comp][b]
source_id = string(data_br["source_id"][end])
rate = data_br["rate_a"]
p = br["pf"]
q = br["qf"]
p_rel = abs(p) / (rated_power_scale_factor * rate)
push!(df, (h, s, y, comp, parse(Int,b), source_id, p, q, p_rel))
end
end
end
sort!(df, [:year, :scenario, :component, :id, :hour])
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
gd = groupby(df, [:scenario, :year])
for k in keys(gd)
sdf = select(gd[k], :hour, [:component,:id] => ByRow((c,i)->"$(c)_$i") => :comp_id, :p_rel)
sdf = unstack(sdf, :comp_id, :p_rel)
sort!(sdf, :hour)
few_branches = ncol(sdf) ≤ 24
plt = @df sdf Plots.plot(:hour, cols(2:ncol(sdf)),
plot_title = "Relative active power of branches",
plot_titlevspan = 0.07,
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Relative active power",
ylims = (0,1),
xguide = "Time [periods]",
framestyle = :zerolines,
legend_position = few_branches ? :outertopright : :none,
seriestype = :stepmid,
fillrange = 0.0,
fillalpha = 0.05,
seriescolor = few_branches ? :auto : HSL(210,0.75,0.5),
linewidth = few_branches ? 1.0 : 0.5,
)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_y$(k.year)_s$(k.scenario)$ext"))
end
end
return df
end
"""
    sol_report_bus_voltage_angle(sol, data; <keyword arguments>)

Report bus voltage angle.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
`plot` extension.
"""
function sol_report_bus_voltage_angle(sol::Dict{String,Any}, data::Dict{String,Any}; out_dir::String=pwd(), table::String="", plot::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
df = DataFrame(hour=Int[], scenario=Int[], year=Int[], id=Int[], source_id=String[], va=Float64[])
for n in _FP.nw_ids(dim)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
h = _FP.coord(dim, n, :hour)
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
for (i,bus) in sol_nw["bus"]
source_id = string(data_nw["bus"][i]["source_id"][end])
push!(df, (h, s, y, parse(Int,i), source_id, bus["va"]))
end
end
sort!(df, [:year, :scenario, :id, :hour])
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
gd = groupby(df, [:scenario, :year])
for k in keys(gd)
sdf = select(gd[k], :hour, :id, :va)
few_buses = length(unique(sdf.id)) ≤ 24
plt = Plots.plot(
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Voltage angle [rad]",
xguide = "Time [periods]",
framestyle = :zerolines,
legend_position = few_buses ? :outertopright : :none,
)
gsd = groupby(sdf, :id)
for i in keys(gsd)
ssdf = select(gsd[i], :hour, :va)
sort!(ssdf, :hour)
@df ssdf plot!(plt, :va;
seriestype = :stepmid,
fillrange = 0.0,
fillalpha = 0.05,
linewidth = few_buses ? 1.0 : 0.5,
seriescolor = few_buses ? :auto : HSL(0,0,0),
label = "bus_$(i.id)",
)
end
# These attributes are set after the loop; setting them earlier renders an unwanted second axes frame under the plot_title.
plot!(plt,
plot_title = "Buses",
plot_titlevspan = 0.07,
)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_y$(k.year)_s$(k.scenario)$ext"))
end
end
return df
end
"""
    sol_report_bus_voltage_magnitude(sol, data; <keyword arguments>)

Report bus voltage magnitude.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
`plot` extension.
"""
function sol_report_bus_voltage_magnitude(sol::Dict{String,Any}, data::Dict{String,Any}; out_dir::String=pwd(), table::String="", plot::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
df = DataFrame(hour=Int[], scenario=Int[], year=Int[], id=Int[], source_id=String[], vm=Float64[], vmin=Float64[], vmax=Float64[])
for n in _FP.nw_ids(dim)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
h = _FP.coord(dim, n, :hour)
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
for (i,bus) in sol_nw["bus"]
source_id = string(data_nw["bus"][i]["source_id"][end])
vmin = data_nw["bus"][i]["vmin"]
vmax = data_nw["bus"][i]["vmax"]
push!(df, (h, s, y, parse(Int,i), source_id, bus["vm"], vmin, vmax))
end
end
sort!(df, [:year, :scenario, :id, :hour])
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
gd = groupby(df, [:scenario, :year])
for k in keys(gd)
keep_similar(x,y) = isapprox(x,y;atol=0.001) ? y : NaN
sdf = select(gd[k], :hour, :id, :vm, [:vm,:vmin] => ByRow(keep_similar) => :dn, [:vm,:vmax] => ByRow(keep_similar) => :up)
few_buses = length(unique(sdf.id)) ≤ 24
plt = Plots.plot(
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Voltage magnitude [p.u.]",
xguide = "Time [periods]",
framestyle = :grid,
legend_position = few_buses ? :outertopright : :none,
)
hline!(plt, [1]; seriescolor=HSL(0,0,0), label=:none)
gsd = groupby(sdf, :id)
for i in keys(gsd)
ssdf = select(gsd[i], :hour, :vm, :dn, :up)
sort!(ssdf, :hour)
@df ssdf plot!(plt, :vm;
seriestype = :stepmid,
fillrange = 1.0,
fillalpha = 0.05,
linewidth = few_buses ? 1.0 : 0.5,
seriescolor = few_buses ? :auto : HSL(0,0,0),
label = "bus_$(i.id)",
)
@df ssdf plot!(plt, [:dn :up];
seriestype = :stepmid,
seriescolor = [HSL(0,0.75,0.5) HSL(210,1,0.5)],
label = :none,
)
end
# These attributes are set after the loop; setting them earlier renders an unwanted second axes frame under the plot_title.
plot!(plt,
plot_title = "Buses",
plot_titlevspan = 0.07,
)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_y$(k.year)_s$(k.scenario)$ext"))
end
end
return df
end
"""
    sol_report_gen(sol, data; <keyword arguments>)

Report the active power of generators along with their minimum and maximum active power.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
`plot` extension.
"""
function sol_report_gen(sol::Dict{String,Any}, data::Dict{String,Any}; out_dir::String=pwd(), table::String="", plot::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
df = DataFrame(hour=Int[], scenario=Int[], year=Int[], id=Int[], source_id=String[], p=Float64[], pmin=Float64[], pmax=Float64[])
for n in _FP.nw_ids(dim)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
h = _FP.coord(dim, n, :hour)
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
for (g, gen) in sol_nw["gen"]
data_gen = data_nw["gen"][g]
source_id = haskey(data_gen,"source_id") ? string(data_gen["source_id"][end]) : ""
p = gen["pg"]
pmin = data_gen["pmin"]
pmax = data_gen["pmax"]
push!(df, (h, s, y, parse(Int,g), source_id, p, pmin, pmax))
end
end
sort!(df, [:year, :scenario, :id, :hour])
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
gd = groupby(df, [:scenario, :year])
for k in keys(gd)
sdf = select(gd[k], :hour, :id, :p, [:pmax,:p] => ByRow(-) => :ribbon_up, [:p,:pmin] => ByRow(-) => :ribbon_dn)
few_generators = length(unique(sdf.id)) ≤ 24
plt = Plots.plot(
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Injected active power [p.u.]",
xguide = "Time [periods]",
framestyle = :zerolines,
legend_position = few_generators ? :outertopright : :none,
)
gsd = groupby(sdf, :id)
for i in keys(gsd)
ssdf = select(gsd[i], :hour, :p, :ribbon_up, :ribbon_dn)
sort!(ssdf, :hour)
@df ssdf plot!(plt, :hour, :p;
seriestype = :stepmid,
fillalpha = 0.05,
ribbon = (:ribbon_dn, :ribbon_up),
seriescolor = few_generators ? :auto : HSL(210,0.75,0.5),
linewidth = few_generators ? 1.0 : 0.5,
label = "gen_$(i.id)",
)
end
# These attributes are set after the loop; setting them earlier renders an unwanted second axes frame under the plot_title.
plot!(plt,
plot_title = "Generators",
plot_titlevspan = 0.07,
)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_y$(k.year)_s$(k.scenario)$ext"))
end
end
return df
end
"""
    sol_report_load(sol, data; <keyword arguments>)

Report load variables.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
`plot` extension.
"""
function sol_report_load(sol::Dict{String,Any}, data::Dict{String,Any}; out_dir::String=pwd(), table::String="", plot::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
df = DataFrame(hour=Int[], scenario=Int[], year=Int[], id=Int[], source_id=String[], flex=Bool[], pd=Float64[], pflex=Float64[], pshift_up=Float64[], pshift_down=Float64[], pred=Float64[], pcurt=Float64[])
for n in _FP.nw_ids(dim)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
h = _FP.coord(dim, n, :hour)
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
for (i,load) in sol_nw["load"]
data_load = data_nw["load"][i]
source_id = string(data_load["source_id"][end])
flex = Bool(round(Int,get(load,"flex",data_load["flex"])))
pd = data_nw["load"][i]["pd"]
push!(df, (h, s, y, parse(Int,i), source_id, flex, pd, load["pflex"], load["pshift_up"], load["pshift_down"], load["pred"], load["pcurt"]))
end
end
sort!(df, [:year, :scenario, :id, :hour])
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
gd = groupby(df, [:scenario, :year])
for k in keys(gd)
sdf = select(gd[k], :hour, :id, :pflex, [:pd,:pflex] => ByRow(min) => :up, [:pd,:pflex] => ByRow(max) => :dn)
few_loads = length(unique(sdf.id)) ≤ 24
plt = Plots.plot(
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Absorbed active power [p.u.]",
xguide = "Time [periods]",
framestyle = :zerolines,
legend_position = few_loads ? :outertopright : :none,
)
gsd = groupby(sdf, :id)
for i in keys(gsd)
ssdf = select(gsd[i], :hour, :pflex, :up, :dn)
sort!(ssdf, :hour)
@df ssdf plot!(plt, :pflex;
seriestype = :stepmid,
fillcolor = HSLA(210,1,0.5,0.1),
fillrange = :up,
seriescolor = HSLA(0,0,0,0),
linewidth = 0.0,
label = :none,
)
@df ssdf plot!(plt, :pflex;
seriestype = :stepmid,
fillcolor = HSLA(0,0.75,0.5,0.1),
fillrange = :dn,
seriescolor = HSLA(0,0,0,0),
linewidth = 0.0,
label = :none,
)
@df ssdf plot!(plt, :pflex;
seriestype = :stepmid,
seriescolor = few_loads ? :auto : HSL(0,0,0),
linewidth = 0.5,
label = "load_$(i.id)",
)
end
# These attributes are set after the loop; setting them earlier renders an unwanted second axes frame under the plot_title.
plot!(plt,
plot_title = "Loads",
plot_titlevspan = 0.07,
)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_y$(k.year)_s$(k.scenario)$ext"))
end
end
return df
end
"""
    sol_report_load_summary(sol, data; <keyword arguments>)

Report aggregated load variables.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
`plot` extension.
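# Examples
An illustrative call (`result` and `data` are hypothetical):

```julia
df = sol_report_load_summary(result["solution"], data; table="load.csv", plot="load.pdf")
df.pflex .- df.pd # deviation of total absorbed power from the reference demand
```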
"""
function sol_report_load_summary(sol::Dict{String,Any}, data::Dict{String,Any}; out_dir::String=pwd(), table::String="", plot::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
df = DataFrame(hour=Int[], scenario=Int[], year=Int[], pd=Float64[], pflex=Float64[], pshift_up=Float64[], pshift_down=Float64[], pred=Float64[], pcurt=Float64[])
for n in _FP.nw_ids(dim)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
h = _FP.coord(dim, n, :hour)
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
pd = sum(load["pd"] for load in values(data_nw["load"]))
pflex = sum(load["pflex"] for load in values(sol_nw["load"]))
pshift_up = sum(load["pshift_up"] for load in values(sol_nw["load"]))
pshift_down = sum(load["pshift_down"] for load in values(sol_nw["load"]))
pred = sum(load["pred"] for load in values(sol_nw["load"]))
pcurt = sum(load["pcurt"] for load in values(sol_nw["load"]))
push!(df, (h, s, y, pd, pflex, pshift_up, pshift_down, pred, pcurt))
end
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
gd = groupby(df, [:scenario, :year])
for k in keys(gd)
sdf = gd[k]
plt = @df sdf groupedbar([:pshift_up :pshift_down :pred :pcurt :pflex-:pshift_up],
bar_position = :stack,
plot_title = "Aggregated load",
plot_titlevspan = 0.07,
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Absorbed power [p.u.]",
xguide = "Time [periods]",
framestyle = :zerolines,
bar_width = 1,
linecolor = HSLA(0,0,1,0),
legend_position = :bottomright,
label = ["shift up" "shift down" "voluntary reduction" "curtailment" :none],
seriescolor = [HSLA(210,1,0.67,0.75) HSLA(0,1,0.75,0.75) HSLA(0,0.67,0.5,0.75) HSLA(0,1,0.25,0.75) HSLA(0,0,0,0.1)],
)
@df sdf plot!(plt, :pd; label="reference demand", seriestype=:stepmid, linecolor=:black, linestyle=:dot)
@df sdf plot!(plt, :pflex; label="absorbed power", seriestype=:stepmid, linecolor=:black)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_y$(k.year)_s$(k.scenario)$ext"))
end
end
return df
end
"""
    sol_report_storage(sol, data; <keyword arguments>)

Report the energy and power of each storage device.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
`plot` extension.
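# Examples
An illustrative call (`result` and `data` are hypothetical):

```julia
df = sol_report_storage(result["solution"], data; table="storage.csv")
initial = df[df.hour .== 0, :] # initial stored energy, read from `data`; power is NaN in these rows
```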
"""
function sol_report_storage(sol::Dict{String,Any}, data::Dict{String,Any}; out_dir::String=pwd(), table::String="", plot::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
df = DataFrame(hour=Int[], scenario=Int[], year=Int[], component=String[], id=Int[], source_id=String[], energy=Float64[], energy_rating=Float64[], power=Float64[], power_min=Float64[], power_max=Float64[])
# Read from `data` the initial energy of the first period, indexing it as hour 0.
for n in _FP.nw_ids(dim; hour=1)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
for (i, st) in get(sol_nw, "storage", Dict{String,Any}())
data_st = data_nw["storage"][i]
source_id = string(data_st["source_id"][end])
push!(df, (0, s, y, "storage", parse(Int,i), source_id, data_st["energy"], data_st["energy_rating"], NaN, NaN, NaN))
end
for (i, st) in get(sol_nw, "ne_storage", Dict{String,Any}())
built = st["isbuilt"] > 0.5
data_st = data_nw["ne_storage"][i]
source_id = string(data_st["source_id"][end])
push!(df, (0, s, y, "ne_storage", parse(Int,i), source_id, data_st["energy"]*built, data_st["energy_rating"]*built, NaN, NaN, NaN))
end
end
# Read from `sol` power and final energy of each period.
for n in _FP.nw_ids(dim)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
h = _FP.coord(dim, n, :hour)
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
for (i, st) in get(sol_nw, "storage", Dict{String,Any}())
data_st = data_nw["storage"][i]
source_id = string(data_st["source_id"][end])
push!(df, (h, s, y, "storage", parse(Int,i), source_id, st["se"], data_st["energy_rating"], st["ps"], -data_st["discharge_rating"], data_st["charge_rating"]))
end
for (i, st) in get(sol_nw, "ne_storage", Dict{String,Any}())
built = st["isbuilt"] > 0.5
data_st = data_nw["ne_storage"][i]
source_id = string(data_st["source_id"][end])
push!(df, (h, s, y, "ne_storage", parse(Int,i), source_id, st["se_ne"]*built, data_st["energy_rating"]*built, st["ps_ne"], -data_st["discharge_rating"], data_st["charge_rating"]))
end
end
sort!(df, [:year, :scenario, order(:component, rev=true), :id, :hour])
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
gd = groupby(df, [:scenario, :year])
for k in keys(gd)
# Energy plot
sdf = select(gd[k], :hour, [:component,:id] => ByRow((c,i)->"$(c)_$i") => :comp_id, :energy, [:energy_rating,:energy] => ByRow(-) => :ribbon_up, :energy => :ribbon_dn)
few_storage = length(unique(sdf.comp_id)) ≤ 24
plt = Plots.plot(
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Stored energy [p.u.]",
xguide = "Time [periods]",
framestyle = :zerolines,
legend_position = few_storage ? :outertopright : :none,
)
gsd = groupby(sdf, :comp_id)
for i in keys(gsd)
ssdf = select(gsd[i], :hour, :energy, :ribbon_up, :ribbon_dn)
sort!(ssdf, :hour)
@df ssdf plot!(plt, :hour, :energy;
fillalpha = 0.05,
ribbon = (:ribbon_dn, :ribbon_up),
seriescolor = few_storage ? :auto : HSL(210,0.75,0.5),
linewidth = few_storage ? 1.0 : 0.5,
label = "$(i.comp_id)",
)
end
# Set `plot_title` after the loop: setting it earlier renders an unwanted second axes frame under the plot title.
plot!(plt,
plot_title = "Storage",
plot_titlevspan = 0.07,
)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_energy_y$(k.year)_s$(k.scenario)$ext"))
# Power plot
sdf = select(gd[k], :hour, [:component,:id] => ByRow((c,i)->"$(c)_$i") => :comp_id, :power, [:power_max,:power] => ByRow(-) => :ribbon_up, [:power,:power_min] => ByRow(-) => :ribbon_dn)
few_storage = length(unique(sdf.comp_id)) ≤ 24
plt = Plots.plot(
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Absorbed power [p.u.]",
xguide = "Time [periods]",
framestyle = :zerolines,
legend_position = few_storage ? :outertopright : :none,
)
gsd = groupby(sdf, :comp_id)
for i in keys(gsd)
ssdf = select(gsd[i], :hour, :power, :ribbon_up, :ribbon_dn)
sort!(ssdf, :hour)
@df ssdf plot!(plt, :hour, :power;
fillalpha = 0.05,
ribbon = (:ribbon_dn, :ribbon_up),
seriestype = :stepmid,
seriescolor = few_storage ? :auto : HSL(210,0.75,0.5),
linewidth = few_storage ? 1.0 : 0.5,
label = "$(i.comp_id)",
)
end
# Set `plot_title` after the loop: setting it earlier renders an unwanted second axes frame under the plot title.
plot!(plt,
plot_title = "Storage",
plot_titlevspan = 0.07,
)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_power_y$(k.year)_s$(k.scenario)$ext"))
end
end
return df
end
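# The ribbon columns built in the plotting code above follow the Plots.jl
# convention `ribbon = (below, above)`: both entries are distances from the
# plotted curve. Choosing `ribbon_dn = energy` and
# `ribbon_up = energy_rating - energy` therefore shades the whole feasible
# range [0, energy_rating]. A minimal sketch with made-up numbers:

```julia
# Hypothetical stored-energy series and its rating
energy        = [2.0, 3.5, 1.0]
energy_rating = [4.0, 4.0, 4.0]

# Same convention as the `select` calls above
ribbon_up = energy_rating .- energy  # distance from the curve up to the rating
ribbon_dn = energy                   # distance from the curve down to zero

# The shaded band spans exactly [0, energy_rating]
@assert all(energy .- ribbon_dn .== 0.0)
@assert all(energy .+ ribbon_up .== energy_rating)
```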
"""
    sol_report_storage_summary(sol, data; <keyword arguments>)

Report aggregated energy and power of connected storage.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol::Dict{String,Any}`: the solution Dict contained in the result Dict of a FlexPlan
  optimization problem.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same FlexPlan
  optimization problem.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
  `plot` extension.
"""
function sol_report_storage_summary(sol::Dict{String,Any}, data::Dict{String,Any}; out_dir::String=pwd(), table::String="", plot::String="")
_FP.require_dim(data, :hour, :scenario, :year)
dim = data["dim"]
df = DataFrame(hour=Int[], scenario=Int[], year=Int[], energy=Float64[], energy_rating=Float64[], power=Float64[], power_min=Float64[], power_max=Float64[])
# Read from `data` the initial energy of the first period, indexing it as hour 0.
for n in _FP.nw_ids(dim; hour=1)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
energy = sum(st["energy"] for st in values(get(data_nw,"storage",Dict())) if st["status"]>0; init=0.0) + sum(data_nw["ne_storage"][i]["energy"] for (i,st) in get(sol_nw,"ne_storage",Dict()) if st["isbuilt"]>0.5; init=0.0)
energy_rating = sum(st["energy_rating"] for st in values(get(data_nw,"storage",Dict())) if st["status"]>0; init=0.0) + sum(data_nw["ne_storage"][i]["energy_rating"] for (i,st) in get(sol_nw,"ne_storage",Dict()) if st["isbuilt"]>0.5; init=0.0)
push!(df, (0, s, y, energy, energy_rating, NaN, NaN, NaN))
end
# Read from `sol` power and final energy of each period.
for n in _FP.nw_ids(dim)
sol_nw = sol["nw"]["$n"]
data_nw = data["nw"]["$n"]
h = _FP.coord(dim, n, :hour)
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
energy = sum(st["se"] for (i,st) in get(sol_nw,"storage",Dict()) if data_nw["storage"][i]["status"]>0; init=0.0) + sum(st["se_ne"] for st in values(get(sol_nw,"ne_storage",Dict())) if st["isbuilt"]>0.5; init=0.0)
energy_rating = sum(st["energy_rating"] for st in values(get(data_nw,"storage",Dict())) if st["status"]>0; init=0.0) + sum(data_nw["ne_storage"][i]["energy_rating"] for (i,st) in get(sol_nw,"ne_storage",Dict()) if st["isbuilt"]>0.5; init=0.0)
power = sum(st["ps"] for (i,st) in get(sol_nw,"storage",Dict()) if data_nw["storage"][i]["status"]>0; init=0.0) + sum(st["ps_ne"] for st in values(get(sol_nw,"ne_storage",Dict())) if st["isbuilt"]>0.5; init=0.0)
power_min = -sum(st["discharge_rating"] for (i,st) in get(data_nw,"storage",Dict()) if st["status"]>0; init=0.0) - sum(data_nw["ne_storage"][i]["discharge_rating"] for (i,st) in get(sol_nw,"ne_storage",Dict()) if st["isbuilt"]>0.5; init=0.0)
power_max = sum(st["charge_rating"] for (i,st) in get(data_nw,"storage",Dict()) if st["status"]>0; init=0.0) + sum(data_nw["ne_storage"][i]["charge_rating"] for (i,st) in get(sol_nw,"ne_storage",Dict()) if st["isbuilt"]>0.5; init=0.0)
push!(df, (h, s, y, energy, energy_rating, power, power_min, power_max))
end
sort!(df, [:year, :scenario, :hour])
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
gd = groupby(df, [:scenario, :year])
for k in keys(gd)
# Energy plot
sdf = select(gd[k], :hour, :energy, [:energy_rating,:energy] => ByRow(-) => :ribbon_up, :energy => :ribbon_dn)
sort!(sdf, :hour)
plt = @df sdf Plots.plot(:hour, :energy;
plot_title = "Aggregated storage",
plot_titlevspan = 0.07,
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Stored energy [p.u.]",
xguide = "Time [periods]",
framestyle = :zerolines,
legend_position = :none,
fillalpha = 0.1,
ribbon = (:ribbon_dn, :ribbon_up),
seriescolor = HSL(210,0.75,0.5),
)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_energy_y$(k.year)_s$(k.scenario)$ext"))
# Power plot
sdf = select(gd[k], :hour, :power, [:power_max,:power] => ByRow(-) => :ribbon_up, [:power,:power_min] => ByRow(-) => :ribbon_dn)
sort!(sdf, :hour)
plt = @df sdf Plots.plot(:hour, :power;
plot_title = "Aggregated storage",
plot_titlevspan = 0.07,
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Absorbed power [p.u.]",
xguide = "Time [periods]",
framestyle = :zerolines,
legend_position = :none,
seriestype = :stepmid,
fillalpha = 0.1,
ribbon = (:ribbon_dn, :ribbon_up),
seriescolor = HSL(210,0.75,0.5),
)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_power_y$(k.year)_s$(k.scenario)$ext"))
end
end
return df
end
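# The aggregation above relies on filtered generator sums: candidate ("ne_")
# components contribute only when their `isbuilt` indicator rounds to one,
# and `init=0.0` keeps the sum well defined when the filter leaves no terms.
# A toy sketch with hypothetical entries:

```julia
sol_ne = Dict(
    "1" => Dict("isbuilt" => 1.0, "se_ne" => 2.5),  # built candidate: counted
    "2" => Dict("isbuilt" => 0.0, "se_ne" => 9.9),  # discarded candidate: skipped
)

energy = sum(st["se_ne"] for st in values(sol_ne) if st["isbuilt"] > 0.5; init=0.0)
@assert energy == 2.5

# `init` guards the empty case, e.g. a network with no candidate storage at all:
@assert sum(st["se_ne"] for st in values(Dict()) if st["isbuilt"] > 0.5; init=0.0) == 0.0
```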
| FlexPlan | https://github.com/Electa-Git/FlexPlan.jl.git |
["BSD-3-Clause"] | 0.4.0 | 683a1f0c0223ebedf1a7212f4507051f9b479e93 | code | 5976 |
# Data analysis and plotting related to decoupling of transmission and distribution
using Printf
using DataFrames
using StatsPlots
"""
    sol_report_decoupling_pcc_power(sol_up, sol_base, sol_down, data, surrogate; <keyword arguments>)

Report the imported active power at PCC.

Return a DataFrame; optionally write a CSV table and a plot.

# Arguments
- `sol_up::Dict{String,Any}`: the solution Dict of the "up" case.
- `sol_base::Dict{String,Any}`: the solution Dict of the "base" case.
- `sol_down::Dict{String,Any}`: the solution Dict of the "down" case.
- `data::Dict{String,Any}`: the multinetwork data Dict used for the same optimization.
- `surrogate::Dict{String,Any}`: the surrogate model Dict, computed with the
  `standalone=true` argument.
- `model_type::Type`: type of the model to instantiate.
- `optimizer`: the solver to use.
- `build_method::Function`: the function defining the optimization problem to solve.
- `ref_extensions::Vector{<:Function}=Function[]`: functions to apply during model
  instantiation.
- `solution_processors::Vector{<:Function}=Function[]`: functions to apply to results.
- `setting::Dict{String,Any}=Dict{String,Any}()`: settings to be passed to
  `_FP.TDDecoupling.run_td_decoupling_model`.
- `out_dir::String=pwd()`: directory for output files.
- `table::String=""`: if not empty, output a CSV table to `table` file.
- `plot::String=""`: if not empty, output a plot to `plot` file; file type is based on
  `plot` extension.
"""
function sol_report_decoupling_pcc_power(
sol_up::Dict{String,Any},
sol_base::Dict{String,Any},
sol_down::Dict{String,Any},
data::Dict{String,Any},
surrogate::Dict{String,Any};
model_type::Type,
optimizer,
build_method::Function,
ref_extensions::Vector{<:Function} = Function[],
solution_processors::Vector{<:Function} = Function[],
setting::Dict{String,Any}=Dict{String,Any}(),
out_dir::String=pwd(),
table::String="",
plot::String=""
)
_FP.require_dim(data, :hour, :scenario, :year, :sub_nw)
data = deepcopy(data)
dim = data["dim"]
_FP.TDDecoupling.add_ne_branch_indicator!(data, sol_base)
_FP.TDDecoupling.add_ne_storage_indicator!(data, sol_base)
_FP.TDDecoupling.add_flex_load_indicator!(data, sol_base)
sol_up_full = _FP.TDDecoupling.run_td_decoupling_model(data; model_type, optimizer, build_method=_FP.TDDecoupling.build_max_import_with_current_investments(build_method), ref_extensions, solution_processors, setting)
sol_down_full = _FP.TDDecoupling.run_td_decoupling_model(data; model_type, optimizer, build_method=_FP.TDDecoupling.build_max_export_with_current_investments(build_method), ref_extensions, solution_processors, setting)
sol_surrogate_up = _FP.TDDecoupling.run_td_decoupling_model(surrogate; model_type, optimizer, build_method=_FP.TDDecoupling.build_max_import(build_method), ref_extensions, solution_processors, setting)
sol_surrogate_base = _FP.TDDecoupling.run_td_decoupling_model(surrogate; model_type, optimizer, build_method, ref_extensions, solution_processors, setting)
sol_surrogate_down = _FP.TDDecoupling.run_td_decoupling_model(surrogate; model_type, optimizer, build_method=_FP.TDDecoupling.build_max_export(build_method), ref_extensions, solution_processors, setting)
df = DataFrame(hour=Int[], scenario=Int[], year=Int[], p_up=Float64[], p_up_monotonic=Float64[], p_base=Float64[], p_down_monotonic=Float64[], p_down=Float64[], surr_up=Float64[], surr_base=Float64[], surr_down=Float64[])
for n in _FP.nw_ids(dim)
h = _FP.coord(dim, n, :hour)
s = _FP.coord(dim, n, :scenario)
y = _FP.coord(dim, n, :year)
push!(df, (h, s, y, sol_up_full["nw"]["$n"]["td_coupling"]["p"], sol_up["nw"]["$n"]["td_coupling"]["p"], sol_base["nw"]["$n"]["td_coupling"]["p"], sol_down["nw"]["$n"]["td_coupling"]["p"], sol_down_full["nw"]["$n"]["td_coupling"]["p"], sol_surrogate_up["nw"]["$n"]["td_coupling"]["p"], sol_surrogate_base["nw"]["$n"]["td_coupling"]["p"], sol_surrogate_down["nw"]["$n"]["td_coupling"]["p"]))
end
sort!(df, [:year, :scenario, :hour])
if !isempty(table)
CSV.write(joinpath(out_dir,table), df)
end
if !isempty(plot)
gd = groupby(df, [:scenario, :year])
for k in keys(gd)
sdf = select(gd[k], :hour, Not([:scenario, :year]))
sort!(sdf, :hour)
select!(sdf, Not(:hour))
plt = @df sdf Plots.plot([:surr_up :surr_down],
title = "scenario $(k.scenario), year $(k.year)",
titlefontsize = 8,
yguide = "Imported power [p.u.]",
xguide = "Time [periods]",
framestyle = :zerolines,
legend_position = :right,
legend_title = "Flexibility",
legend_title_font_pointsize = 7,
legend_font_pointsize = 6,
label = ["surrogate up" "surrogate down"],
seriestype = :stepmid,
linewidth = 0.0,
fillrange = :surr_base,
fillalpha = 0.2,
seriescolor = [HSLA(210,1,0.5,0) HSLA(0,0.75,0.5,0)],
fillcolor = [HSL(210,1,0.5) HSL(0,0.75,0.5)],
)
@df sdf Plots.plot!(plt, [:p_up_monotonic :p_up :p_down_monotonic :p_down :p_base ],
plot_title = "Power exchange at PCC",
plot_titlevspan = 0.07,
label = ["dist up monotonic" "dist up full" "dist down monotonic" "dist down full" "optimal planning"],
seriestype = :stepmid,
linestyle = [:dot :solid :dot :solid :solid],
seriescolor = [HSL(210,1,0.5) HSL(210,1,0.5) HSL(0,0.75,0.5) HSL(0,0.75,0.5) HSL(0,0,0)],
)
name, ext = splitext(plot)
savefig(plt, joinpath(out_dir,"$(name)_y$(k.year)_s$(k.scenario)$ext"))
end
end
return df
end
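# All of these report functions derive per-scenario file names by splitting
# the user-supplied `plot` path once and injecting year and scenario before
# the extension. A small sketch (path and coordinates are hypothetical):

```julia
plot_path = "pcc_power.png"
name, ext = splitext(plot_path)
file = "$(name)_y2030_s1$ext"
@assert file == "pcc_power_y2030_s1.png"
```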
# Test of CPLEX Benders decomposition
## Import packages and load common code
import PowerModels as _PM
import PowerModelsACDC as _PMACDC
import FlexPlan as _FP
using Dates
using Memento
using Printf
import CPLEX
_LOGGER = Logger(basename(@__FILE__)[1:end-3]) # A logger for this script, also used by included files.
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
include(joinpath(_FP_dir,"test/io/load_case.jl"))
include(joinpath(_FP_dir,"test/benders/cplex.jl"))
include(joinpath(_FP_dir,"test/benders/compare.jl"))
include(joinpath(_FP_dir,"test/benders/perf.jl"))
## Input parameters
# Test case
# | Case | Type | Buses | Hours | Scenarios | Years |
# | ------------- | :----------: | ----: | ----: | --------: | ----: |
# | `case6` | transmission | 6 | 8760 | 35 | 3 |
# | `case67` | transmission | 67 | 8760 | 3 | 3 |
# | `ieee_33` | distribution | 33 | 672 | 4 | 3 |
test_case = "case6"
number_of_hours = 8 # Number of hourly optimization periods
number_of_scenarios = 4 # Number of scenarios (different generation/load profiles)
number_of_years = 3 # Number of years (different investments)
cost_scale_factor = 1e-6 # Cost scale factor (to test the numerical tractability of the problem)
# Procedure
obj_rtol = 1e-6 # Relative tolerance for stopping
# Analysis and output
out_dir = "output"
compare_to_benchmark = true # Solve the problem as MILP, check whether solutions are identical and compare solve times
## Process script parameters, set up logging
test_case_string = @sprintf("%s_%04i_%02i_%1i_%.0e", test_case, number_of_hours, number_of_scenarios, number_of_years, cost_scale_factor)
algorithm_string = "cplex"
out_dir = normpath(out_dir, "benders", test_case_string, algorithm_string)
mkpath(out_dir)
main_log_file = joinpath(out_dir,"script.log")
rm(main_log_file; force=true)
filter!(handler -> first(handler)=="console", gethandlers(getlogger())) # Remove from root logger possible previously added handlers
push!(getlogger(), DefaultHandler(main_log_file)) # Tell root logger to write to our log file as well
setlevel!.(Memento.getpath(getlogger(_FP)), "debug") # FlexPlan logger verbosity level. Useful values: "info", "debug", "trace"
info(_LOGGER, "Test case string: \"$test_case_string\"")
info(_LOGGER, "Algorithm string: \"$algorithm_string\"")
info(_LOGGER, " Now is: $(now(UTC)) (UTC)")
## Set CPLEX
optimizer_benders = _FP.optimizer_with_attributes(CPLEX_optimizer_with_logger(normpath(out_dir,"benders.log")), # Options: <https://www.ibm.com/docs/en/icos/latest?topic=cplex-list-parameters>
# range default link
"CPXPARAM_Read_Scale" => 0, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-scale-parameter>
"CPXPARAM_MIP_Tolerances_MIPGap" => obj_rtol, # [ 0, 1] 1e-4 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-relative-mip-gap-tolerance>
"CPXPARAM_Benders_Strategy" => 3, # {-1,..., 3} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-benders-strategy>
"CPXPARAM_Benders_WorkerAlgorithm" => 2, # { 0,..., 5} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-benders-worker-algorithm>
"CPXPARAM_Benders_Tolerances_OptimalityCut" => 1e-6, # [1e-9,1e-1] 1e-6 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-benders-optimality-cut-tolerance>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_MIP_Display" => 2, # { 0,..., 5} 2 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-log-display-information>
"CPXPARAM_Output_CloneLog" => -1, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-clone-log-in-parallel-optimization>
)
optimizer_benchmark = _FP.optimizer_with_attributes(CPLEX_optimizer_with_logger(normpath(out_dir,"benchmark.log")), # Options: <https://www.ibm.com/docs/en/icos/latest?topic=cplex-list-parameters>
# range default link
"CPXPARAM_Read_Scale" => 0, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-scale-parameter>
"CPXPARAM_MIP_Tolerances_MIPGap" => obj_rtol, # [ 0, 1] 1e-4 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-relative-mip-gap-tolerance>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_MIP_Display" => 2, # { 0,..., 5} 2 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-log-display-information>
"CPXPARAM_Output_CloneLog" => -1, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-clone-log-in-parallel-optimization>
)
## Load test case
data, model_type, ref_extensions, solution_processors, setting = eval(Symbol("load_$(test_case)_defaultparams"))(; number_of_hours, number_of_scenarios, number_of_years, cost_scale_factor)
push!(solution_processors, _FP.sol_pm!) # To access pm after the optimization has ended.
## Solve problem
info(_LOGGER, "Solving the problem with CPLEX Benders decomposition...")
result_benders = run_and_time(data, model_type, optimizer_benders, _FP.simple_stoch_flex_tnep; ref_extensions, solution_processors, setting)
info(_LOGGER, @sprintf("CPLEX benders time: %.1f s", result_benders["time"]["total"]))
# Show how many subproblems there are in CPLEX Benders decomposition
annotation_file = joinpath(out_dir, "myprob.ann")
pm = result_benders["solution"]["pm"]
m = get_cplex_optimizer(pm)
CPLEX.CPXwritebendersannotation(m.env, m.lp, annotation_file)
num_subproblems = get_num_subproblems(annotation_file)
info(_LOGGER, "CPLEX Benders decomposition has $num_subproblems subproblems.")
## Solve benchmark and compare
if compare_to_benchmark
info(_LOGGER, "Solving the problem as MILP...")
result_benchmark = run_and_time(data, model_type, optimizer_benchmark, _FP.simple_stoch_flex_tnep; ref_extensions, solution_processors, setting)
info(_LOGGER, @sprintf("MILP time: %.1f s", result_benchmark["time"]["total"]))
info(_LOGGER, @sprintf("Benders/MILP solve time ratio: %.3f", result_benders["time"]["total"]/result_benchmark["time"]["total"]))
check_solution_correctness(result_benders, result_benchmark, obj_rtol, _LOGGER)
end
println("Output files saved in \"$out_dir\"")
info(_LOGGER, "Test completed")
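# `test_case_string` built near the top of this script packs the case
# parameters into a fixed-width identifier; a quick sketch of what the
# format produces (the values are illustrative):

```julia
using Printf

s = @sprintf("%s_%04i_%02i_%1i_%.0e", "case6", 8, 4, 3, 1e-6)
@assert s == "case6_0008_04_3_1e-06"
```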
# Test of Benders decomposition
## Import packages and load common code
import PowerModels as _PM
import PowerModelsACDC as _PMACDC
import FlexPlan as _FP
using Dates
using Memento
using Printf
import CPLEX
_LOGGER = Logger(basename(@__FILE__)[1:end-3]) # A logger for this script, also used by included files.
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
include(joinpath(_FP_dir,"test/io/load_case.jl"))
include(joinpath(_FP_dir,"test/benders/cplex.jl"))
include(joinpath(_FP_dir,"test/benders/compare.jl"))
include(joinpath(_FP_dir,"test/benders/perf.jl"))
include(joinpath(_FP_dir,"test/benders/plots.jl"))
## Input parameters
# Test case
# | Case | Type | Buses | Hours | Scenarios | Years |
# | ------------- | :----------: | ----: | ----: | --------: | ----: |
# | `case6` | transmission | 6 | 8760 | 35 | 3 |
# | `case67` | transmission | 67 | 8760 | 3 | 3 |
# | `ieee_33` | distribution | 33 | 672 | 4 | 3 |
test_case = "case6"
number_of_hours = 8 # Number of hourly optimization periods
number_of_scenarios = 4 # Number of scenarios (different generation/load profiles)
number_of_years = 3 # Number of years (different investments)
cost_scale_factor = 1e-6 # Cost scale factor (to test the numerical tractability of the problem)
# Procedure
algorithm = _FP.Benders.Modern # `_FP.Benders.Classical` or `_FP.Benders.Modern`
obj_rtol = 1e-6 # Relative tolerance for stopping
max_iter = 1000 # Iteration limit
tightening_rtol = 1e-9 # Relative tolerance for adding optimality cuts
silent = true # Suppress solvers output, taking precedence over any other solver attribute
# Analysis and output
out_dir = "output"
make_plots = true # Make the following plots: solution value vs. iterations, decision variables vs. iterations, iteration times
display_plots = true
compare_to_benchmark = true # Solve the problem as MILP, check whether solutions are identical and compare solve times
## Process script parameters, set up logging
algorithm_name = lowercase(last(split(string(algorithm),'.')))
test_case_string = @sprintf("%s_%04i_%02i_%1i_%.0e", test_case, number_of_hours, number_of_scenarios, number_of_years, cost_scale_factor)
algorithm_string = @sprintf("manual_%s", algorithm_name)
out_dir = normpath(out_dir, "benders", test_case_string, algorithm_string)
mkpath(out_dir)
main_log_file = joinpath(out_dir,"script.log")
rm(main_log_file; force=true)
filter!(handler -> first(handler)=="console", gethandlers(getlogger())) # Remove from root logger possible previously added handlers
push!(getlogger(), DefaultHandler(main_log_file)) # Tell root logger to write to our log file as well
setlevel!.(Memento.getpath(getlogger(_FP)), "debug") # FlexPlan logger verbosity level. Useful values: "info", "debug", "trace"
info(_LOGGER, "Test case string: \"$test_case_string\"")
info(_LOGGER, "Algorithm string: \"$algorithm_string\"")
info(_LOGGER, " Now is: $(now(UTC)) (UTC)")
## Set CPLEX
optimizer_MILP = _FP.optimizer_with_attributes(CPLEX_optimizer_with_logger(normpath(out_dir,"milp.log")), # Options: <https://www.ibm.com/docs/en/icos/latest?topic=cplex-list-parameters>
# range default link
"CPXPARAM_Emphasis_MIP" => 1, # { 0,..., 5} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-emphasis-switch>
"CPXPARAM_MIP_Tolerances_MIPGap" => obj_rtol, # [ 0, 1] 1e-4 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-relative-mip-gap-tolerance>
"CPXPARAM_MIP_Strategy_NodeSelect" => 3, # { 0,..., 3} 1 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-selection-strategy>
"CPXPARAM_MIP_Strategy_Branch" => 1, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-branching-direction>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_MIP_Display" => 2, # { 0,..., 5} 2 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-log-display-information>
"CPXPARAM_Output_CloneLog" => -1, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-clone-log-in-parallel-optimization>
)
optimizer_LP = _FP.optimizer_with_attributes(CPLEX.Optimizer, # Log file would be interleaved in case of multiple secondary problems. To enable logging, substitute `CPLEX.Optimizer` with: `CPLEX_optimizer_with_logger("lp")`
# range default link
"CPXPARAM_Read_Scale" => 0, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-scale-parameter>
"CPXPARAM_LPMethod" => 2, # { 0,..., 6} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-algorithm-continuous-linear-problems>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_MIP_Display" => 2, # { 0,..., 5} 2 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-log-display-information>
)
optimizer_benchmark = _FP.optimizer_with_attributes(CPLEX_optimizer_with_logger(normpath(out_dir,"benchmark.log")), # Options: <https://www.ibm.com/docs/en/icos/latest?topic=cplex-list-parameters>
# range default link
"CPXPARAM_Read_Scale" => 0, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-scale-parameter>
"CPXPARAM_MIP_Tolerances_MIPGap" => obj_rtol, # [ 0, 1] 1e-4 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-relative-mip-gap-tolerance>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_MIP_Display" => 2, # { 0,..., 5} 2 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-mip-node-log-display-information>
"CPXPARAM_Output_CloneLog" => -1, # {-1,..., 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-clone-log-in-parallel-optimization>
)
## Load test case
data, model_type, ref_extensions, solution_processors, setting = eval(Symbol("load_$(test_case)_defaultparams"))(; number_of_hours, number_of_scenarios, number_of_years, cost_scale_factor)
## Solve problem
info(_LOGGER, "Solving the problem with Benders decomposition...")
algo = algorithm == _FP.Benders.Classical ? algorithm(; obj_rtol, max_iter, tightening_rtol, silent) : algorithm(; max_iter, tightening_rtol, silent)
result_benders = _FP.run_benders_decomposition(
algo,
data, model_type,
optimizer_MILP, optimizer_LP,
_FP.build_simple_stoch_flex_tnep_benders_main,
_FP.build_simple_stoch_flex_tnep_benders_secondary;
ref_extensions, solution_processors, setting
)
if result_benders["termination_status"] != _FP.OPTIMAL
Memento.warn(_LOGGER, "Termination status: $(result_benders["termination_status"]).")
end
## Make plots
if make_plots
info(_LOGGER, "Making plots...")
make_benders_plots(data, result_benders, out_dir; display_plots)
end
## Solve benchmark and compare
if compare_to_benchmark
info(_LOGGER, "Solving the problem as MILP...")
result_benchmark = run_and_time(data, model_type, optimizer_benchmark, _FP.simple_stoch_flex_tnep; ref_extensions, solution_processors, setting)
info(_LOGGER, @sprintf("MILP time: %.1f s", result_benchmark["time"]["total"]))
info(_LOGGER, @sprintf("Benders/MILP solve time ratio: %.3f", result_benders["time"]["total"]/result_benchmark["time"]["total"]))
check_solution_correctness(result_benders, result_benchmark, obj_rtol, _LOGGER)
end
println("Output files saved in \"$out_dir\"")
info(_LOGGER, "Test completed")
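# `algorithm_name` above keeps only the last dotted component of the fully
# qualified type name; a sketch with a stand-in string:

```julia
# Stand-in for `string(algorithm)`, which yields a dotted module path
algorithm_path = "FlexPlan.Benders.Modern"
algorithm_name = lowercase(last(split(algorithm_path, '.')))
@assert algorithm_name == "modern"
```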
# Test of Benders decomposition performance
import PowerModels as _PM
import PowerModelsACDC as _PMACDC
import FlexPlan as _FP
using Memento
using Printf
_LOGGER = Logger(basename(@__FILE__)[1:end-3]) # A logger for this script, also used by included files.
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
include(joinpath(_FP_dir,"test/io/load_case.jl"))
include(joinpath(_FP_dir,"test/benders/cplex.jl"))
include(joinpath(_FP_dir,"test/benders/perf.jl"))
include(joinpath(_FP_dir,"test/benders/plots.jl"))
## Settings
settings = Dict(
:session => Dict{Symbol,Any}(
:out_dir => "output/benders/perf",
:repetitions => 3, # How many times to run each optimization
),
:case => Dict{Symbol, Any}(
:cost_scale_factor => 1e-6, # Cost scale factor (to test the numerical tractability of the problem)
),
:optimization => Dict{Symbol,Any}(
:obj_rtol => 1e-4, # Relative tolerance for stopping
:max_iter => 10000, # Iteration limit
:tightening_rtol => 1e-9, # Relative tolerance for adding optimality cuts
:silent => true, # Suppress solvers output, taking precedence over any other solver attribute
)
)
## Set up paths and logging
settings[:session][:tasks_dir] = mkpath(joinpath(settings[:session][:out_dir],"tasks"))
settings[:session][:results_dir] = mkpath(joinpath(settings[:session][:out_dir],"results"))
datetime_format = "yyyymmddTHHMMSS\\Z" # As per ISO 8601. Basic format (i.e. without separators) is used for consistency across operating systems.
setlevel!.(Memento.getpath(getlogger(_FP)), "debug") # Log messages from FlexPlan having level >= "debug"
root_logger = getlogger()
push!(gethandlers(root_logger)["console"], Memento.Filter(rec -> rec.name==_LOGGER.name || root_logger.levels[getlevel(rec)]>=root_logger.levels["warn"])) # Filter console output: display all records from this script and records from other loggers having level >= "warn"
script_start_time = now(UTC)
main_log_file = joinpath(settings[:session][:out_dir],basename(@__FILE__)[1:end-3]*"-$(Dates.format(script_start_time,datetime_format)).log")
rm(main_log_file; force=true)
function switch_log_file(new_log_file::String)
filter!(handler -> first(handler)=="console", gethandlers(root_logger)) # Remove from root logger possible previously added handlers
push!(getlogger(), DefaultHandler(new_log_file)) # Tell root logger to write to our log file as well
end
switch_log_file(main_log_file)
notice(_LOGGER, "Performance tests for Benders decomposition started.")
info(_LOGGER, "Script start time: $script_start_time (UTC)")
info(_LOGGER, "Available threads: $(Threads.nthreads())")
## Set up tests
notice(_LOGGER, "Setting up tests...")
params = Dict(
:case => [:test_case=>String, :number_of_hours=>Int, :number_of_scenarios=>Int, :number_of_years=>Int],
:optimization => [
:algorithm => String, # Possible values: `benchmark`, `cplex_auto`, `manual_classical`, `manual_modern`
:preprocessing_repeatpresolve => Int, # Only used by `manual_classical` and `manual_modern` algorithms
:mip_strategy_search => Int, # Only used by `manual_classical` and `manual_modern` algorithms
:emphasis_mip => Int, # Only used by `manual_classical` and `manual_modern` algorithms
:mip_strategy_nodeselect => Int, # Only used by `manual_classical` and `manual_modern` algorithms
:mip_strategy_variableselect => Int, # Only used by `manual_classical` and `manual_modern` algorithms
:mip_strategy_bbinterval => Int, # Only used by `manual_classical` and `manual_modern` algorithms
:mip_strategy_branch => Int, # Only used by `manual_classical` and `manual_modern` algorithms
:mip_strategy_probe => Int, # Only used by `manual_classical` and `manual_modern` algorithms
]
)
tasks = initialize_tasks(params)
# Toy job, just to run the script and see some results
add_tasks!(tasks; test_case="case6", number_of_hours=[2,4], number_of_scenarios=4, number_of_years=3,
algorithm = ["manual_classical","manual_modern"],
preprocessing_repeatpresolve = -1,
mip_strategy_search = 2,
emphasis_mip = 1,
mip_strategy_nodeselect = 3,
mip_strategy_variableselect = 0,
mip_strategy_bbinterval = 7,
mip_strategy_branch = 1,
mip_strategy_probe = 0,
)
# Example: test how performance changes by varying one or more optimization parameters
#add_tasks!(tasks; test_case="case67", number_of_hours=[2,4], number_of_scenarios=3, number_of_years=3,
# algorithm = "manual_modern",
# preprocessing_repeatpresolve = -1,
# mip_strategy_search = 2,
# emphasis_mip = 1,
# mip_strategy_nodeselect = 3,
# mip_strategy_variableselect = 0,
# mip_strategy_bbinterval = 7,
# mip_strategy_branch = [0,1],
# mip_strategy_probe = 0,
#)
## Warm up
notice(_LOGGER, "Warming up...")
warmup_tasks = similar(tasks, 0)
add_tasks!(warmup_tasks; test_case="case6", number_of_hours=1, number_of_scenarios=1, number_of_years=1,
    algorithm = unique(tasks.algorithm),
    preprocessing_repeatpresolve = -1,
    mip_strategy_search = 2,
    emphasis_mip = 1,
    mip_strategy_nodeselect = 3,
    mip_strategy_variableselect = 0,
    mip_strategy_bbinterval = 7,
    mip_strategy_branch = 1,
    mip_strategy_probe = 0,
)
warmup_dir = joinpath(settings[:session][:out_dir], "warmup")
rm(warmup_dir; force=true, recursive=true)
mkpath(warmup_dir)
warmup_settings = Dict(
:case => settings[:case],
:optimization => settings[:optimization],
:session => Dict{Symbol, Any}(
:out_dir => warmup_dir,
:tasks_dir => mkpath(joinpath(warmup_dir,"tasks")),
:results_dir => mkpath(joinpath(warmup_dir,"results")),
:repetitions => 1
)
)
run_performance_tests(warmup_tasks, params, warmup_settings; use_existing_results=false)
## Run tests
notice(_LOGGER, "Running tests...")
results = run_performance_tests(tasks, params, settings; use_existing_results=true)
## Analyze results
notice(_LOGGER, "Analyzing results...")
make_benders_perf_plots(results, settings[:session][:results_dir])
notice(_LOGGER, "Performance tests for Benders decomposition ended.")
## Analyze results of a previous run
#make_benders_perf_plots("output/benders/my_old_perf_run/results")
# Test the JSON converter functionality
## Import packages
import PowerModels as _PM
import PowerModelsACDC as _PMACDC
import FlexPlan as _FP
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
import HiGHS
optimizer = _FP.optimizer_with_attributes(HiGHS.Optimizer, "output_flag"=>false)
## Script parameters
file = joinpath(_FP_dir,"test/data/json_converter/case6_input_file_2018-2019.json")
## Parse the JSON file to easily check the input
#import JSON
#d = JSON.parsefile(file)
## Convert JSON file
mn_data = _FP.convert_JSON(file; year_scale_factor=10) # Conversion caveats and function parameters: see function documentation
## Instantiate model and solve network expansion problem
# Transmission network
setting = Dict("conv_losses_mp" => true)
result = _FP.simple_stoch_flex_tnep(mn_data, _PM.DCPPowerModel, optimizer; setting)
# Two-step alternative: exposes `pm`
#pm = _PM.instantiate_model(mn_data, _PM.DCPPowerModel, _FP.build_simple_stoch_flex_tnep; setting, ref_extensions=[_FP.ref_add_gen!, _PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!, _FP.ref_add_storage!, _FP.ref_add_ne_storage!, _FP.ref_add_flex_load!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!])
#result = _PM.optimize_model!(pm; optimizer=optimizer)
# Distribution network
#result = _FP.simple_stoch_flex_tnep(mn_data, _FP.BFARadPowerModel, optimizer)
# Two-step alternative: exposes `pm`
#pm = _PM.instantiate_model(mn_data, _FP.BFARadPowerModel, _FP.build_simple_stoch_flex_tnep; ref_extensions=[_FP.ref_add_gen!, _FP.ref_add_storage!, _FP.ref_add_ne_storage!, _FP.ref_add_flex_load!, _PM.ref_add_on_off_va_bounds!, _FP.ref_add_ne_branch_allbranches!, _FP.ref_add_frb_branch!, _FP.ref_add_oltc_branch!])
#result = _PM.optimize_model!(pm; optimizer=optimizer, solution_processors=[_PM.sol_data_model!])
println("Test completed")
# Example script to test the branch replacement feature in multi-period optimization of demand flexibility, AC & DC lines and storage investments
## Import relevant packages
import PowerModels as _PM # For AC grid and common functions
import PowerModelsACDC as _PMACDC # For DC grid
import FlexPlan as _FP
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
include(joinpath(_FP_dir,"test/io/create_profile.jl")) # Include sample data from FlexPlan repository; you can of course also use your own data
# Add solver packages
# > Note: solver packages are needed to handle communication between the solver and JuMP;
# > the commercial ones do not include the solver itself.
import HiGHS
optimizer = _FP.optimizer_with_attributes(HiGHS.Optimizer, "output_flag"=>false)
#import CPLEX
#optimizer = _FP.optimizer_with_attributes(CPLEX.Optimizer, "CPXPARAM_ScreenOutput"=>0)
## Input parameters
number_of_hours = 24 # Number of time points
planning_horizon = 10 # Years to scale generation costs
file = joinpath(_FP_dir,"test/data/case6/case6_replacement.m") # Input case, in Matpower m-file format: here 6-bus case with candidate AC, DC lines and candidate storage
scenario_properties = Dict(
1 => Dict{String,Any}("probability"=>0.5, "start"=>1514764800000), # 1514764800000 is 2018-01-01T00:00, needed by `create_profile_data_italy!` when `mc=false`
2 => Dict{String,Any}("probability"=>0.5, "start"=>1546300800000), # 1546300800000 is 2019-01-01T00:00, needed by `create_profile_data_italy!` when `mc=false`
)
scenario_metadata = Dict{String,Any}("mc"=>false) # Needed by `create_profile_data_italy!`
## Load test case
data = _FP.parse_file(file) # Parse input file to obtain data dictionary
_FP.add_dimension!(data, :hour, number_of_hours) # Add dimension, e.g. hours
_FP.add_dimension!(data, :scenario, scenario_properties; metadata=scenario_metadata) # Add dimension, e.g. scenarios
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>planning_horizon)) # Add dimension, e.g. years
_FP.scale_data!(data) # Scale investment & operational cost data based on planning years & hours
data, loadprofile, genprofile = create_profile_data_italy!(data) # Load time series data based on demand and RES profiles of the six market zones in Italy from the data folder
time_series = create_profile_data(number_of_hours*_FP.dim_length(data,:scenario), data, loadprofile, genprofile) # Create time series data to be passed to the data dictionary
mn_data = _FP.make_multinetwork(data, time_series) # Create the multinetwork data dictionary
## Solve the planning problem
# PowerModels(ACDC) and FlexPlan settings
s = Dict("conv_losses_mp" => false, "allow_line_replacement" => true)
# Build optimisation model, solve it and write solution dictionary:
# This is the "problem file" which needs to be constructed individually depending on application
# In this case: multi-period optimisation of demand flexibility, AC & DC lines and storage investments
println("Solving planning problem...")
result = _FP.simple_stoch_flex_tnep(mn_data, _PM.DCPPowerModel, optimizer; setting = s)
println("Test completed")
# Test of transmission and distribution decoupling
# T&D decoupling procedure
# 1. Compute a surrogate model of distribution networks
# 2. Optimize planning of transmission network using surrogate distribution networks
# 3. Fix power exchanges between T&D and optimize planning of distribution networks
## Import packages
using Dates
using Memento
_LOGGER = Logger(first(splitext(basename(@__FILE__)))) # A logger for this script, also used by included files.
import PowerModels as _PM
import PowerModelsACDC as _PMACDC
import FlexPlan as _FP
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
include(joinpath(_FP_dir,"test/io/load_case.jl"))
include(joinpath(_FP_dir,"test/io/sol.jl"))
include(joinpath(_FP_dir,"test/io/td_decoupling.jl"))
## Set script parameters
number_of_hours = 24
number_of_scenarios = 1
number_of_years = 1
number_of_distribution_networks = 4
t_model_type = _PM.DCPPowerModel
d_model_type = _FP.BFARadPowerModel
build_method = _FP.build_simple_stoch_flex_tnep
t_ref_extensions = [_FP.ref_add_gen!, _FP.ref_add_storage!, _FP.ref_add_ne_storage!, _FP.ref_add_flex_load!, _PM.ref_add_on_off_va_bounds!, _PM.ref_add_ne_branch!, _PMACDC.add_ref_dcgrid!, _PMACDC.add_candidate_dcgrid!]
d_ref_extensions = [_FP.ref_add_gen!, _FP.ref_add_storage!, _FP.ref_add_ne_storage!, _FP.ref_add_flex_load!, _PM.ref_add_on_off_va_bounds!, _FP.ref_add_ne_branch_allbranches!, _FP.ref_add_frb_branch!, _FP.ref_add_oltc_branch!]
t_solution_processors = [_PM.sol_data_model!]
d_solution_processors = [_PM.sol_data_model!, _FP.sol_td_coupling!]
t_setting = Dict("conv_losses_mp" => false)
d_setting = Dict{String,Any}()
cost_scale_factor = 1e-6
solver = "highs"
report_intermediate_results = false
report_result = false
compare_with_combined_td_model = true
out_dir = joinpath("output", "td_decoupling")
## Set up optimizers
# For each solver, 2 optimizers should be defined:
# - one multi-threaded optimizer, used for transmission network planning (MILP problem);
# - one single-threaded optimizer, used for planning of distribution networks (one MILP and
# several LP problems) in a multi-threaded loop (one thread per distribution network).
# `direct_model` parameter can be used to construct JuMP models using `JuMP.direct_model()`
# instead of `JuMP.Model()`. Note that `JuMP.direct_model` is only supported by some
# solvers.
if solver == "highs"
import HiGHS
direct_model = false
optimizer_mt = _FP.optimizer_with_attributes(HiGHS.Optimizer, # Parameters: <https://github.com/jump-dev/HiGHS.jl>
"threads" => 0,
"output_flag" => false,
)
optimizer_st = _FP.optimizer_with_attributes(HiGHS.Optimizer, # Parameters: <https://github.com/jump-dev/HiGHS.jl>
"threads" => 1,
"output_flag" => false,
)
elseif solver == "cplex"
import CPLEX
direct_model = true
optimizer_mt = _FP.optimizer_with_attributes(CPLEX.Optimizer,
# range default link
"CPXPARAM_LPMethod" => 0, # { 0,..., 6} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-algorithm-continuous-linear-problems>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_Threads" => 0, # { 0,1,... } 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-global-thread-count>
)
optimizer_st = _FP.optimizer_with_attributes(CPLEX.Optimizer,
# range default link
"CPXPARAM_LPMethod" => 1, # { 0,..., 6} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-algorithm-continuous-linear-problems>
"CPXPARAM_ScreenOutput" => 0, # { 0, 1} 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-messages-screen-switch>
"CPXPARAM_Threads" => 1, # { 0,1,... } 0 <https://www.ibm.com/docs/en/icos/latest?topic=parameters-global-thread-count>
)
else
error(_LOGGER, "No optimizers defined for solver \"$solver\" (but you can define them yourself!).")
end
## Set up logging
setlevel!.(Memento.getpath(getlogger(_PM)), "notice") # PowerModels logger verbosity level. Useful values: "error", "warn", "notice", "info"
setlevel!.(Memento.getpath(getlogger(_FP)), "debug") # FlexPlan logger verbosity level. Useful values: "info", "debug", "trace"
info(_LOGGER, "Now is: $(now(UTC)) (UTC)")
time_start = time()
## Load data
# JSON files containing either transmission or distribution networks can be loaded with
# `data = _FP.convert_JSON(file_path)`; those containing both transmission and distribution
# networks can be loaded with `t_data, d_data = _FP.convert_JSON_td(file_path)`.
# Transmission network data
t_data = load_case6(; number_of_hours, number_of_scenarios, number_of_years, cost_scale_factor, share_data=false)
# Distribution network data
d_data_sub = load_ieee_33(; number_of_hours, number_of_scenarios, number_of_years, cost_scale_factor)
# Alternative distribution network. It has only 1 scenario and 1 year.
#d_data_sub = load_cigre_mv_eu(; flex_load=false, ne_storage=true, scale_gen=1.0, scale_wind=6.0, scale_load=1.0, year_scale_factor=10, number_of_hours, start_period=1, cost_scale_factor)
d_data = Vector{Dict{String,Any}}(undef, number_of_distribution_networks)
transmission_ac_buses = length(first(values(t_data["nw"]))["bus"])
for s in 1:number_of_distribution_networks
d_data[s] = deepcopy(d_data_sub)
d_data[s]["t_bus"] = mod1(s, transmission_ac_buses) # Attach distribution network to a transmission network bus
end
## Compute optimal planning using T&D decoupling procedure
info(_LOGGER, "Solving planning problem using T&D decoupling...")
result_decoupling = _FP.run_td_decoupling(
t_data, d_data, t_model_type, d_model_type, optimizer_mt, optimizer_st, build_method;
t_ref_extensions, d_ref_extensions, t_solution_processors, d_solution_processors, t_setting, d_setting, direct_model
)
info(_LOGGER, "T&D decoupling procedure took $(round(result_decoupling["solve_time"]; sigdigits=3)) seconds")
## Report results
if report_intermediate_results
info(_LOGGER, "Reporting intermediate results of T&D decoupling procedure for first distribution network...")
# Intermediate solutions used for building the surrogate model
d_data_intermediate = deepcopy(first(d_data))
_FP.add_dimension!(d_data_intermediate, :sub_nw, Dict(1 => Dict{String,Any}("d_gen"=>_FP._get_reference_gen(d_data_intermediate))))
sol_up, sol_base, sol_down = _FP.TDDecoupling.probe_distribution_flexibility!(d_data_intermediate; model_type=d_model_type, optimizer=optimizer_mt, build_method, ref_extensions=d_ref_extensions, solution_processors=d_solution_processors, setting=d_setting, direct_model)
intermediate_results_dir = mkpath(joinpath(out_dir, "intermediate_results"))
for (sol,name) in [(sol_up,"up"), (sol_base,"base"), (sol_down,"down")]
subdir = mkpath(joinpath(intermediate_results_dir, name))
sol_report_cost_summary(sol, d_data_intermediate; out_dir=subdir, table="t_cost.csv", plot="cost.pdf")
sol_report_power_summary(sol, d_data_intermediate; td_coupling=true, out_dir=subdir, table="t_power.csv", plot="power.pdf")
sol_report_branch(sol, d_data_intermediate; rated_power_scale_factor=cos(π/8), out_dir=subdir, table="t_branch.csv", plot="branch.pdf") # `cos(π/8)` is due to octagonal approximation of apparent power in `_FP.BFARadPowerModel`
sol_report_bus_voltage_magnitude(sol, d_data_intermediate; out_dir=subdir, table="t_bus.csv", plot="bus.pdf")
sol_report_gen(sol, d_data_intermediate; out_dir=subdir, table="t_gen.csv", plot="gen.pdf")
sol_report_load(sol, d_data_intermediate; out_dir=subdir, table="t_load.csv", plot="load.pdf")
sol_report_load_summary(sol, d_data_intermediate; out_dir=subdir, table="t_load_summary.csv", plot="load_summary.pdf")
if name == "base"
sol_report_investment(sol, d_data_intermediate; out_dir=subdir, table="t_investment.csv")
sol_report_investment_summary(sol, d_data_intermediate; out_dir=subdir, table="t_investment_summary.csv", plot="investment_summary.pdf")
sol_report_storage(sol, d_data_intermediate; out_dir=subdir, table="t_storage.csv", plot="storage.pdf")
sol_report_storage_summary(sol, d_data_intermediate; out_dir=subdir, table="t_storage_summary.csv", plot="storage_summary.pdf")
end
sol_graph(sol, d_data_intermediate; plot="map.pdf", out_dir=subdir, hour=1) # Just as an example; dimension coordinates can also be vectors, or be omitted, in which case one plot for each coordinate will be generated.
end
# Surrogate model
surrogate_dist = _FP.TDDecoupling.calc_surrogate_model(d_data_intermediate, sol_up, sol_base, sol_down; standalone=true)
surrogate_subdir = mkpath(joinpath(intermediate_results_dir, "surrogate"))
sol_report_decoupling_pcc_power(sol_up, sol_base, sol_down, d_data_intermediate, surrogate_dist; model_type=d_model_type, optimizer=optimizer_mt, build_method, ref_extensions=d_ref_extensions, solution_processors=d_solution_processors, out_dir=intermediate_results_dir, table="t_pcc_power.csv", plot="pcc_power.pdf")
# Planning obtained by using the surrogate model as if it were an ordinary distribution network
sol_surr = _FP.TDDecoupling.run_td_decoupling_model(surrogate_dist; model_type=d_model_type, optimizer=optimizer_mt, build_method, ref_extensions=d_ref_extensions, solution_processors=d_solution_processors, setting=d_setting)
sol_report_cost_summary(sol_surr, surrogate_dist; out_dir=surrogate_subdir, table="t_cost.csv", plot="cost.pdf")
sol_report_power_summary(sol_surr, surrogate_dist; td_coupling=true, out_dir=surrogate_subdir, table="t_power.csv", plot="power.pdf")
sol_report_gen(sol_surr, surrogate_dist; out_dir=surrogate_subdir, table="t_gen.csv", plot="gen.pdf")
sol_report_load_summary(sol_surr, surrogate_dist; out_dir=surrogate_subdir, table="t_load_summary.csv", plot="load_summary.pdf")
sol_report_storage_summary(sol_surr, surrogate_dist; out_dir=surrogate_subdir, table="t_storage_summary.csv", plot="storage_summary.pdf")
end
if report_result
info(_LOGGER, "Reporting results of T&D decoupling procedure...")
result_dir = mkpath(joinpath(out_dir, "result"))
t_sol = result_decoupling["t_solution"]
t_subdir = mkpath(joinpath(result_dir, "transmission"))
sol_report_cost_summary(t_sol, t_data; out_dir=t_subdir, table="t_cost.csv", plot="cost.pdf")
sol_report_power_summary(t_sol, t_data; out_dir=t_subdir, table="t_power.csv", plot="power.pdf")
sol_report_branch(t_sol, t_data, out_dir=t_subdir, table="t_branch.csv", plot="branch.pdf")
sol_report_bus_voltage_angle(t_sol, t_data; out_dir=t_subdir, table="t_bus.csv", plot="bus.pdf")
sol_report_gen(t_sol, t_data; out_dir=t_subdir, table="t_gen.csv", plot="gen.pdf")
sol_report_load(t_sol, t_data; out_dir=t_subdir, table="t_load.csv", plot="load.pdf")
sol_report_load_summary(t_sol, t_data; out_dir=t_subdir, table="t_load_summary.csv", plot="load_summary.pdf")
sol_report_investment(t_sol, t_data; out_dir=t_subdir, table="t_investment.csv")
sol_report_investment_summary(t_sol, t_data; out_dir=t_subdir, table="t_investment_summary.csv", plot="investment_summary.pdf")
sol_report_storage(t_sol, t_data; out_dir=t_subdir, table="t_storage.csv", plot="storage.pdf")
sol_report_storage_summary(t_sol, t_data; out_dir=t_subdir, table="t_storage_summary.csv", plot="storage_summary.pdf")
sol_graph(t_sol, t_data; plot="map.pdf", out_dir=t_subdir, hour=1) # Just as an example; dimension coordinates can also be vectors, or be omitted, in which case one plot for each coordinate will be generated.
for (s,sol) in enumerate(result_decoupling["d_solution"])
subdir = mkpath(joinpath(result_dir, "distribution_$s"))
sol_report_cost_summary(sol, d_data[s]; td_coupling=false, out_dir=subdir, table="t_cost.csv", plot="cost.pdf") # `td_coupling=false` because even if data dictionary specifies a positive cost it must not be considered.
sol_report_power_summary(sol, d_data[s]; td_coupling=true, out_dir=subdir, table="t_power.csv", plot="power.pdf")
sol_report_branch(sol, d_data[s]; rated_power_scale_factor=cos(π/8), out_dir=subdir, table="t_branch.csv", plot="branch.pdf") # `cos(π/8)` is due to octagonal approximation of apparent power in `_FP.BFARadPowerModel`
sol_report_bus_voltage_magnitude(sol, d_data[s]; out_dir=subdir, table="t_bus.csv", plot="bus.pdf")
sol_report_gen(sol, d_data[s]; out_dir=subdir, table="t_gen.csv", plot="gen.pdf")
sol_report_load(sol, d_data[s]; out_dir=subdir, table="t_load.csv", plot="load.pdf")
sol_report_load_summary(sol, d_data[s]; out_dir=subdir, table="t_load_summary.csv", plot="load_summary.pdf")
sol_report_investment(sol, d_data[s]; out_dir=subdir, table="t_investment.csv")
sol_report_investment_summary(sol, d_data[s]; out_dir=subdir, table="t_investment_summary.csv", plot="investment_summary.pdf")
sol_report_storage(sol, d_data[s]; out_dir=subdir, table="t_storage.csv", plot="storage.pdf")
sol_report_storage_summary(sol, d_data[s]; out_dir=subdir, table="t_storage_summary.csv", plot="storage_summary.pdf")
sol_graph(sol, d_data[s]; plot="map.pdf", out_dir=subdir, hour=1) # Just as an example; dimension coordinates can also be vectors, or be omitted, in which case one plot for each coordinate will be generated.
end
end
## Compare with combined T&D model
if compare_with_combined_td_model
info(_LOGGER, "Solving planning problem using combined T&D model...")
result_combined = _FP.solve_model(
t_data, d_data, t_model_type, d_model_type, optimizer_mt, build_method;
t_ref_extensions, d_ref_extensions, t_solution_processors, d_solution_processors, t_setting, d_setting, direct_model
)
info(_LOGGER, "Solution of combined T&D model took $(round(result_combined["solve_time"]; sigdigits=3)) seconds")
obj_combined = result_combined["objective"]
obj_decoupling = result_decoupling["objective"]
diff = obj_decoupling - obj_combined
ratio = obj_decoupling / obj_combined
digits = max(1, ceil(Int,-log10(abs(diff)))+1)
info(_LOGGER, "Combined T&D model objective: $(round(obj_combined; digits))")
info(_LOGGER, " T&D decoupling objective: $(round(obj_decoupling; digits)) (signed relative error: $(round((ratio-1); sigdigits=2)))")
if diff < 0
warn(_LOGGER, "T&D decoupling objective is less than that of combined T&D model. This should not happen!")
end
end
notice(_LOGGER, "Test completed in $(round(time()-time_start;sigdigits=3)) seconds." * ((report_result || report_intermediate_results) ? " Results saved in $out_dir" : ""))
# FlexPlan.jl changelog
All notable changes to FlexPlan.jl will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [Unreleased]
### Changed
- Upgrade dependency: PowerModelsACDC v0.7 (v0.6 still allowed)
## [0.3.1] - 2023-06-14
### Added
- This changelog
- Ability to choose period duration when importing JSON files
## [0.3.0] - 2022-12-19
For changes up to 0.3.0 refer to
[GitHub Releases page](https://github.com/Electa-Git/FlexPlan.jl/releases/).
[unreleased]: https://github.com/Electa-Git/FlexPlan.jl/compare/v0.3.1...HEAD
[0.3.1]: https://github.com/Electa-Git/FlexPlan.jl/compare/v0.3.0...v0.3.1
[0.3.0]: https://github.com/Electa-Git/FlexPlan.jl/releases/tag/v0.3.0
# FlexPlan.jl
Status:
[](https://github.com/Electa-Git/FlexPlan.jl/actions?query=workflow%3ACI)
<a href="https://codecov.io/gh/Electa-Git/FlexPlan.jl"><img src="https://img.shields.io/codecov/c/github/Electa-Git/FlexPlan.jl?logo=Codecov"></img></a>
<a href="https://electa-git.github.io/FlexPlan.jl/dev/"><img src="https://github.com/Electa-Git/FlexPlan.jl/workflows/Documentation/badge.svg"></img></a>
[](https://zenodo.org/badge/latestdoi/293785598)
## Overview
FlexPlan.jl is a Julia/JuMP package to carry out transmission and distribution network planning considering AC and DC technology, storage and demand flexibility as possible expansion candidates.
Using time series input on renewable generation and demand, as well as a list of candidates for grid expansion, a mixed-integer linear problem is constructed which can be solved with any commercial or open-source MILP solver.
The package builds upon the [PowerModels](https://github.com/lanl-ansi/PowerModels.jl) and [PowerModelsACDC](https://github.com/Electa-Git/PowerModelsACDC.jl) packages, and uses a similar structure.
Modelling features provided by the package include:
- Joint multistage, multiperiod formulation to model a number of planning years, and planning hours within years for a sequential grid expansion plan.
- Stochastic formulation of the planning problem, based on scenario probabilities for a number of different time series.
- Extensive, parametrized models for storage, demand flexibility and DC grids.
- Linearized DistFlow model for radial distribution networks, considering reactive power and voltage magnitudes.
- Support of networks composed of transmission and distribution (T&D), with the possibility of using two different power flow models.
- Heuristic procedure for efficient, near-optimal planning of T&D networks.
- Basic implementations of Benders decomposition algorithm to efficiently solve the stochastic planning problem.
## Documentation
The package [documentation](https://electa-git.github.io/FlexPlan.jl/dev/) includes useful information comprising links to [example scripts](https://electa-git.github.io/FlexPlan.jl/dev/examples/) and a [tutorial](https://electa-git.github.io/FlexPlan.jl/dev/tutorial/).
Additionally, these presentations provide a brief introduction to various aspects of FlexPlan:
- Network expansion planning with FlexPlan.jl [[PDF](/docs/src/assets/20230216_flexplan_seminar_energyville.pdf)] – EnergyVille, 16/02/2023
- FlexPlan.jl – An open-source Julia tool for holistic transmission and distribution grid planning [[PDF](/docs/src/assets/20230328_osmses2023_conference.pdf)] – OSMSES 2023 conference, Aachen, 28/03/2023
All notable changes to the source code are documented in the [changelog](/CHANGELOG.md).
## Installation of FlexPlan
From Julia, FlexPlan can be installed using the built-in package manager:
```julia
using Pkg
Pkg.add("FlexPlan")
```
## Development
FlexPlan.jl is research-grade software and is constantly being improved and extended.
If you have suggestions for improvement, please contact us via the Issues page on the repository.
## Acknowledgements
This code has been developed as part of the European Union’s Horizon 2020 research and innovation programme under the FlexPlan project (grant agreement no. 863819).
Developed by:
- Hakan Ergun (KU Leuven / EnergyVille)
- Matteo Rossini (RSE)
- Marco Rossi (RSE)
- Damien Lepage (N-Side)
- Iver Bakken Sperstad (SINTEF)
- Espen Flo Bødal (SINTEF)
- Merkebu Zenebe Degefa (SINTEF)
- Reinhilde D'Hulst (VITO / EnergyVille)
The developers thank Carleton Coffrin (Los Alamos National Laboratory) for his countless design tips.
## Citing FlexPlan.jl
If you find FlexPlan.jl useful in your work, we kindly request that you cite the following [publication](https://doi.org/10.1109/osmses58477.2023.10089624) ([preprint](https://doi.org/10.5281/zenodo.7705908)):
```bibtex
@inproceedings{FlexPlan.jl,
author = {Matteo Rossini and Hakan Ergun and Marco Rossi},
title = {{FlexPlan}.jl – An open-source {Julia} tool for holistic transmission and distribution grid planning},
booktitle = {2023 Open Source Modelling and Simulation of Energy Systems ({OSMSES})},
year = {2023},
month = {mar},
publisher = {{IEEE}},
doi = {10.1109/osmses58477.2023.10089624},
url = {https://doi.org/10.1109/osmses58477.2023.10089624}
}
```
## License
This code is provided under a [BSD 3-Clause License](/LICENSE.md).
# Documentation for FlexPlan.jl
You can read this documentation online at <https://electa-git.github.io/FlexPlan.jl/dev/>.
## Preview the documentation (for developers)
While developing FlexPlan you can also preview the documentation locally in your browser
with live-reload capability, i.e. when modifying a file, every browser (tab) currently
displaying the corresponding page is automatically refreshed.
### Instructions for *nix
1. Copy the following zsh/Julia code snippet:
```julia
#!/bin/zsh
#= # Following line is zsh code
julia -i $0:a # The string `$0:a` represents this file in zsh
=# # Following lines are Julia code
import Pkg
Pkg.activate(; temp=true)
Pkg.develop("FlexPlan")
Pkg.add("Documenter")
Pkg.add("LiveServer")
using FlexPlan, LiveServer
cd(dirname(dirname(pathof(FlexPlan))))
servedocs()
exit()
```
2. Save it as a zsh script (name it like `preview_flexplan_docs.sh`).
3. Assign execute permission to the script: `chmod u+x preview_flexplan_docs.sh`.
4. Run the script.
5. Open your favorite web browser and navigate to `http://localhost:8000`.
# Data model
FlexPlan.jl extends the data models of the `PowerModels.jl` and `PowerModelsACDC.jl` packages with:
- candidate storage devices (`:ne_storage`);
- additional fields, extending `:load`, that parametrize the demand flexibility models;
- additional parameters for both existing and candidate storage devices representing external charging and discharging of storage, e.g. natural inflow and dissipation of water in hydro storage;
- additional parameters, extending `:gen`, for the air quality impact and CO2 emission costs of generators.
For the full data model please consult the FlexPlan deliverable 1.2 ["Probabilistic optimization of T&D systems planning with high grid flexibility and its scalability"](https://flexplan-project.eu/wp-content/uploads/2022/08/D1.2_20220801_V2.0.pdf)
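As an illustration, the sketch below shows how such extensions fit into the usual PowerModels-style nested data dictionary. The component categories follow the text above, but the inner field names are hypothetical placeholders, not the authoritative schema — consult the deliverable above for the actual field list.

```julia
# Illustrative sketch only: component categories follow the text above, but
# the inner field names are hypothetical placeholders, not the authoritative
# FlexPlan schema (see deliverable D1.2 for the real field list).
data = Dict{String,Any}(
    "ne_storage" => Dict{String,Any}(          # candidate storage devices
        "1" => Dict{String,Any}(
            "storage_bus"   => 3,              # connection bus
            "energy_rating" => 2.0,            # energy capacity (p.u.)
            "inst_cost"     => 1.0e5,          # investment cost (hypothetical name)
        ),
    ),
    "load" => Dict{String,Any}(                # loads with demand flexibility fields
        "1" => Dict{String,Any}(
            "load_bus" => 5,
            "pd"       => 1.2,                 # nominal active demand (p.u.)
            "flex"     => 1,                   # candidate for flexibility investment
        ),
    ),
    "gen" => Dict{String,Any}(                 # generators with emission cost fields
        "1" => Dict{String,Any}(
            "gen_bus"  => 1,
            "co2_cost" => 40.0,                # CO2 emission cost (hypothetical name)
        ),
    ),
)
```

Existing and candidate components use the same `"index" => Dict` layout as the standard PowerModels categories.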
```bibtex
@article{ergun2021probabilistic,
title={Probabilistic optimization of T\&D systems planning with high grid flexibility and its scalability},
author={Ergun, Hakan and Sperstad, Iver Bakken and Espen Flo, B{\o}dal and Siface, Dario and Pirovano, Guido and Rossi, Marco and Rossini, Matteo and Marmiroli, Benedetta and Agresti, Valentina and Costa, Matteo Paolo and others},
year={2021}
}
# Model dimensions
All the optimization problems modeled in FlexPlan are multiperiod and make use of the following _dimensions_:
- `hour`: the finest time granularity that can be represented in a model.
During an hour, each continuous variable has a constant value.
- `year`: an investment period.
Different investment decisions can be made in different years.
- `scenario`: one of the different possible sets of values related to renewable generation and consumption data.
These dimensions must be defined in each model by calling the function `add_dimension!` on single-period data dictionaries.
The `add_dimension!` function takes the dimension name as a key, together with either an integer value (e.g. the number of hours or years) or a dictionary (e.g. one entry per scenario). For scenario input, probabilities and other metadata can be added. An example:
```julia
_FP.add_dimension!(data, :hour, number_of_hours) # Add dimension, e.g. number of hours
_FP.add_dimension!(data, :scenario, Dict(1 => Dict{String,Any}("probability"=>1)), metadata = Dict{String,Any}("mc"=>true)) # Add dimension, e.g. number of scenarios
_FP.add_dimension!(data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>1)) # Add dimension of years, using cost scaling factors in metadata
```
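The registered dimension sizes determine how many single-period networks the resulting multinetwork contains — one per (hour, scenario, year) combination. A plain-Julia sketch of the arithmetic, with arbitrary example sizes:

```julia
# Arbitrary example sizes; their product gives the number of single-period
# networks in the multinetwork dictionary built over these dimensions.
number_of_hours     = 24
number_of_scenarios = 2
number_of_years     = 3

total_networks = number_of_hours * number_of_scenarios * number_of_years  # 144
```

Helper functions such as `_FP.dim_length(data, :scenario)` can be used to query these sizes from the data dictionary instead of hard-coding them.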
# Examples
Some scripts are provided in [`/examples/`](https://github.com/Electa-Git/FlexPlan.jl/tree/master/examples) and in [`/test/scripts/`](https://github.com/Electa-Git/FlexPlan.jl/tree/master/test/scripts) to test the package functionality.
## How to run scripts
To run the above scripts, you need to activate an environment and import all the needed packages.
1. In a Julia REPL, choose a directory where to create the environment:
```
julia> cd("path/to/env/dir")
```
2. Enter the Pkg REPL by pressing `]` from the Julia REPL:
```
julia> ]
```
3. Activate the environment:
```
pkg> activate .
```
4. `add` the FlexPlan package:
```
pkg> add FlexPlan
```
5. `add` every package required by the script.
For example, if the script contains `import Plots`, then execute
```
pkg> add Plots
```
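The interactive steps above can also be scripted with the `Pkg` API; a minimal sketch (using a throwaway temporary environment here instead of a fixed directory):

```julia
import Pkg

# Activate a fresh environment (substitute your own "path/to/env/dir").
Pkg.activate(mktempdir())

# Then add the packages the script needs, e.g.:
# Pkg.add("FlexPlan")
# Pkg.add("Plots")
```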
# FlexPlan.jl Documentation
```@meta
CurrentModule = FlexPlan
```
## Overview
FlexPlan.jl is a Julia/JuMP package to carry out transmission and distribution network planning considering AC and DC technology, storage and demand flexibility as possible expansion candidates.
Using time series input on renewable generation and demand, as well as a list of candidates for grid expansion, a mixed-integer linear problem is constructed which can be solved with any commercial or open-source MILP solver.
The package builds upon the [PowerModels](https://github.com/lanl-ansi/PowerModels.jl) and [PowerModelsACDC](https://github.com/Electa-Git/PowerModelsACDC.jl) packages, and uses a similar structure.
Modelling features provided by the package include:
- Joint multistage, multiperiod formulation to model a number of planning years, and planning hours within years for a sequential grid expansion plan.
- Stochastic formulation of the planning problem, based on scenario probabilities for a number of different time series.
- Extensive, parametrized models for storage, demand flexibility and DC grids.
- Linearized DistFlow model for radial distribution networks, considering reactive power and voltage magnitudes.
- Support of networks composed of transmission and distribution (T&D), with the possibility of using two different power flow models.
- Heuristic procedure for efficient, near-optimal planning of T&D networks.
- Basic implementations of the Benders decomposition algorithm to efficiently solve the stochastic planning problem.
These presentations provide a brief introduction to various aspects of FlexPlan:
- Network expansion planning with FlexPlan.jl [[PDF](./assets/20230216_flexplan_seminar_energyville.pdf)] – EnergyVille, 16/02/2023
- FlexPlan.jl – An open-source Julia tool for holistic transmission and distribution grid planning [[PDF](./assets/20230328_osmses2023_conference.pdf)] – OSMSES 2023 conference, Aachen, 28/03/2023
## Acknowledgements
This code has been developed as part of the European Union’s Horizon 2020 research and innovation programme under the FlexPlan project (grant agreement no. 863819).
Developed by:
- Hakan Ergun (KU Leuven / EnergyVille)
- Matteo Rossini (RSE)
- Marco Rossi (RSE)
- Damien Lepage (N-Side)
- Iver Bakken Sperstad (SINTEF)
- Espen Flo Bødal (SINTEF)
- Merkebu Zenebe Degefa (SINTEF)
- Reinhilde D'Hulst (VITO / EnergyVille)
The developers thank Carleton Coffrin (Los Alamos National Laboratory) for his countless design tips.
## Citing FlexPlan.jl
If you find FlexPlan.jl useful in your work, we kindly request that you cite the following [publication](https://doi.org/10.1109/osmses58477.2023.10089624) ([preprint](https://doi.org/10.5281/zenodo.7705908)):
```bibtex
@inproceedings{FlexPlan.jl,
author = {Matteo Rossini and Hakan Ergun and Marco Rossi},
title = {{FlexPlan}.jl – An open-source {Julia} tool for holistic transmission and distribution grid planning},
booktitle = {2023 Open Source Modelling and Simulation of Energy Systems ({OSMSES})},
year = {2023},
month = {mar},
publisher = {{IEEE}},
doi = {10.1109/osmses58477.2023.10089624},
url = {https://doi.org/10.1109/osmses58477.2023.10089624}
}
```
## License
This code is provided under a [BSD 3-Clause License](https://github.com/Electa-Git/FlexPlan.jl/blob/master/LICENSE.md).
# Installation of FlexPlan
From Julia, FlexPlan can be installed using the built-in package manager:
```julia
using Pkg
Pkg.add("FlexPlan")
```
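You can then verify the installation from the same session (standard Pkg API):

```julia
using Pkg
Pkg.status("FlexPlan")  # shows the installed version
using FlexPlan          # loads the package
```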
# Modeling assumptions
## Multiple-year models
When preparing data for problems spanning a multi-year horizon (here the word _year_ indicates an investment period: different investment decisions can be made in different years), investment candidates must adhere to the following two assumptions:
1. If a candidate exists in a year, then it exists in all subsequent years and is defined in the same row of the corresponding table in the input data files.
2. Each candidate has the same parameters in all the years in which it exists, except for the cost which may vary with the years.
These assumptions are used not only when parsing input data files, but also in some variables/constraints where an investment candidate must be tracked along years.
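As a purely illustrative sketch (the field names are hypothetical, not the actual FlexPlan input schema), the two assumptions mean that a candidate carries one set of technical parameters while only its cost is indexed by year:

```julia
# Hypothetical candidate record, for illustration only.
candidate = Dict(
    "rating" => 1.0,                   # same technical parameters in every year (assumption 2)
    "cost"   => [120.0, 100.0, 85.0],  # investment cost may vary with the year
)
```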
# Network formulations
Two different network formulations have been used in the FlexPlan package:
- `PowerModels.DCPPowerModel` is a linearised 'DC' power flow formulation that represents meshed AC/DC transmission networks;
- `FlexPlan.BFARadPowerModel` is a linearised 'DistFlow' formulation that represents radial AC distribution networks.
For the comprehensive formulation of the network equations, along with the detailed model for storage and demand flexibility, the reader is referred to the FlexPlan deliverable 1.2 ["Probabilistic optimization of T&D systems planning with high grid flexibility and its scalability"](https://flexplan-project.eu/wp-content/uploads/2022/08/D1.2_20220801_V2.0.pdf):
```bibtex
@article{ergun2022probabilistic,
title={Probabilistic optimization of T\&D systems planning with high grid flexibility and its scalability},
author={Ergun, Hakan and Sperstad, Iver Bakken and Espen Flo, B{\o}dal and Siface, Dario and Pirovano, Guido and Rossi, Marco and Rossini, Matteo and Marmiroli, Benedetta and Agresti, Valentina and Costa, Matteo Paolo and others},
year={2022}
}
```
# Problem types
The FlexPlan.jl package contains the following problem types:
## T(D)NEP problem with storage candidates
This problem solves the AC/DC grid TNEP problem considering both existing and candidate storage devices. As such, starting from an AC (and, optionally, DC) network with existing storage devices, the optimisation problem finds the best AC and DC grid investments as well as storage investments. The objective function is defined as follows:
Sets:
```math
\begin{aligned}
bc \in BC &- \text{Set of candidate AC lines} \\
dc \in DC &- \text{Set of candidate DC lines} \\
cc \in CC &- \text{Set of candidate DC converters} \\
sc \in SC &- \text{Set of candidate storage devices} \\
g \in G &- \text{Set of generators} \\
t \in T &- \text{Set of planning hours} \\
y \in Y &- \text{Set of planning years} \\
\end{aligned}
```
Variables & parameters:
```math
\begin{aligned}
\alpha_{bc, y} &- \text{Binary investment decision variable of candidate AC line bc} \\
\alpha_{dc, y} &- \text{Binary investment decision variable of candidate DC line dc}\\
\alpha_{cc, y} &- \text{Binary investment decision variable of candidate DC converter cc}\\
\alpha_{sc, y} &- \text{Binary investment decision variable of candidate storage sc} \\
P_{g} &- \text{Active power output of generator g} \\
C_{bc, y} &- \text{Investment cost of candidate AC line bc}\\
C_{dc, y} &- \text{Investment cost of candidate DC line dc} \\
C_{cc, y} &- \text{Investment cost of candidate DC converter cc}\\
C_{sc, y} &- \text{Investment cost of candidate storage sc} \\
\end{aligned}
```
```math
min~\sum_{y \in Y} \left[ \sum_{bc \in BC} C_{bc,y}\alpha_{bc, y} + \sum_{dc \in DC} C_{dc,y}\alpha_{dc, y} + \sum_{cc \in CC} C_{cc,y}\alpha_{cc, y} + \sum_{sc \in SC} C_{sc,y}\alpha_{sc, y} + \sum_{t \in T}~ \sum_{g \in G} C_{g,t,y}P_{g,t,y} \right]
```
The problem is defined both for transmission networks, using the linearised 'DC' power flow model, and for radial distribution grids, using the linearised 'DistFlow' formulation. The problem can be solved using the following function:
```julia
result_tnep = FlexPlan.strg_tnep(data, PowerModels.DCPPowerModel, solver; setting)
result_dnep = FlexPlan.strg_tnep(data, FlexPlan.BFARadPowerModel, solver; setting)
```
## TNEP problem with storage candidates and demand flexibility (Flexible T(D)NEP)
This problem solves the AC/DC grid TNEP problem considering existing and candidate storage devices as well as demand flexibility. As such, starting from an AC (and, optionally, DC) network with existing storage devices, the optimisation problem finds the best AC and DC grid investments as well as storage and demand flexibility investments. The objective function extends that of the TNEP problem with storage candidates as follows:
```math
min~\sum_{y \in Y} \left[ \sum_{bc \in BC} C_{bc,y}\alpha_{bc, y} + \sum_{dc \in DC} C_{dc,y}\alpha_{dc, y} + \sum_{cc \in CC} C_{cc,y}\alpha_{cc, y} + \sum_{sc \in SC} C_{sc,y}\alpha_{sc, y} + \sum_{t \in T}~ \sum_{g \in G} C_{g,t,y}P_{g,t,y} + \sum_{t \in T}~ \sum_{fc \in FC} \left( C_{fc,t,y}^{up}P_{fc,t,y}^{up} + C_{fc,t,y}^{down}P_{fc,t,y}^{down} + C_{fc,t,y}^{red}P_{fc,t,y}^{red} + C_{fc,t,y}^{curt}P_{fc,t,y}^{curt} \right)\right]
```
Sets:
```math
\begin{aligned}
fc \in FC &- \text{Set of demand flexibility investments} \\
\end{aligned}
```
Variables & parameters:
```math
\begin{aligned}
\alpha_{fc, y} &- \text{Binary investment decision variable for demand flexibility} \\
P_{fc}^{up} &- \text{Upwards demand shifting for flexible demand fc} \\
P_{fc}^{down} &- \text{Downwards demand shifting for flexible demand fc} \\
P_{fc}^{red} &- \text{Demand reduction for flexible demand fc} \\
P_{fc}^{curt} &- \text{Demand curtailment for flexible demand fc} \\
C_{fc}^{up} &- \text{Cost of upwards demand shifting for flexible demand fc} \\
C_{fc}^{down} &- \text{Cost of downwards demand shifting for flexible demand fc} \\
C_{fc}^{red} &- \text{Cost of voluntary demand reduction for flexible demand fc} \\
C_{fc}^{curt} &- \text{Cost of involuntary demand curtailment for flexible demand fc} \\
\end{aligned}
```
The problem is defined both for transmission networks, using the linearised 'DC' power flow model, and for radial distribution grids, using the linearised 'DistFlow' formulation. The problem can be solved using the following function:
```julia
result_tnep = FlexPlan.flex_tnep(data, PowerModels.DCPPowerModel, solver; setting)
result_dnep = FlexPlan.flex_tnep(data, FlexPlan.BFARadPowerModel, solver; setting)
```
Additionally, this particular problem can also be solved for both transmission and distribution networks combined, using specific data for both the transmission and the distribution network:
```julia
result_t_and_d_nep = FlexPlan.flex_tnep(t_data, d_data, PowerModels.DCPPowerModel, FlexPlan.BFARadPowerModel, solver; setting)
```
## Stochastic flexible T(D)NEP
This problem type extends the multi-year, multi-hour planning problem to a number of scenarios, e.g., variations of the planning year, and optimizes the investments taking into account the explicit scenario probabilities. As such, the objective is extended as follows w.r.t. the flexible T(D)NEP problem:
Sets:
```math
\begin{aligned}
s \in S &- \text{Set of planning scenarios} \\
\end{aligned}
```
Parameters:
```math
\begin{aligned}
\pi_{s} &- \text{Probability of scenario s} \\
\end{aligned}
```
```math
min~\sum_{s \in S} \pi_{s} \left\{ \sum_{y \in Y} \left[ \sum_{bc \in BC} C_{bc,y}\alpha_{bc, y} + \sum_{dc \in DC} C_{dc,y}\alpha_{dc,y} + \sum_{cc \in CC} C_{cc,y}\alpha_{cc,y} + \sum_{sc \in SC} C_{sc,y}\alpha_{sc,y} + \sum_{t \in T}~ \sum_{g \in G} C_{g,t,y,s}P_{g,t,y,s} + \sum_{t \in T}~ \sum_{fc \in FC} \left( C_{fc,t,y,s}^{up}P_{fc,t,y,s}^{up} + C_{fc,t,y,s}^{down}P_{fc,t,y,s}^{down} + C_{fc,t,y,s}^{red}P_{fc,t,y,s}^{red} + C_{fc,t,y,s}^{curt}P_{fc,t,y,s}^{curt} \right)\right] \right\}
```
The problem is defined both for transmission networks, using the linearised 'DC' power flow model, and for radial distribution grids, using the linearised 'DistFlow' formulation. The problem can be solved using the following function:
```julia
result_tnep = FlexPlan.stoch_flex_tnep(data, PowerModels.DCPPowerModel, solver; setting)
result_dnep = FlexPlan.stoch_flex_tnep(data, FlexPlan.BFARadPowerModel, solver; setting)
```
# Tutorial
This page shows how to define and solve network planning problems using FlexPlan.
!!! tip
Before following this tutorial you might want to have a look at some [examples](@ref Examples).
## 1. Import packages
FlexPlan builds on [PowerModels](https://github.com/lanl-ansi/PowerModels.jl) and [PowerModelsACDC](https://github.com/Electa-Git/PowerModelsACDC.jl) packages.
You can declare the packages as follows, and use short names to access specific functions without having to type the full package name every time.
```julia
import PowerModels as _PM
import PowerModelsACDC as _PMACDC
import FlexPlan as _FP
```
Any additional package you might need, e.g., for printing, plotting, or exporting results, can be declared in the same way.
Also, the solution of the optimization problem will require an MILP solver.
As FlexPlan depends on the [JuMP](https://github.com/jump-dev/JuMP.jl) package, it can be interfaced with any of the [optimisation solvers supported by JuMP](https://jump.dev/JuMP.jl/stable/installation/#Supported-solvers).
You can declare and initialize the solver as follows:
```julia
import HiGHS
optimizer = _FP.optimizer_with_attributes(HiGHS.Optimizer, "output_flag"=>false)
```
!!! tip
    FlexPlan exports the `JuMP.optimizer_with_attributes` function, so you don't have to import JuMP just to use this function.
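Any other JuMP-compatible MILP solver can be configured the same way; for example, with the open-source Cbc solver (shown only as an alternative, assuming Cbc.jl is installed):

```julia
import Cbc
optimizer = _FP.optimizer_with_attributes(Cbc.Optimizer, "logLevel" => 0)
```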
## 2. Input data
The data model of FlexPlan is very similar to the ones of PowerModels/PowerModelsACDC.
As such, a data dictionary containing all information is passed to the optimisation problem.
The standard network elements such as generators, buses, branches, etc. are extended with the existing and candidate storage and demand flexibility elements (see section [Data model](@ref) for complete description).
The multi-network modelling functionality of PowerModels is used to represent the different scenarios, planning years and planning hours within each year.
The procedure is further explained under section [Model dimensions](@ref).
### FlexPlan.jl sample data
The package contains some sample test cases comprising both grid data and multi-scenario time series, located under [`/test/data/`](https://github.com/Electa-Git/FlexPlan.jl/tree/master/test/data) and named as its subdirectories.
These test cases have been used for the validation of the model in the FlexPlan deliverable 1.2 ["Probabilistic optimization of T&D systems planning with high grid flexibility and its scalability"](https://flexplan-project.eu/wp-content/uploads/2022/08/D1.2_20220801_V2.0.pdf).
[`/test/io/load_case.jl`](https://github.com/Electa-Git/FlexPlan.jl/blob/master/test/io/load_case.jl) provides functions to load such test cases.
The functions are named `load_*` where `*` is the name of a test case.
For example, `case6` can be loaded using:
```julia
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
include(joinpath(_FP_dir,"test/io/load_case.jl"))
data = load_case6(; number_of_hours=24, number_of_scenarios=1, number_of_years=1)
```
Supported parameters are explained in `load_*` function documentation: in a Julia REPL, type `?` followed by a function name to read its documentation.
### Using your own data
FlexPlan provides functions that facilitate the construction of a multinetwork data dictionary using:
- network data from Matpower-like `.m` files;
- time series data from dictionaries of vectors, each vector being a time series.
The procedure is as follows.
1. Create a single-network data dictionary.
1. Load network data from Matpower-like `.m` files (see e.g. [`/test/data/case6/case6_2030.m`](https://github.com/Electa-Git/FlexPlan.jl/blob/master/test/data/case6/case6_2030.m)). Use `parse_file`.
2. Specify the dimensions of the data. Use `add_dimension!`.
3. Scale costs and lifetime of grid expansion elements. Use `scale_data!`.
2. Create a dictionary of vectors that contains time series. You have to write your own code for performing this step.
3. Create a multinetwork data dictionary by combining the single-network data dictionary and the time series. Use `make_multinetwork`.
Here is some sample code to get started:
```julia
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
sn_data = _FP.parse_file(joinpath(_FP_dir,"test/data/case6/case6_2030.m"))
_FP.add_dimension!(sn_data, :hour, 24)
_FP.add_dimension!(sn_data, :scenario, Dict(1 => Dict{String,Any}("probability"=>1)))
_FP.add_dimension!(sn_data, :year, 1; metadata = Dict{String,Any}("scale_factor"=>1))
_FP.scale_data!(sn_data)
include(joinpath(_FP_dir,"test/io/create_profile.jl")) # Functions to load sample time series. Use your own instead.
sn_data, loadprofile, genprofile = create_profile_data_italy!(sn_data)
time_series = create_profile_data(24, sn_data, loadprofile, genprofile) # Your time series should have the same format as this `time_series` dict
mn_data = _FP.make_multinetwork(sn_data, time_series)
```
### Coupling of transmission and distribution networks
FlexPlan provides the possibility of coupling multiple radial distribution networks to the transmission system, for solving the combined T&D grid expansion problem.
For the meshed transmission system the linearized 'DC' power flow formulation is used, whereas radial networks are modelled using the linearized DistFlow model (more information can be found under [Network formulations](@ref) section).
Input data consist of:
- one dictionary for the transmission network;
- a vector of dictionaries, each item representing one distribution network.
The only difference with respect to the case of a single network is that for each distribution network it is necessary to specify which bus of the transmission network it is to be attached to.
This is done by adding a `t_bus` key in the distribution network dictionary.
Here is an example (using FlexPlan sample data):
```julia
number_of_hours = 4
number_of_scenarios = 2
number_of_years = 1
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
include(joinpath(_FP_dir, "test/io/load_case.jl"))
# Transmission network data
t_data = load_case6(; number_of_hours, number_of_scenarios, number_of_years)
# Distribution network 1 data
d_data_sub_1 = load_ieee_33(; number_of_hours, number_of_scenarios, number_of_years)
d_data_sub_1["t_bus"] = 3 # States that this distribution network is attached to bus 3 of the transmission network
# Distribution network 2 data
d_data_sub_2 = deepcopy(d_data_sub_1)
d_data_sub_2["t_bus"] = 6
d_data = [d_data_sub_1, d_data_sub_2]
```
## 3. Solve the problem
Finally, the problem can be solved using (example of a stochastic planning problem with storage & demand flexibility candidates):
```julia
result = _FP.simple_stoch_flex_tnep(data, _PM.DCPPowerModel, optimizer; setting=Dict("conv_losses_mp"=>false))
```
or, for the combined T&D model:
```julia
result = _FP.simple_stoch_flex_tnep(t_data, d_data, _PM.DCPPowerModel, _FP.BFARadPowerModel, optimizer; t_setting=Dict("conv_losses_mp"=>false))
```
For other possible problem types and decomposed models, please check the [Problem types](@ref) section.
## 4. Inspect your results
The optimization results are returned as a Julia `Dict`, so you can easily write your custom code to retrieve the results you need.
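For example, since the result follows the PowerModels-style multinetwork layout, you can explore it generically before writing problem-specific code (a sketch; the component keys present depend on the problem type):

```julia
sol = result["solution"]["nw"]  # one sub-dictionary per network (hour/scenario/year combination)
for (n, network) in sort(collect(sol); by = p -> parse(Int, first(p)))
    println("network $n components: ", join(sort(collect(keys(network))), ", "))
end
```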
However some basic functions for displaying and exporting results are provided in the package.
### Check termination status
The first thing to do is to check the [termination status](https://jump.dev/JuMP.jl/stable/moi/manual/solutions/) of the solver to make sure that an optimal solution has been found.
You may want to check the value of `"termination_status"` like this:
```julia
@assert result["termination_status"] ∈ (_FP.OPTIMAL, _FP.LOCALLY_SOLVED) "$(result["optimizer"]) termination status: $(result["termination_status"])"
```
!!! tip
FlexPlan exports JuMP's `TerminationStatusCode` and `ResultStatusCode`, so you can access these types as above, without having to import JuMP just for that.
### Check solve time
The total solve time is also available, under `"solve_time"`:
```julia
println("Total solve time: $(round(result["solve_time"], digits=1)) seconds.")
```
### Access solution
To obtain power flow results, you can use the `print_summary` and `component_table` functions of PowerModels.
Further, several functions are provided to access to optimal investments and costs by category, view power profiles, and display the network topology.
Generally, they return a [DataFrame](https://dataframes.juliadata.org/stable/).
They also allow you to save numerical results as CSV files and plots in any format supported by [Plots.jl](https://docs.juliaplots.org/stable/).
All these functions are contained in [`/test/io/sol.jl`](https://github.com/Electa-Git/FlexPlan.jl/blob/master/test/io/sol.jl), but are not part of FlexPlan module to avoid unwanted dependencies.
Include them with
```julia
const _FP_dir = dirname(dirname(pathof(_FP))) # Root directory of FlexPlan package
include(joinpath(_FP_dir,"test/io/sol.jl"))
```
and import the required packages.
using Strategems, Temporal, Indicators, Plots
# define universe and gather data
assets = ["EOD/AAPL", "EOD/MCD", "EOD/JPM", "EOD/MMM", "EOD/XOM"]
universe = Universe(assets)
function datasource(asset::String; save_downloads::Bool=true)::TS
path = joinpath(dirname(pathof(Strategems)), "..", "data", "test", "$asset.csv")
if isfile(path)
return Temporal.tsread(path)
else
X = quandl(asset)
if save_downloads
if !isdir(dirname(path))
mkdir(dirname(path))
end
Temporal.tswrite(X, path)
end
return X
end
end
gather!(universe, source=datasource)
# define indicator and parameter space
function fun(x::TS; args...)::TS
close_prices = x[:Adj_Close]
moving_average = sma(close_prices; args...)
output = [close_prices moving_average]
output.fields = [:Adj_Close, :MA]
return output
end
indicator = Indicator(fun, ParameterSet([:n], [50], [10:5:200]))
# define signals
longsignal = @signal Adj_Close ↑ MA
shortsignal = @signal Adj_Close ↓ MA
# define trading rules
longrule = @rule longsignal → buy 100
shortrule = @rule shortsignal → liquidate 100
# construct and test the strategy
strat = Strategy(universe, indicator, (longrule, shortrule))
backtest!(strat, px_trade=:Adj_Open, px_close=:Adj_Close)
weights, holdings, values, profits = summarize_results(strat)
# plot(holdings, layout=(length(assets),1), color=(1:length(assets))')
# plot(weights[:,1:length(assets)], layout=(length(assets),1), color=(1:length(assets))')
# plot(cumsum(profits), layout=(fld(length(assets)+1,2),2), color=(1:length(assets)+1)')
# Source: Strategems.jl (https://github.com/dysonance/Strategems.jl.git)