Finding a clique using a graph and vertices - python

PYTHON ONLY!!
I have a graph
graph = [[0,1,1,1,0],
[1,0,0,0,0],
[0,1,0,1,1],
[1,0,1,0,1],
[0,0,1,1,0]]
The function clique(graph, vertices) should take an adjacency-matrix
representation of a graph and a list of one or more vertices, and return True if the vertices form a clique
(every person is friends with every other person), otherwise False.
`def clique(graph, vertices)`
I want to find out whether the given vertices form a clique in the graph above.
If yes the output should be True, otherwise False.
e.g. clique(graph, [2, 3, 4]) should return True.
Explanation needed thanks!

https://en.wikipedia.org/wiki/Clique_problem
Here you go. Depending on what problem you ACTUALLY want to solve, this is a starting point for finding an algorithm.
Is a graph a clique? Just check that all nodes are adjacent to each other.
Does it contain a clique? Always true if the graph is non-empty, because a single vertex is already a clique of size one.
Does it contain a clique of size k? Brute-force it.
Find a single maximal clique? A greedy algorithm is possible, as described in the link.
Find all maximal cliques? See the Wikipedia page for references (this is hard).
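For the first case ("is this set of vertices a clique?"), a minimal sketch against the question's matrix could be:

```python
def clique(graph, vertices):
    # A vertex set is a clique when every distinct pair of its
    # vertices is connected in the adjacency matrix.
    return all(graph[u][v] == 1
               for u in vertices for v in vertices if u != v)

graph = [[0, 1, 1, 1, 0],
         [1, 0, 0, 0, 0],
         [0, 1, 0, 1, 1],
         [1, 0, 1, 0, 1],
         [0, 0, 1, 1, 0]]
```

With this graph, `clique(graph, [2, 3, 4])` is True and `clique(graph, [1, 2])` is False (vertex 1's row has a 0 at index 2). A single vertex is trivially a clique, since there are no pairs to check.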

Related

How to generate all directed permutations of an undirected graph?

I am looking for a way to generate all possible directed graphs from an undirected template. For example, given this graph "template":
I want to generate all six of these directed versions:
In other words, for each edge in the template, choose LEFT, RIGHT, or BOTH direction for the resulting edge.
There is a huge number of outputs for even a small graph, because there are 3^E valid permutations (where E is the number of edges in the template graph), but many of them are duplicates (specifically, they are isomorphic to another output). Take these two, for example:
I only need one.
I'm curious first: Is there a term for this operation? Surely this is a formal and well-understood process already?
And second, is there a more efficient algorithm to produce this list? My current code (Python, NetworkX, though that's not important for the question) looks like this, which has two things I don't like:
I generate all permutations even if they are isomorphic to a previous graph
I check isomorphism at the end, so it adds additional computational cost
Results := empty list
T := the template (undirected graph)
For i in range(3^E):
    Create an empty directed graph G
    Convert i to ternary
    For the nth edge (A, B) in T:
        If the nth ternary digit of i is 1:
            Add the edge to G as (A, B)
        If the nth ternary digit of i is 2:
            Add the edge to G as (B, A)
        If the nth ternary digit of i is 0:
            Add both the forward edge (A, B) and the reversed edge (B, A) to G
    For every graph R in Results:
        If G is isomorphic to R, discard G and continue with the next i
    Add G to Results
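The enumeration loop can be sketched in plain Python without any graph library (the `orientations` name and edge-list representation are my own; the isomorphism filtering is deliberately left out, since that is the part the question wants to improve):

```python
from itertools import product

def orientations(edges):
    """Yield every directed version of an undirected edge list.

    Each undirected edge (a, b) becomes (a, b), (b, a), or both,
    giving 3**len(edges) outputs. Isomorphic duplicates are NOT
    removed here.
    """
    for choice in product((0, 1, 2), repeat=len(edges)):
        g = []
        for (a, b), c in zip(edges, choice):
            if c == 0:      # forward edge only
                g.append((a, b))
            elif c == 1:    # reversed edge only
                g.append((b, a))
            else:           # both directions
                g.append((a, b))
                g.append((b, a))
        yield g
```

For a single-edge template this yields exactly the three cases (forward, reversed, both); for E edges it yields 3^E edge lists, matching the count in the question.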

How to improve the performance of this algorithm?

I have the following algorithm:
I have a graph, and for it I have a topological sorting (in graph theory, "a topological sort or topological ordering of a directed graph is a linear ordering of its vertices such that for every directed edge uv from vertex u to vertex v, u comes before v in the ordering").
Given a start_position and an end_position (different from the start_position), I want to verify whether shifting the element of the list that is at start_position to end_position preserves the topological order, i.e. whether after the shift I still have a topological order.
There are two cases: left_shift (if start_position > end_position) and right_shift (otherwise).
Here is my attempt:
from typing import List

def verify(from_position: int, to_position: int, node_list: List[str], instance: pb.Problem):
    if from_position < to_position:
        # right-shift
        for task_temp in node_list[from_position+1:to_position+1]:
            if (node_list[from_position], task_temp) in instance.all_predecessors:
                return False
        return True
    if to_position < from_position:
        # left-shift
        for task_temp in node_list[to_position:from_position]:
            if (task_temp, node_list[from_position]) in instance.all_predecessors:
                return False
        return True
PS: all_predecessors is a set of 2-tuples holding all the edges of the graph.
Is there a way to make it faster?
The naive approach is asymptotically optimal: Just run through the (new) ordering and verify that it satisfies the topological criteria. You can do this by maintaining a bitfield of the nodes encountered so far, and check that each new node’s predecessors are set in the bitfield. This takes linear time in the number of nodes and edges, which any correct algorithm will need in the worst case.
For other variants of the problem (e.g. measuring in the size of the shift, or optimizing per-query time after preprocessing) there might be better approaches.
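That linear-time check can be sketched with a plain set playing the role of the bitfield (`is_topological_order` and its arguments are illustrative names, not from the question's code):

```python
def is_topological_order(order, edges):
    # Group each node's predecessors: for an edge (u, v), u must
    # appear before v in any valid topological ordering.
    preds = {}
    for u, v in edges:
        preds.setdefault(v, set()).add(u)

    # Walk the ordering, remembering the nodes seen so far; every
    # predecessor of each node must already have been encountered.
    seen = set()
    for node in order:
        if not preds.get(node, set()) <= seen:
            return False
        seen.add(node)
    return True
```

A bitfield (one bit per node) gives the same behaviour with less memory traffic; the set version is just the shortest way to show the idea.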

How to Search 100,000 Edges Fast?

I have a network DirectedGraph object G that contains around 20,000 vertices and 120,000 edges.
I'd like to search the edge list and find every edge that ends in the vertex 'deny'. (Oh, the graph vertices are all English words.)
I naively did the following, but it never stops; I have been waiting for more than 10 minutes.
How can I make this fast?
for i in range(len(G.edges())):
if list(G.edges())[i][1] == 'deny':
print(list(G.edges())[i])
You can build a dictionary of <end vertex, list of (start vertex, end vertex)>. However, whether it's worth it depends on how frequently the operation will be performed.
If it's just a one-time operation then I would simply loop over the list; otherwise, build the dictionary.
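The suggested dictionary can be built in one pass over the edge list (the function name is illustrative; it works on any iterable of (start, end) pairs, such as `G.edges()`):

```python
from collections import defaultdict

def edges_by_target(edges):
    # Index every edge under its end vertex, so all edges ending in a
    # given word can be fetched in O(1) instead of scanning the list.
    index = defaultdict(list)
    for u, v in edges:
        index[v].append((u, v))
    return index
```

After `index = edges_by_target(G.edges())`, the lookup is just `index['deny']`. Note also that the original loop rebuilds `list(G.edges())` on every iteration, which by itself is enough to make 120,000 edges take forever; materialising the list once before the loop already fixes the "never stops" symptom.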

Bitwise operations between every pair of elements in array

Here is the situation:
I have a graph type structure, an adjacency list, and each element of this adjacency list is a 1 dimensional array (either numpy, or bcolz.. not sure if I will use bcolz or not).
Each 1-dimensional array represents graph elements that could possibly connect, in the form of binary sequences. For them to connect, they need to have a specific bitwise intersection value.
Therefore, for each 1 dimensional array in my adjacency list, I want to do the bitwise "and" between every combination of two elements in the given array.
This will possibly be used for huge graph breadth-first traversal, so we may be talking a very very large number of elements.
Is this something I can do with vectorized operations? Should I be using a different structure? What is a good way to do this? I am willing to completely restructure everything if there could be a significant performance boost.
Is it as simple as looping through the individual elements and then broadcasting (correct terminology?) & against the entire array? Thanks.
quick edit
As an extra note, I am using Python integers for my byte sequences, which from my understanding doesn't play well with numpy (the integers get too big for fixed-width types like long long), so I have to create arrays of object type. Does this potentially cause a huge slowdown? Is it a reason to use a different structure?
An Example
# Create an n x n adjacency list, where n is the number of graph nodes.
# Map each graph node i to the power of two 2**(i+1):
nodevals = {}
for i in range(n):
    nodevals[i] = 2 ** (i + 1)
# Each edge joins two nodes, each mapped to a power of two. Take their
# bitwise OR and place it in the adjacency list:
for i in range(n):
    for j in range(n):
        adjlist[i][j].append(nodevals[i] | nodevals[j])
# We now have our first adjacency list, which is just bare edges. Two edges
# can be connected by row or column by taking the intersection of the value
# (nodevals[i] | nodevals[j]) with the other edge's value
# (nodevals[i2] | nodevals[j2]) and checking that it equals the value of
# the shared node.
# This may not seem useful for individual edges, but in future iterations
# we can do this. After 3 iterations, (5,1) connected to (1,9), and then
# this connected to (7,5), for example:
adjlist[5][1] & adjlist[1][9] == nodevals[1]
adjlist2[5][9] == adjlist[5][1] | adjlist[1][9]
adjlist[7][5] & adjlist2[5][9] == nodevals[5]
adjlist3[7][9] == adjlist[7][5] | adjlist2[5][9]
# So you may see how this could be useful for efficient traversal.
# However, it becomes more complicated: as the subpaths ("pseudo-edges",
# or whatever you want to call them) grow longer, the array at a given
# (i, j) holds more and more subpath sums that can potentially be
# connected. Soon the arrays can become very large, which is when I would
# want to be able to calculate intersections efficiently.
# AND, for this problem in particular, I want to be able to connect edges
# against the SAME edge, so I want the bitwise intersection between all
# pairs of elements in the given array (i.e. at the given indices [i][j]
# of that adjacency list).
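To the broadcasting question: yes, NumPy can compute the all-pairs bitwise AND in one vectorized expression, provided the values fit a fixed-width integer dtype. With arbitrary-precision Python ints the array must use object dtype, and the same expression then falls back to slow Python-level loops, which is the slowdown the edit asks about. A sketch (`pairwise_and` is an illustrative name):

```python
import numpy as np

def pairwise_and(values):
    # Broadcasting a column vector of shape (n, 1) against a row vector
    # of shape (1, n) yields an (n, n) matrix whose entry [i, j] is
    # values[i] & values[j] -- the AND of every pair at once.
    a = np.asarray(values, dtype=np.uint64)
    return a[:, None] & a[None, :]
```

For example, `pairwise_and([0b101, 0b011])` has `0b101 & 0b011 == 0b001` in its off-diagonal cells. If the bit sequences exceed 64 bits, one option is to split each value across several fixed-width lanes (an extra axis) and AND lane-wise, keeping the operation vectorized.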

IGraph python get neighbour vertices from vertex

I have a graph and I want to implement a modification of the Page Rank algorithm. I am stuck on the following point. I don't know how to get all the neighboring vertices from a node.
Currently I am receiving the list of the edges using:
g.incident("a", mode="out")
This returns the list of edge indices.
How can I get the vertex names from that?
For example I need to know that "a" is linked to "b" and "d"
g.neighbors("a", mode="out") will give you the vertex indices for the neighbors. You can then get the names as follows:
>>> neis = g.neighbors("a", mode="out")
>>> g.vs[neis]["name"]
But actually, if I were you, I would try to work with vertex indices as much as possible because it's way faster to work with the indices than with the names.
